Linguist vs. Machine: Rapid Development of Finite-State Morphological Grammars

Date

2020

Authors

Beemer, Sarah
Boston, Zak
Bukoski, April
Chen, Daniel
Dickens, Princess
Gerlach, Andrew
Hopkins, Torin
Jawale, Parth Anand
Koski, Chris
Malhotra, Akanksha

Publisher

Association for Computational Linguistics

Abstract

Sequence-to-sequence models have proven highly successful at learning morphological inflection from examples, as the series of SIGMORPHON/CoNLL shared tasks has shown. It is usually assumed, however, that a linguist working with inflectional examples could in principle develop a gold-standard-level morphological analyzer and generator that would surpass a trained neural network model in prediction accuracy, though doing so may require significant human labor. In this paper, we discuss an experiment in which a group of people with some linguistic training developed 25+ grammars as part of the shared task, and we weigh the cost/benefit ratio of developing grammars by hand. We also present tools that can help linguists triage difficult, complex morphophonological phenomena within a language and hypothesize inflectional class membership. We conclude that a significant development effort by trained linguists to analyze and model morphophonological patterns is required in order to surpass the accuracy of neural models.

Source

Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology

Type

Conference paper

Access Statement

Open Access

License Rights

Creative Commons Attribution 4.0 International License
