a! – Automated Language Instruction


Learning new languages has become very popular in recent years. Although many language learning platforms offer free-of-charge lessons for many languages, their exercises are often poorly calibrated: they are either too easy or too difficult for effective learning. We tackle this issue by adjusting the difficulty of C-test exercises to meet the demands of the learners. C-tests are a special kind of cloze test in which learners have to complete the second half of every second word. They are frequently used as language proficiency tests, as they allow a learner to train morphological, syntactic, and semantic properties of a language at the same time.
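To illustrate the exercise format, the gap scheme described above (blanking the second half of every second word) can be sketched in a few lines of Python. This is a minimal illustration of the C-test format, not the project's actual generation tooling; the function name and the rule of skipping one-letter words are assumptions for the sketch.

```python
def make_ctest(sentence, start=1):
    """Minimal C-test sketch: blank out the second half of every
    second word, starting at word index `start` (an assumption here;
    real C-tests typically leave the first sentence intact)."""
    words = sentence.split()
    gapped, solutions = [], []
    for i, word in enumerate(words):
        # Gap every second word; one-letter words are left intact.
        if i >= start and (i - start) % 2 == 0 and len(word) > 1:
            keep = len(word) - len(word) // 2  # keep the first half, rounded up
            gapped.append(word[:keep] + "_" * (len(word) - keep))
            solutions.append(word[keep:])
        else:
            gapped.append(word)
    return " ".join(gapped), solutions

text = "Learning a new language takes regular practice and patience"
exercise, gaps = make_ctest(text)
# exercise: "Learning a new lang____ takes regu___ practice an_ patience"
# gaps:     ["uage", "lar", "d"]
```

Adjusting which words are gapped and how much of each word is removed is one of the levers for manipulating the difficulty of such an exercise.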

Our goal is to automatically generate C-tests that match a learner's current proficiency. We therefore predict the CEFR level (Common European Framework of Reference for Languages) of C-tests and the difficulty of every gap. We then match the C-tests with the learner's goals and learning progress. Several factors play a role in C-test difficulty assessment and generation, including readability assessment of texts, word and spelling complexity, and the learner's answers in previous C-tests. We do not limit ourselves to a single language, but explore cross-lingual approaches towards predicting and manipulating exercise difficulty.


Our goals in this project are:

  • Automatic readability assessment of texts and exercises
  • Automatic difficulty prediction and manipulation of C-tests towards a specific CEFR level



This project is established in cooperation with L-Pub GmbH.


The project is funded by Hessen Agentur as part of the program Hessen ModellProjekte.


Lee, Ji-Ung; Meyer, Christian M.; Gurevych, Iryna (2018): Avoid playing learner and system off against each other. In: Abstracts of the Joint Meeting of WG3 & WG5 "Motivational, ethical and legal issues in crowdsourcing" of the European Network for Combining Language Learning with Crowdsourcing Techniques (enetCollect), Leiden, Netherlands, 24-25 October 2018. [Online-Edition: https://fileserver.ukp.informatik.tu-darmstadt.de/UKP_Webpag...]
