D 2021

Lex Rosetta: Transfer of Predictive Models Across Languages, Jurisdictions, and Legal Domains

ŠAVELKA, Jaromír, Hannes WESTERMANN, Karim BENYEKHLEF, Charlotte S. ALEXANDER, Jayla C. GRANT et al.

Basic information

Original name

Lex Rosetta: Transfer of Predictive Models Across Languages, Jurisdictions, and Legal Domains

Authors

ŠAVELKA, Jaromír (203 Czech Republic, guarantor), Hannes WESTERMANN, Karim BENYEKHLEF, Charlotte S. ALEXANDER, Jayla C. GRANT, David RESTREPO AMARILES, Rajaa El HAMDANI, Sébastien MEEÙS, Aurore TROUSSEL, Michal ARASZKIEWICZ, Kevin D. ASHLEY (840 United States of America), Alexandra ASHLEY, Karl BRANTING, Mattia FALDUTI, Matthias GRABMAIR, Jakub HARAŠTA (203 Czech Republic, belonging to the institution), Tereza NOVOTNÁ (203 Czech Republic, belonging to the institution), Elizabeth TIPPETT and Shiwanni JOHNSON

Edition

1st ed. New York, Eighteenth International Conference on Artificial Intelligence and Law: Proceedings of the Conference, pp. 129-138, 10 pp. 2021

Publisher

ACM

Other information

Language

English

Type of outcome

Article in conference proceedings

Field of Study

50501 Law

Country of publisher

United States of America

Confidentiality degree

is not subject to a state or trade secret

Publication form

electronic version available online

RIV identification code

RIV/00216224:14220/21:00121820

Organization unit

Faculty of Law

ISBN

978-1-4503-8526-8

Keywords in English

multi-lingual sentence embeddings; transfer learning; domain adaptation; adjudicatory decisions; document segmentation; annotation

Tags

International impact, Reviewed

Abstract

In the original

In this paper, we examine the use of multi-lingual sentence embeddings to transfer predictive models for functional segmentation of adjudicatory decisions across jurisdictions, legal systems (common and civil law), languages, and domains (i.e., contexts). Mechanisms for utilizing linguistic resources outside of their original context have significant potential benefits in AI & Law because differences between legal systems, languages, or traditions often block wider adoption of research outcomes. We analyze the use of Language-Agnostic Sentence Representations in sequence labeling models using Gated Recurrent Units (GRUs) that are transferable across languages. To investigate transfer between different contexts, we developed an annotation scheme for functional segmentation of adjudicatory decisions. We found that models generalize beyond the contexts on which they were trained (e.g., a model trained on administrative decisions from the US can be applied to criminal law decisions from Italy). Further, we found that training the models on multiple contexts increases robustness and improves overall performance when evaluating on previously unseen contexts. Finally, we found that pooling the training data from all the contexts enhances the models' in-context performance.
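
The sketch below illustrates the general idea described in the abstract: each sentence of a decision is represented by a multi-lingual sentence embedding, and a bi-directional GRU labels the sequence of sentences with functional segment types. This is a minimal, hedged illustration, not the paper's exact configuration: the label set, hidden size, and the assumption of precomputed 1024-dimensional LASER-style embeddings are illustrative choices introduced here.

# Minimal sketch of sentence-level sequence labeling over multi-lingual
# sentence embeddings, in the spirit of the approach described above.
# Assumptions (not taken from the paper): precomputed 1024-dim
# LASER-style embeddings, an illustrative label set, and illustrative
# hyperparameters.
import torch
import torch.nn as nn

LABELS = ["Heading", "Background", "Analysis", "Outcome"]  # illustrative only
EMB_DIM = 1024   # LASER sentence embeddings are 1024-dimensional
HIDDEN = 128     # illustrative hidden size


class DecisionSegmenter(nn.Module):
    """Bi-directional GRU that assigns a functional label to each sentence."""

    def __init__(self, emb_dim=EMB_DIM, hidden=HIDDEN, n_labels=len(LABELS)):
        super().__init__()
        self.gru = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_labels)

    def forward(self, sentence_embeddings):
        # sentence_embeddings: (batch, n_sentences, emb_dim)
        states, _ = self.gru(sentence_embeddings)
        # returns one logit vector per sentence: (batch, n_sentences, n_labels)
        return self.classifier(states)


if __name__ == "__main__":
    # One "document" of 20 sentences, with random vectors standing in for
    # real language-agnostic sentence embeddings.
    model = DecisionSegmenter()
    doc = torch.randn(1, 20, EMB_DIM)
    logits = model(doc)
    print(logits.shape)            # torch.Size([1, 20, 4])
    print(logits.argmax(dim=-1))   # predicted label index per sentence

Because the model operates on language-agnostic embeddings rather than raw tokens, the same trained weights can, in principle, be applied to decisions written in a different language or coming from a different jurisdiction, which is the transfer setting the paper evaluates.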

Links

CZ.02.2.69/0.0/0.0/19_073/0016943, internal MU code
(CEP code: EF19_073/0016943)
Name: Internal Grant Agency of Masaryk University (Acronym: IGA MU)
Investor: Ministry of Education, Youth and Sports of the CR, Priority axis 2: Development of universities and human resources for research and development

Files attached