Contrast is all you need

Burak Kilic, Floris Bex, Albert Gatt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

In this study, we analyze data-scarce classification scenarios, where the available labeled legal data is small and imbalanced, potentially hurting the quality of the results. We focus on two finetuning objectives: SetFit (Sentence Transformer Finetuning), a contrastive learning setup, and a vanilla finetuning setup, applied to a legal provision classification task. Additionally, we compare the features extracted with LIME (Local Interpretable Model-agnostic Explanations) to see which particular features contributed to the model's classification decisions. The results show that the contrastive setup with SetFit performed better than vanilla finetuning while using a fraction of the training samples. The LIME results show that the contrastive learning approach helps boost both positive and negative features which are legally informative and contribute to the classification results. Thus, a model finetuned with a contrastive objective seems to base its decisions more confidently on legally informative features.
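For illustration, the pipeline described above can be sketched roughly as follows using the setfit and lime libraries. This is a minimal sketch under stated assumptions, not the paper's actual implementation: the model checkpoint, the toy legal provisions, the label scheme, and all hyperparameters are illustrative placeholders.

# Hypothetical sketch: SetFit contrastive finetuning + LIME explanations.
# Checkpoint, toy data, labels, and hyperparameters are assumptions, not
# the setup used in the paper.
import numpy as np
from datasets import Dataset
from lime.lime_text import LimeTextExplainer
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer

# Tiny, imbalanced toy set standing in for labeled legal provisions.
train_ds = Dataset.from_dict({
    "text": [
        "The lessee shall pay rent on the first day of each month.",
        "Either party may terminate this agreement with 30 days notice.",
        "The tenant must not sublet the premises without consent.",
        "This agreement terminates automatically upon material breach.",
    ],
    "label": [0, 1, 0, 1],  # 0 = payment/use provision, 1 = termination provision
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    loss_class=CosineSimilarityLoss,  # contrastive objective over sentence pairs
    num_iterations=20,                # contrastive pairs generated per labeled sample
    batch_size=16,
)
trainer.train()

# LIME attributes a prediction to individual tokens, yielding the positive
# and negative features that the paper compares across finetuning setups.
explainer = LimeTextExplainer(class_names=["payment/use", "termination"])
sample = "The landlord may terminate the lease if rent remains unpaid."
exp = explainer.explain_instance(
    sample,
    lambda texts: np.array(model.predict_proba(texts)),
    num_features=8,
)
print(exp.as_list())  # (token, weight) pairs for the predicted class

The key design point is that SetFit builds its contrastive training signal by pairing the few labeled examples against each other (same-class pairs pulled together, cross-class pairs pushed apart), which is what allows it to learn from a fraction of the training samples that vanilla finetuning would need.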
Original language: English
Title of host publication: ASAIL 2023 - Automated Semantic Analysis of Information in Legal Text
Subtitle of host publication: Proceedings of the 6th Workshop on Automated Semantic Analysis of Information in Legal Text
Editors: Francesca Lagioia, Jack Mumford, Daphne Odekerken, Hannes Westermann
Publisher: CEUR
Pages: 72-82
Number of pages: 11
Publication status: Published - 2023

Publication series

Name: CEUR Workshop Proceedings
Publisher: CEUR
Volume: 3441
ISSN (Print): 1613-0073

Keywords

  • Legal NLP
  • Contrastive Learning
  • NLP
  • Explainable AI
