BERT Fine-tuning For Arabic Text Summarization
Fine-tuning a pretrained BERT model is the state-of-the-art method for extractive and abstractive text summarization. In this paper we show how this fine-tuning approach can be applied to Arabic, both to construct the first documented model for abstractive Arabic text summarization and to evaluate its performance on extractive Arabic summarization. Our model builds on multilingual BERT (since Arabic has no pretrained BERT of its own). We first demonstrate its performance on an English corpus before applying it to Arabic corpora for both the extractive and abstractive tasks.
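As a concrete illustration (not the paper's implementation), the sketch below loads multilingual BERT through the Hugging Face transformers library and uses its [CLS] embeddings for a toy extractive step, scoring sentences against the document centroid. The checkpoint name, the centroid-similarity heuristic, and the sample Arabic sentences are all illustrative assumptions.

```python
import torch
from transformers import BertModel, BertTokenizer

# Multilingual BERT checkpoint; Arabic is among its pretraining languages.
MODEL_NAME = "bert-base-multilingual-cased"
tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
encoder = BertModel.from_pretrained(MODEL_NAME)
encoder.eval()

def embed(sentence: str) -> torch.Tensor:
    """Return the 768-dim [CLS] embedding of one sentence."""
    inputs = tokenizer(sentence, return_tensors="pt",
                       truncation=True, max_length=128)
    with torch.no_grad():
        out = encoder(**inputs)
    return out.last_hidden_state[:, 0].squeeze(0)

# Toy document: score each sentence by cosine similarity to the
# centroid embedding and keep the top-k sentences as the summary.
doc = [
    "الذكاء الاصطناعي يغير العالم.",
    "تعلم الآلة فرع من الذكاء الاصطناعي.",
    "الطقس اليوم مشمس.",
]
embs = torch.stack([embed(s) for s in doc])
centroid = embs.mean(dim=0)
scores = torch.nn.functional.cosine_similarity(embs, centroid.unsqueeze(0))
top_k = scores.topk(k=2).indices.sort().values  # keep document order
summary = [doc[int(i)] for i in top_k]
print(summary)
```

In the fine-tuning setting the paper describes, these encoder weights would be updated end to end with a task head (e.g., per-sentence selection scores for extractive summarization) rather than used frozen as here.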