Sized Fill-in-the-Blank or Multi-Mask Filling with RoBERTa and Hugging Face Transformers
Sized fill-in-the-blank, or conditional text filling, is the task of filling the missing words of a sentence with the most probable choice of words.
Most of the examples available online show the prediction of a single masked word. In this short tutorial we will see how to use an NLP language model (RoBERTa) to fill several blanks at once.
The input to our program will be a sentence with blanks that need to be filled, like this -
Tom has fully ___ ___ ___ illness.
The output will be the best guess for the fill-in-the-blanks -
recovered from his
Let’s start with the installation of the transformers library:
pip install transformers==2.10.0
The main code is -
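Below is a minimal sketch of the approach, assuming the roberta-base checkpoint and the top 5 guesses per mask (the exact script and parameters may differ). The idea is to replace each blank with RoBERTa's <mask> token, read the masked-language-model scores at every mask position, and take the best token for each mask.

import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")
model.eval()

original_sentence = "Tom has fully ___ ___ ___ illness."
# Replace every blank with RoBERTa's mask token ("<mask>").
masked_sentence = original_sentence.replace("___", tokenizer.mask_token)
print("Original Sentence:", original_sentence)
print("Original Sentence replaced with mask:", masked_sentence)

# Encode the sentence and locate the positions of the mask tokens.
input_ids = tokenizer.encode(masked_sentence, return_tensors="pt")
mask_positions = torch.where(input_ids[0] == tokenizer.mask_token_id)[0].tolist()

# Score every vocabulary token at every position.
with torch.no_grad():
    predictions = model(input_ids)[0]  # shape: (1, sequence_length, vocab_size)

best_guess = []
for i, pos in enumerate(mask_positions):
    # Take the 5 highest-scoring tokens for this mask position.
    _, top_ids = torch.topk(predictions[0, pos], 5)
    guesses = [tokenizer.decode([idx.item()]).strip() for idx in top_ids]
    print("Mask", i + 1, "Guesses :", guesses)
    best_guess.append(guesses[0])

print("Best guess for fill in the blank :::", " ".join(best_guess))

Note that each mask is filled independently here by taking its single top-scoring token; a joint search over the mask positions (for example a beam search) could give more fluent combinations, but the simple greedy choice works well for short gaps like this one.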
The output from the above code is:
Original Sentence: Tom has fully ___ ___ ___ illness.
Original Sentence replaced with mask: Tom has fully <mask> <mask> <mask> illness.
Mask 1 Guesses : ['recovered', 'returned', 'recover', 'healed', 'cleared']
Mask 2 Guesses : ['from', 'his', 'with', 'to', 'the']
Mask 3 Guesses : ['his', 'the', 'her', 'mental', 'this']
Best guess for fill in the blank ::: recovered from his
Question Generation using NLP — A course
I launched a very interesting Udemy course titled “Question generation using NLP” expanding on some of the techniques discussed in this blog post. If you would like to take a look at it, here is the link.