Question answering is a very popular task in Natural Language Processing, but question generation is relatively novel and hasn't been explored much yet.
If you want to try a live demo of question generation in action, please visit https://questgen.ai/
Question generation has many use cases, the most prominent being the ability to generate quick assessments from any given content. It could help school teachers quickly generate worksheets from any given chapter and reduce their workload during Covid-19.
You might have seen the classic word2vec or GloVe word embedding example: King - Man + Woman = Queen. Here Queen is returned by the word embedding algorithm given the words King, Man, and Woman. Today we will see how we can use this structure to solve a real-world problem.
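The analogy can be sketched with plain vector arithmetic. The tiny two-dimensional embedding below is hypothetical and hand-built purely for illustration; real word2vec or GloVe vectors are learned from text and have hundreds of dimensions, typically queried through a library such as gensim.

```python
import numpy as np

# Hypothetical embeddings along made-up (royalty, gender) axes --
# illustrative only; real vectors are learned, not hand-crafted.
vectors = {
    "king":  np.array([0.9,  0.9]),
    "queen": np.array([0.9, -0.9]),
    "man":   np.array([0.1,  0.9]),
    "woman": np.array([0.1, -0.9]),
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Return the word closest to vec(a) - vec(b) + vec(c), excluding the inputs."""
    target = vectors[a] - vectors[b] + vectors[c]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("king", "man", "woman"))  # -> queen
```

With gensim and pretrained vectors, the equivalent one-liner is `model.most_similar(positive=["king", "woman"], negative=["man"])`.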
In semantic search, we search a database of documents for a given user query and get back a set of relevant documents.
Whether we are building a recommendation system or a question answering system, we often get back search results that are not identical enough to be called exact duplicates, but are near-duplicates that we could merge.
Imagine shopping for shoes on an e-commerce store: you get back 10 search results for shoes, and half of them are near-duplicates of each other with just a change in viewing angle. We want the ability to merge these near-duplicates…
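One simple way to merge such near-duplicates is to embed each result and greedily drop any result whose cosine similarity to an already-kept result exceeds a threshold. The sketch below uses hypothetical hand-made vectors in place of real image or text embeddings, and the 0.95 threshold is an assumed value you would tune on your own data.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def merge_near_duplicates(results, threshold=0.95):
    """Greedily keep a result only if it is not too similar to one already kept."""
    kept = []
    for title, vec in results:
        if all(cosine(vec, kept_vec) < threshold for _, kept_vec in kept):
            kept.append((title, vec))
    return [title for title, _ in kept]

# Hypothetical embeddings for shoe listings; the first two are near-duplicates.
results = [
    ("red sneaker front view", np.array([0.99, 0.10, 0.05])),
    ("red sneaker side view",  np.array([0.98, 0.12, 0.06])),  # merged away
    ("black leather boot",     np.array([0.05, 0.97, 0.20])),
    ("running shoe on track",  np.array([0.30, 0.20, 0.93])),
]

print(merge_near_duplicates(results))
# -> ['red sneaker front view', 'black leather boot', 'running shoe on track']
```

The greedy pass is O(n²) in the number of results, which is fine for a page of search results; at corpus scale you would use an approximate nearest-neighbor index instead.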
This year I launched my first-ever online course on Udemy, titled “Question generation using Natural Language processing”, and made $3,333 in 5 months with zero marketing spend and 100% paid enrollments.
Distractors are the wrong answers in a multiple-choice question.
For example, if a given multiple choice question has the game Cricket as the correct answer then we need to generate wrong choices (distractors) like Football, Golf, Ultimate Frisbee, etc.
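A common embedding-based sketch of distractor generation is to take the correct answer's vector and return its nearest neighbors as the wrong choices. The five-word vocabulary and its vectors below are hypothetical, for illustration only; in practice you would query pretrained embeddings (e.g. word2vec, GloVe, or sense2vec) and filter the neighbors further so they are plausible but not synonyms of the answer.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: sports cluster together, "banana" sits far away.
words = {
    "cricket":  np.array([0.90, 0.80, 0.10]),
    "football": np.array([0.85, 0.75, 0.15]),
    "golf":     np.array([0.70, 0.60, 0.20]),
    "frisbee":  np.array([0.60, 0.70, 0.30]),
    "banana":   np.array([0.05, 0.10, 0.95]),
}

def distractors(answer, k=3):
    """Return the k words whose vectors are nearest to the correct answer's."""
    target = words[answer]
    others = [w for w in words if w != answer]
    return sorted(others, key=lambda w: cosine(words[w], target), reverse=True)[:k]

print(distractors("cricket"))  # -> ['football', 'golf', 'frisbee']
```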
I’m sure most of you have heard about OpenAI’s GPT-3 and its insane text generation capabilities, learning to perform tasks from only a few examples.
The concept of feeding a model with very little training data and making it learn to do a novel task is called Few-shot learning.
A website, GPT-3 examples, captures all the impressive applications of GPT-3 that the community has come up with since its release. GPT-3 has been shown to generate complete frontend code from just a text description of how a website looks. It has been shown to generate a complete marketing copy from just a small…
Imagine that you are a writer searching for the best image to go with your blog or book. You have a search phrase in mind, like “Tiger playing in the snow”. You go to copyright-free image websites like Pixabay or Unsplash and try out various combinations of keywords like “Tiger”, “Snow”, “Tiger Snow”, etc., to find relevant images.
If you are lucky you find the exact image that you are looking for on the first page or in the top N retrieved results.
Since the images on these websites have only tags, you are limited by the…
It has been almost three years since I quit my job in Silicon Valley, moved back home to India, and took the plunge into entrepreneurship.
The journey, with its ups and downs, spawned a myriad of emotions and thoughts. Sometimes they transformed into what I can call motivational quotes.