NLP Tutorials — Part 27: Topic Modelling

Hello and welcome back to the NLP Tutorials blog post series. In this article we cover an interesting topic from the unsupervised learning side of NLP: Topic Modelling. Topic modelling is a process in which an algorithm automatically detects the topics occurring in a given document or text extract. It represents the topics in a vector space and exposes them as 'topic clusters', so you can inspect the terms that contribute most to each topic. One major application is building a system that surfaces the latest trending or emerging topics in a large corpus.
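To make the idea concrete, here is a minimal sketch, not taken from the article itself, of fitting a topic model with scikit-learn's LatentDirichletAllocation and printing the top terms per topic cluster. The toy corpus, the choice of two topics and the use of scikit-learn are all assumptions made purely for illustration.

```python
# Minimal topic-modelling sketch (illustrative only, not the article's code).
# Assumes scikit-learn is installed; fits LDA on a tiny toy corpus and prints
# the top terms contributing to each topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the stock market rallied as tech shares gained",
    "investors watched interest rates and inflation data",
    "the team won the championship after a late goal",
    "the striker scored twice in the final match",
]

# Bag-of-words representation of the corpus
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit an LDA model with two topics (a guess for this toy corpus)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Show the top terms contributing to each topic
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top)}")
```

With a corpus this small the clusters are crude, but the same pattern (vectorize, fit, inspect top terms per topic) scales to real document collections.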

Contents

Welcome to the Applied Singularity blog. Use this Contents post to browse through the full list of articles and Guided Learning Modules we have created or find specific topics of interest.

NLP Tutorials — Part 26: Infinite Transformer

Hello and welcome back to the NLP Tutorials series. In the last two articles we looked at two applications in the NLP domain, NER and Summarization, which appear in many real-world settings. In this article we return to understanding one of the latest and most interesting architectures: the Infinite-former, aka the Infinite Transformer. The paper proposes an attention mechanism that supports unbounded long-term contextual memory, which is notable because many architectures have tried to address the complexity and memory constraints of vanilla Transformers while still capping memory (longer sequences, but still limited). The authors also introduce 'sticky memories', which allow very long contexts to be modelled with a fixed computational budget.

NLP Tutorials — Part 25: Text Summarization

Welcome back to another article in the NLP Tutorials series! Continuing our quest towards mastery in NLP, we will be looking at an exciting application: Text Summarization. With data being generated at a massive scale, we often want a short overview of a document rather than its entire length. This is where text summarization plays a key role, condensing the document into a concise form. It is a challenging problem to solve since it depends on cognition, language understanding and domain knowledge. In this article, we shall have a brief overview of the types of text summarization and attempt to implement a basic model ourselves using the NLP concepts and libraries we have come across so far.
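As a taste of the extractive flavour of summarization, here is a minimal frequency-based sketch, not taken from the article itself: it scores sentences by the frequency of their words and keeps the highest-scoring ones. The toy text and the choice of a two-sentence summary are assumptions for illustration.

```python
# Minimal extractive-summarization sketch (illustrative only).
# Scores each sentence by summed word frequency and keeps the top sentences.
import re
from collections import Counter

text = (
    "Text summarization condenses a document into a shorter form. "
    "Extractive methods select existing sentences from the document. "
    "Abstractive methods generate new sentences that capture the meaning. "
    "Frequency-based scoring is one of the simplest extractive approaches."
)

sentences = re.split(r"(?<=[.!?])\s+", text.strip())
words = re.findall(r"\w+", text.lower())
freq = Counter(words)

# Score each sentence by the summed frequency of its words
scores = {s: sum(freq[w] for w in re.findall(r"\w+", s.lower())) for s in sentences}

# Keep the top-2 sentences, presented in their original order
top = sorted(sorted(scores, key=scores.get, reverse=True)[:2], key=sentences.index)
print(" ".join(top))
```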

NLP Tutorials — Part 24: Named Entity Recognition

In this article we won't be looking at high-end architectures; instead we explore a key concept in the NLP domain: Named Entity Recognition (NER). You may have come across it as one of the concepts with many applications in real-world scenarios. Let's understand what it means and how we can build an NER model using a few popular NLP libraries.
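For a quick feel of what NER produces, here is a minimal sketch, not taken from the article itself, using spaCy's pretrained pipeline; spaCy and its small English model are assumed to be installed (pip install spacy, then python -m spacy download en_core_web_sm).

```python
# Minimal NER sketch (illustrative only), using spaCy's pretrained pipeline.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple was founded by Steve Jobs in Cupertino in 1976.")

# Each detected entity carries its text span and a label such as ORG or PERSON
for ent in doc.ents:
    print(ent.text, ent.label_)
```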