Introduction to Question Answering over Knowledge Graphs

Question answering is a popular natural language understanding task with applications in a wide variety of fields, such as dialog interfaces, chatbots, and information retrieval systems. Answering questions using knowledge graphs adds a new dimension to these fields. “Question answering over knowledge graphs (KGQA) aims to provide the users with an interface… Continue reading Introduction to Question Answering over Knowledge Graphs

BERT Explained – A list of Frequently Asked Questions

What is BERT? BERT is a deep learning model that has achieved state-of-the-art results on a wide variety of natural language processing tasks. It stands for Bidirectional Encoder Representations from Transformers. It has been pre-trained on Wikipedia and BooksCorpus and requires task-specific fine-tuning. What is the model architecture of BERT? BERT is a multi-layer bidirectional… Continue reading BERT Explained – A list of Frequently Asked Questions
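
To make the fine-tuning step concrete, here is a minimal sketch (not from the post) of loading pre-trained BERT and running one task-specific training step with the Hugging Face transformers library; the model name, toy batch, and hyper-parameters are illustrative assumptions.

```python
# Hedged sketch: fine-tuning pre-trained BERT for binary classification
# with Hugging Face transformers; the data and settings are toy assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

batch = tokenizer(["a great movie", "a dull movie"],
                  padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([1, 0])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)   # returns loss and logits
outputs.loss.backward()
optimizer.step()
```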

How can Unsupervised Neural Machine Translation Work?

Neural Machine Translation has arguably reached human-level performance. But effective training of these systems depends strongly on the availability of large amounts of parallel text, which is why supervised techniques have not been very successful for low-resource language pairs. Unsupervised Machine Translation requires only monolingual corpora and is a viable alternative in… Continue reading How can Unsupervised Neural Machine Translation Work?

A Disciplined Approach to Neural Network Hyper-Parameters: Learning Rate, Batch Size, Momentum, and Weight Decay – Paper Dissected

Training a machine learning model requires carefully selecting hyper-parameters, and with neural networks there are so many things to tune that this can easily get out of hand. Besides, the optimal values of these parameters vary from one dataset to another. Leslie N. Smith, in his paper - A Disciplined Approach to Neural Network Hyper-Parameters: Part 1 -… Continue reading A Disciplined Approach to Neural Network Hyper-Parameters: Learning Rate, Batch Size, Momentum, and Weight Decay – Paper Dissected
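
For readers who want to try the paper's 1cycle policy right away, here is a minimal sketch (not from the post) using PyTorch's built-in OneCycleLR scheduler, which also cycles momentum; the toy model, epoch counts, and learning rates are assumptions.

```python
# Hedged sketch: the 1cycle learning-rate policy via torch.optim.lr_scheduler.OneCycleLR.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # toy model standing in for a real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.1, epochs=5, steps_per_epoch=100)

for epoch in range(5):
    for step in range(100):
        x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
        loss = nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        scheduler.step()  # learning rate (and momentum) updated every batch
```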

What makes the AWD-LSTM great?

The AWD-LSTM has been dominating state-of-the-art language modeling: all the top research papers on word-level models incorporate AWD-LSTMs, and it has shown great results on character-level models as well (Source). In this blog post, I go through the research paper - Regularizing and Optimizing LSTM Language Models - that introduced the AWD-LSTM and try to explain… Continue reading What makes the AWD-LSTM great?
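
The regularizer most associated with the AWD-LSTM is DropConnect on the recurrent weights. Below is a minimal sketch (not the paper's code) of that idea using an explicit LSTM cell, so the hidden-to-hidden weight matrix is easy to drop; sizes and initialization are assumptions.

```python
# Hedged sketch: DropConnect on the hidden-to-hidden weights of an LSTM cell,
# the "weight-dropped" part of the AWD-LSTM. All sizes are toy assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, weight_p=0.5):
        super().__init__()
        self.weight_p = weight_p
        self.w_ih = nn.Parameter(torch.randn(4 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(4 * hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))

    def forward(self, x, state):
        h, c = state
        # Drop whole recurrent connections (weights), not activations.
        w_hh = F.dropout(self.w_hh, p=self.weight_p, training=self.training)
        i, f, g, o = (x @ self.w_ih.t() + h @ w_hh.t() + self.bias).chunk(4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

cell = WeightDropLSTMCell(32, 64)
h = c = torch.zeros(8, 64)
for t in range(20):                      # unroll over a toy sequence
    h, c = cell(torch.randn(8, 32), (h, c))
```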

A Walkthrough of InferSent – Supervised Learning of Sentence Embeddings

Universal embeddings of text data have been widely used in natural language processing. The idea is to encode words or sentences into fixed-length numeric vectors, pre-trained on a large text corpus, that can be used to improve the performance of other NLP tasks (like classification and translation). While word embeddings have been massively popular and… Continue reading A Walkthrough of InferSent – Supervised Learning of Sentence Embeddings
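
As a preview of the model the post walks through, here is a minimal sketch (not the official InferSent code) of its BiLSTM encoder with max pooling over time; the vocabulary size, dimensions, and toy input are assumptions.

```python
# Hedged sketch: a BiLSTM + max-pooling sentence encoder in the spirit of InferSent.
import torch
import torch.nn as nn

class BiLSTMMaxEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=2048):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.lstm(self.embed(token_ids))  # (batch, seq_len, 2 * hidden_dim)
        return h.max(dim=1).values               # max-pool over time -> sentence vector

encoder = BiLSTMMaxEncoder(vocab_size=10000)
sentences = torch.randint(0, 10000, (4, 12))     # 4 toy "sentences" of 12 tokens
embeddings = encoder(sentences)                  # shape: (4, 4096)
```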

A Neural Network in PyTorch for Tabular Data with Categorical Embeddings

PyTorch is a promising Python library for deep learning. I have been learning it for the past few weeks and am impressed by its ease of use and flexibility. In this blog post, I will go through a feed-forward neural network for tabular data that uses embeddings for categorical variables. If you want to understand the… Continue reading A Neural Network in PyTorch for Tabular Data with Categorical Embeddings
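
For a flavor of what such a network looks like, here is a minimal sketch (not the exact model from the post) that gives each categorical column its own nn.Embedding and concatenates the result with the continuous features; the column cardinalities and layer sizes are assumptions.

```python
# Hedged sketch: feed-forward network for tabular data with categorical embeddings.
import torch
import torch.nn as nn

class TabularNet(nn.Module):
    def __init__(self, cat_cardinalities, n_cont, emb_dim=4, hidden=64, n_out=2):
        super().__init__()
        self.embeds = nn.ModuleList(
            [nn.Embedding(card, emb_dim) for card in cat_cardinalities])
        in_dim = emb_dim * len(cat_cardinalities) + n_cont
        self.layers = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_out))

    def forward(self, x_cat, x_cont):
        # x_cat: (batch, n_cat) integer category codes, x_cont: (batch, n_cont) floats
        embs = [emb(x_cat[:, i]) for i, emb in enumerate(self.embeds)]
        return self.layers(torch.cat(embs + [x_cont], dim=1))

model = TabularNet(cat_cardinalities=[10, 5], n_cont=3)
logits = model(torch.randint(0, 5, (8, 2)), torch.randn(8, 3))  # (8, 2) class scores
```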

Understanding the Working of Universal Language Model Fine Tuning (ULMFiT)

Transfer learning in natural language processing is an area that had not been explored with great success. But last month (May 2018), Jeremy Howard and Sebastian Ruder published the paper - Universal Language Model Fine-tuning for Text Classification - which explores the benefits of using a pre-trained language model for text classification. It proposes ULMFiT, a transfer… Continue reading Understanding the Working of Universal Language Model Fine Tuning (ULMFiT)
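
One of ULMFiT's key ingredients is discriminative fine-tuning, where lower layers of the pre-trained language model get smaller learning rates. Here is a minimal sketch of that idea in plain PyTorch (not the fastai code the post is based on); the stand-in encoder, sizes, and the 2.6 ratio from the paper are used only for illustration.

```python
# Hedged sketch: discriminative fine-tuning via per-layer-group learning rates.
import torch
import torch.nn as nn

encoder_embed = nn.Embedding(10000, 400)          # stand-in for a pre-trained LM
encoder_lstm = nn.LSTM(400, 1150, batch_first=True)
classifier_head = nn.Linear(1150, 2)              # newly added task head

base_lr = 1e-3
optimizer = torch.optim.Adam([
    {"params": encoder_embed.parameters(), "lr": base_lr / 2.6 ** 2},  # lowest group
    {"params": encoder_lstm.parameters(), "lr": base_lr / 2.6},
    {"params": classifier_head.parameters(), "lr": base_lr},           # highest group
])
```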

How to pivot large tables in BigQuery?

Pivoting a table is a very common operation in data processing, but there is no direct function in BigQuery to perform it. To solve this problem, I have written a Python module, BqPivot. It generates a SQL query that pivots a table and can then be run in BigQuery. In this blog post, I will… Continue reading How to pivot large tables in BigQuery?
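
BqPivot's own interface is described in the full post; as a hedged sketch of the general approach, the hypothetical helper below builds the kind of conditional-aggregation query such a module can generate (the table, column names, and pivot values are made up).

```python
# Hedged sketch: generating a BigQuery pivot query via conditional aggregation.
# The function name and all identifiers are hypothetical, not the BqPivot API.
def make_pivot_sql(table, key_col, pivot_col, value_col, pivot_values):
    cases = ",\n  ".join(
        "SUM(IF({p} = '{v}', {val}, 0)) AS {v}".format(p=pivot_col, v=v, val=value_col)
        for v in pivot_values)
    return "SELECT\n  {key},\n  {cases}\nFROM `{table}`\nGROUP BY {key}".format(
        key=key_col, cases=cases, table=table)

print(make_pivot_sql("project.dataset.sales", "store_id",
                     "product", "revenue", ["apples", "oranges"]))
```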

6 Things Every Beginner Should Know To Write Clean Python Code

I studied Java in high school. When I first started writing Python in my freshman year, I used to mentally translate Java to Python. But after a good amount of open-source exposure, I figured out that Python is far cleaner and more idiomatic than Java. In this blog post, I discuss 6 things I wish… Continue reading 6 Things Every Beginner Should Know To Write Clean Python Code
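
The full list is in the post; as a hedged teaser (not necessarily the same 6 points), here are a few idioms that make Python read less like translated Java.

```python
# Hedged sketch: common idioms for cleaner Python; the examples are illustrative.
names = ["ada", "grace", "alan"]

# enumerate instead of a manual index counter
for i, name in enumerate(names):
    print(i, name)

# list comprehension instead of an append loop
upper = [name.upper() for name in names]

# context manager instead of manual open/close
with open("names.txt", "w") as f:
    f.write("\n".join(upper))

# tuple unpacking instead of a temporary swap variable
a, b = 1, 2
a, b = b, a
```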