The AWD-LSTM has been dominating state-of-the-art language modeling. Most of the top research papers on word-level models incorporate AWD-LSTMs, and it has shown strong results on character-level models as well. In this blog post, I go through the research paper, Regularizing and Optimizing LSTM Language Models, that introduced the AWD-LSTM and try to explain … Continue reading What makes the AWD-LSTM great?
Universal embeddings of text data have been widely used in natural language processing. They involve encoding words or sentences into fixed-length numeric vectors, pre-trained on a large text corpus, that can be used to improve the performance of other NLP tasks (like classification or translation). While word embeddings have been massively popular and … Continue reading A Walkthrough of InferSent – Supervised Learning of Sentence Embeddings
PyTorch is a promising Python library for deep learning. I have been learning it for the past few weeks, and I am impressed by its ease of use and flexibility. In this blog post, I will go through a feed-forward neural network for tabular data that uses embeddings for categorical variables. If you want to understand the … Continue reading A Neural Network in PyTorch for Tabular Data with Categorical Embeddings
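The full walkthrough is in the post itself, but a minimal sketch of the kind of model it describes might look like the following. The layer sizes, column cardinalities, and names (`TabularNet`, `cat_cardinalities`) are all illustrative assumptions, not the post's actual code:

```python
import torch
import torch.nn as nn

class TabularNet(nn.Module):
    """Feed-forward net: one embedding table per categorical column,
    concatenated with the raw numeric features."""
    def __init__(self, cat_cardinalities, n_numeric, emb_dim=4, hidden=32):
        super().__init__()
        # one learnable embedding table per categorical column
        self.embs = nn.ModuleList(
            nn.Embedding(card, emb_dim) for card in cat_cardinalities
        )
        in_features = emb_dim * len(cat_cardinalities) + n_numeric
        self.mlp = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x_cat, x_num):
        # x_cat: (batch, n_cat) integer category codes
        # x_num: (batch, n_numeric) float features
        embedded = [emb(x_cat[:, i]) for i, emb in enumerate(self.embs)]
        x = torch.cat(embedded + [x_num], dim=1)
        return self.mlp(x)

# two categorical columns (10 and 5 categories) plus 3 numeric features
model = TabularNet(cat_cardinalities=[10, 5], n_numeric=3)
out = model(torch.randint(0, 5, (8, 2)), torch.randn(8, 3))
print(out.shape)  # torch.Size([8, 1])
```

The embeddings let the network learn a dense representation for each category instead of relying on one-hot columns.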
Transfer learning in natural language processing is an area that had not seen much success until recently. But last month (May 2018), Jeremy Howard and Sebastian Ruder published the paper Universal Language Model Fine-tuning for Text Classification, which explores the benefits of using a pre-trained model for text classification. It proposes ULMFiT, a transfer … Continue reading Understanding the Working of Universal Language Model Fine Tuning (ULMFiT)
Pivoting a table is a very common operation in data processing, but there is no direct function in BigQuery to perform such an operation. To solve this problem, I have written a Python module, BqPivot. It generates a SQL query to pivot a table, which can then be run in BigQuery. In this blog post, I will … Continue reading How to pivot large tables in BigQuery?
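BqPivot's actual interface is described in the post, not here, but the core idea — generating one conditional aggregate per distinct value of the pivot column — can be sketched in a few lines of plain Python. The function name and the table/column names below are made up for illustration:

```python
def build_pivot_sql(table, key_col, pivot_col, value_col, pivot_values, agg="SUM"):
    """Generate a SQL query that pivots `pivot_col` into one column per value.

    Without a built-in pivot function, the query emits one conditional
    aggregate (e.g. SUM(IF(...))) for each distinct pivot value.
    """
    cases = ",\n  ".join(
        f"{agg}(IF({pivot_col} = '{v}', {value_col}, NULL)) AS {pivot_col}_{v}"
        for v in pivot_values
    )
    return f"SELECT\n  {key_col},\n  {cases}\nFROM `{table}`\nGROUP BY {key_col}"

sql = build_pivot_sql("project.dataset.sales", "store_id", "quarter",
                      "revenue", ["q1", "q2"])
print(sql)
```

Generating the SQL client-side keeps the heavy lifting inside BigQuery, which matters when the table is too large to pivot in pandas.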
I studied Java in high school. When I first started writing Python in my freshman year, I used to mentally translate Java into Python. But after a good amount of open-source exposure, I figured out that Python is far cleaner and more idiomatic than Java. In this blog post, I discuss 6 things I wish … Continue reading 6 Things Every Beginner Should Know To Write Clean Python Code
Deep learning algorithms require a huge amount of training data. This tempts us to put more and more labeled data into our training set, even if it does not come from the same distribution as the data we are actually interested in. For example, let's say we are building a cat classifier for door camera devices. We … Continue reading What to do when we have mismatched training and validation set?
Earlier, I was of the opinion that getting computers to recognize images requires a huge amount of data, carefully designed neural network architectures, and lots of coding. But after taking the fast.ai deep learning course, I found out that this is not always true. We can achieve a lot by writing just a few lines … Continue reading Hotdog or Not Hotdog – Image Classification in Python using fastai
Activation functions are an integral component of neural networks. There are a number of common activation functions, so it often gets confusing which one is best suited for a particular task. In this blog post I will talk about: Why do we need activation functions in neural networks? Output layer activation … Continue reading Which activation function to use in neural networks?
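As a quick reference for the distinction the post draws between hidden-layer and output-layer activations, here are the common functions in plain Python. This is a pedagogical sketch only; in practice these come built into frameworks such as PyTorch or TensorFlow:

```python
import math

def sigmoid(x):
    # squashes to (0, 1); typical for binary-classification outputs
    return 1 / (1 + math.exp(-x))

def tanh(x):
    # zero-centred squashing to (-1, 1)
    return math.tanh(x)

def relu(x):
    # the usual default for hidden layers: cheap, and does not saturate for x > 0
    return max(0.0, x)

def softmax(xs):
    # output activation for multi-class classification; values sum to 1
    # subtracting max(xs) is the standard trick for numerical stability
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu(-2.0), relu(3.0))  # 0.0 3.0
print(sigmoid(0.0))           # 0.5
```

Roughly: ReLU for hidden layers, sigmoid for binary outputs, softmax for multi-class outputs — the post goes into when these defaults break down.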
As promised, this is the second post in my two-part blog series on time series modelling and forecasting. In my first blog post, I discussed the basics of time series analysis and gave a theoretical overview. In case you missed it, you can find it here - Understanding Time Series Modelling and Forecasting, Part 1 … Continue reading Understanding Time Series Modelling and Forecasting, Part 2