What makes the AWD-LSTM great?

The AWD-LSTM has dominated state-of-the-art language modeling. All the top research papers on word-level models incorporate AWD-LSTMs, and it has shown great results on character-level models as well (Source). In this blog post, I go through the research paper Regularizing and Optimizing LSTM Language Models, which introduced the AWD-LSTM, and try to explain … Continue reading What makes the AWD-LSTM great?
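
The core regularizer behind the AWD-LSTM is DropConnect applied to the recurrent (hidden-to-hidden) weights: individual weights, rather than activations, are randomly zeroed on each forward pass. Here is a minimal sketch of the DropConnect idea on a plain linear layer; this is an illustration of the general technique, not the paper's actual implementation, which wraps the LSTM's weight_hh matrix.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropConnectLinear(nn.Module):
    """DropConnect: randomly zero individual *weights* (not activations) on
    each forward pass. The AWD-LSTM applies this idea to the LSTM's
    hidden-to-hidden weight matrix."""
    def __init__(self, in_features, out_features, weight_p=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.weight_p = weight_p

    def forward(self, x):
        # a fresh random mask over the weights is drawn on every call in training mode
        w = F.dropout(self.weight, p=self.weight_p, training=self.training)
        return F.linear(x, w, self.bias)

layer = DropConnectLinear(16, 8)
out = layer(torch.randn(4, 16))   # shape: (4, 8)
```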


A Walkthrough of InferSent – Supervised Learning of Sentence Embeddings

Universal embeddings of text data are widely used in natural language processing. They encode words or sentences into fixed-length numeric vectors that are pre-trained on a large text corpus and can be used to improve the performance of other NLP tasks (such as classification and translation). While word embeddings have been massively popular and … Continue reading A Walkthrough of InferSent – Supervised Learning of Sentence Embeddings
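
As a toy illustration of what "one fixed-length numeric vector per sentence" means, the sketch below simply averages hypothetical pre-trained word vectors. InferSent itself instead trains a BiLSTM encoder on natural language inference data, but the output contract is the same: every sentence maps to a vector of the same size.

```python
import numpy as np

# Hypothetical pre-trained word vectors (in practice these would come from a
# model trained on a large corpus); 4 dimensions here just for illustration.
word_vectors = {
    "the": np.array([0.1, 0.3, -0.2, 0.5]),
    "cat": np.array([0.7, -0.1, 0.4, 0.2]),
    "sat": np.array([-0.3, 0.6, 0.1, -0.4]),
}

def sentence_embedding(sentence, dim=4):
    """Encode a sentence as a single fixed-length vector by averaging the
    vectors of its known words -- a simple baseline, not InferSent itself."""
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

emb = sentence_embedding("The cat sat")
print(emb.shape)   # (4,) -- same length regardless of sentence length
```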


A Neural Network in PyTorch for Tabular Data with Categorical Embeddings

PyTorch is a promising Python library for deep learning. I have been learning it for the past few weeks, and I am impressed by its ease of use and flexibility. In this blog post, I will go through a feed-forward neural network for tabular data that uses embeddings for categorical variables. If you want to understand the … Continue reading A Neural Network in PyTorch for Tabular Data with Categorical Embeddings
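
A rough sketch of the kind of network the post describes (the column counts, cardinalities, and layer sizes below are made up for illustration): each categorical column gets its own nn.Embedding, the embedding outputs are concatenated with the numeric features, and the result passes through fully connected layers.

```python
import torch
import torch.nn as nn

class TabularNet(nn.Module):
    """Feed-forward net for tabular data: one embedding per categorical
    column, concatenated with the numeric columns."""
    def __init__(self, cat_cardinalities, emb_sizes, n_numeric, n_classes):
        super().__init__()
        self.embeddings = nn.ModuleList(
            nn.Embedding(card, size)
            for card, size in zip(cat_cardinalities, emb_sizes))
        in_features = sum(emb_sizes) + n_numeric
        self.layers = nn.Sequential(
            nn.Linear(in_features, 64), nn.ReLU(),
            nn.Linear(64, n_classes))

    def forward(self, x_cat, x_num):
        # look up each categorical column in its own embedding table
        embs = [emb(x_cat[:, i]) for i, emb in enumerate(self.embeddings)]
        x = torch.cat(embs + [x_num], dim=1)
        return self.layers(x)

# two categorical columns (10 and 4 levels), three numeric columns
model = TabularNet([10, 4], [5, 2], n_numeric=3, n_classes=2)
x_cat = torch.randint(0, 4, (8, 2))    # batch of 8 rows of integer category codes
x_num = torch.randn(8, 3)
print(model(x_cat, x_num).shape)       # torch.Size([8, 2])
```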

What to do when we have mismatched training and validation set?

Deep learning algorithms require a huge amount of training data. This tempts us to put more and more labeled data into our training set, even if it does not come from the same distribution as the data we are actually interested in. For example, let's say we are building a cat classifier for door camera devices. We … Continue reading What to do when we have mismatched training and validation set?
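
One common recipe for this situation, sketched below with made-up numbers and integer arrays standing in for the actual examples, is to build the dev and test sets only from the distribution we care about (the door-camera images) and use the extra, mismatched data for training only.

```python
import numpy as np

# Hypothetical setup: 200k generic web cat images vs 10k door-camera images;
# the door-camera distribution is what the model will actually face.
web_images = np.arange(200_000)
door_cam_images = np.arange(10_000)

rng = np.random.default_rng(0)
rng.shuffle(door_cam_images)

# Dev and test come *only* from the target distribution; the mismatched
# web data goes into training only.
dev, test = door_cam_images[:2_500], door_cam_images[2_500:5_000]
train = np.concatenate([web_images, door_cam_images[5_000:]])

print(len(train), len(dev), len(test))   # 205000 2500 2500
```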


Understanding Time Series Modelling and Forecasting, Part 2

As promised, this is the second post in my two-part blog series on time series modelling and forecasting. In my first blog post, I discussed the basics of time series analysis and gave a theoretical overview. In case you missed it, you can find it here - Understanding Time Series Modelling and Forecasting, Part 1 … Continue reading Understanding Time Series Modelling and Forecasting, Part 2


Understanding Time Series Modelling and Forecasting – Part 1

Time series forecasting is extensively used in numerous practical fields such as business, economics, finance, science, and engineering. The main aim of time series analysis is to forecast future values of a variable using its past values. In this post, I will give you a detailed introduction to time series modelling. This would be the … Continue reading Understanding Time Series Modelling and Forecasting – Part 1
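
As a small preview of the kind of model the series builds up to, here is a minimal ARIMA fit on a synthetic series using statsmodels. The order (p, d, q) below is an arbitrary placeholder rather than a value chosen from the diagnostics (ACF/PACF plots, information criteria) that the posts discuss.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic series just for illustration: a random walk with drift.
rng = np.random.default_rng(42)
y = np.cumsum(rng.normal(0.5, 1.0, size=120))

# ARIMA(1, 1, 1): 1 autoregressive lag, 1 difference, 1 moving-average lag.
model = ARIMA(y, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=12)   # predict the next 12 points
print(forecast[:3])
```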


How to One Hot Encode Categorical Variables of a Large Dataset in Python?

In this post, I will discuss a very common problem we face in machine learning tasks: how to handle categorical data, especially when the entire dataset is too large to fit in memory. I will talk about how to represent categorical variables, the common problems we face while one hot … Continue reading How to One Hot Encode Categorical Variables of a Large Dataset in Python?
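
A sketch of one workable approach, assuming the category levels are known up front (the file name and column name below are hypothetical): read the data in chunks, encode each chunk against a fixed category list so the columns always line up, and keep the output sparse to avoid materialising a huge dense matrix.

```python
import pandas as pd
from scipy import sparse
from sklearn.preprocessing import OneHotEncoder

# Hypothetical CSV too large to load at once; column name is made up.
categories = [["red", "green", "blue"]]   # known levels for the "colour" column
encoder = OneHotEncoder(categories=categories, handle_unknown="ignore")

encoded_chunks = []
for chunk in pd.read_csv("big_dataset.csv", usecols=["colour"],
                         chunksize=100_000):
    # With the category list fixed above, every chunk produces the same
    # columns in the same order, so the pieces can be stacked safely.
    encoded_chunks.append(encoder.fit_transform(chunk[["colour"]]))

X = sparse.vstack(encoded_chunks)   # one sparse matrix for the whole dataset
print(X.shape)
```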