This post introduces Lyrics Genius, a new library I developed for generating song lyrics in the style of a given singer or group. Lyrics Genius is available on GitHub. Link to repository.
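The post does not spell out how Lyrics Genius works internally, but as a rough illustration of the general idea, here is a minimal sketch of one classic baseline for style-conditioned text generation: a word-level Markov chain trained on an artist's lyrics. The function names (`build_chain`, `generate`) are hypothetical and not part of the library's API.

```python
import random
from collections import defaultdict

def build_chain(lyrics, order=2):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    words = lyrics.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    """Walk the chain from a random starting key, sampling one word at a time."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    order = len(key)
    for _ in range(length - order):
        followers = chain.get(tuple(out[-order:]))
        if not followers:  # dead end: no continuation seen in the corpus
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

A larger corpus and a higher `order` produce output that sounds more like the source artist, at the cost of copying longer verbatim runs.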
Since I started working, I haven't had much time for Kaggle competitions. Competing on Kaggle, however, is a great way to stay in shape (in terms of data science skills, of course) and to try out new techniques. So I set myself the goal of joining a competition this week and trying out a new model.
I often find myself competing on Kaggle and training classifiers on my local machine. Kaggle's notebooks are sluggish most of the time, and training a model on a laptop can be painfully slow. This can take the fun out of a competition: you wait a long time only to realise that you forgot to change a hyperparameter.
A while ago, during a Master's course, I had to do a project using Convolutional Neural Networks. At first it seemed a daunting task, but it ended up being a fun and exciting project. The goal was to classify aerial image pixels into road and non-road pixels. Although the problem might look simple at first, several characteristics of aerial road images make it hard to classify them correctly.
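The post does not detail the network architecture, but the basic building block of any CNN for this task is the 2D convolution. As a self-contained sketch (a naive NumPy implementation, not the project's code), here is a valid-mode convolution applied with a vertical-edge kernel, the kind of low-level feature a trained network's first layer often resembles:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) over a grayscale image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Elementwise product of the kernel with the image patch, then sum.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A Sobel-style kernel that responds strongly to vertical edges,
# such as the boundary between road asphalt and its surroundings.
vertical_edge = np.array([[-1.0, 0.0, 1.0],
                          [-2.0, 0.0, 2.0],
                          [-1.0, 0.0, 1.0]])
```

A real segmentation network stacks many such learned kernels with nonlinearities and pooling, and predicts a road/non-road label per pixel rather than per image.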
This post is about gradient descent and the variants and optimizations that exist to make it converge faster or to adapt it to particular settings.
In this document, I have tried to provide a simple introduction to Lambda Calculus with several examples. I have also included some important problems in computation that can be addressed with lambda calculus, such as the Decision Problem and the Turing completeness of Lambda Calculus.
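To give a flavour of the kind of example such an introduction contains, here is the standard Church encoding of natural numbers expressed with Python lambdas (a common way to demonstrate that pure functions alone can represent data and arithmetic):

```python
# Church numerals: the number n is the function that applies f to x exactly n times.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))          # successor: one extra application
PLUS = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # addition: compose the applications

def to_int(n):
    """Decode a Church numeral by counting how many times it applies its argument."""
    return n(lambda k: k + 1)(0)
```

Encodings like this, together with a fixed-point combinator for recursion, are the core of the Turing-completeness argument for the lambda calculus.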
The following project is a simple implementation of a path planner for a robot, combining global navigation and local navigation.
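The project's actual planner is not reproduced here, but the global-navigation half of such a system is typically a graph search over a map. As an illustrative sketch (assumed names, not the project's code), here is breadth-first search over a 4-connected occupancy grid, returning a shortest cell-to-cell path:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on a 4-connected occupancy grid via breadth-first search.

    grid[r][c] == 1 marks a blocked cell; start and goal are (row, col) tuples.
    Returns the path as a list of cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}  # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor links back to the start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None
```

A local navigator would then follow this global path while reacting to obstacles the map does not know about.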