Machine Learning Blog Posts
Songs
Phantom Klowns, Duck Song, Tetris, Pink Panther
Notes For – Building makemore Part 5: Building a WaveNet
makemore: part 5

In [1]:
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt # for making figures
%matplotlib inline

In [2]:
# read in all the words
words = open('names.txt', 'r').read().splitlines()
print(len(words))
print(max(len(w) for w in words))
print(words[:8])

32033
15
['emma', 'olivia', 'ava', 'isabella', 'sophia', 'charlotte', 'mia', 'amelia']

In [3]:
# build the vocabulary of characters […]
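The excerpt cuts off right as the character vocabulary is built. For context, a minimal sketch of what that cell typically looks like in the makemore series (`words` is stubbed with a few names here so the snippet runs on its own):

```python
# Stub: in the lecture, `words` comes from names.txt as read above.
words = ['emma', 'olivia', 'ava', 'isabella']

# Collect the unique characters, then map char <-> integer index,
# reserving index 0 for the special '.' start/end token.
chars = sorted(set(''.join(words)))
stoi = {s: i + 1 for i, s in enumerate(chars)}
stoi['.'] = 0
itos = {i: s for s, i in stoi.items()}
vocab_size = len(itos)
print(vocab_size)
```

On the full names.txt this yields a vocabulary of 27 (26 letters plus the '.' token).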
What are we going to do when the machines are better than us at everything?
The pace of AI has been insane. Humans have challenges: health, alertness, impairment, dietary needs, bathroom breaks, repetitive strain injuries, etc. Robots just go as long as the electrons flow. But what about the mind? Surely we will be more creative than the robots, right? Nope. Not only can a program generate art in seconds (what […]
Notes for – Building makemore Part 4: Becoming a Backprop Ninja
makemore: becoming a backprop ninja (swole doge style)

In [52]:
# there is no change in the first several cells from last lecture

In [53]:
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt # for making figures
%matplotlib inline

In [54]:
# read in all the words
words = open('names.txt', 'r').read().splitlines()
print(len(words))
print(max(len(w) for w in […]
Jungle Beats
I could have the baddest jungle beats. Why do I not have them yet? Because I have not trained my neural net.
Tesla Stock Prediction 2023-03-21
In [483]:
# This Python 3 environment comes with many helpful analytics libraries installed
# It is defined by the kaggle/python Docker image: https://github.com/kaggle/docker-python
# For example, here's several helpful packages to load

import numpy as np # linear algebra
import pandas as pd # data processing, CSV file I/O (e.g. pd.read_csv)

# Input data files […]
Loss Function/Training Loss
In machine learning, the loss function measures how wrong the model's predictions are: the lower the loss, the better the model predicts. Loss and accuracy move in opposite directions, but the relationship is inverse, not strictly proportional; as loss decreases, accuracy tends to rise.
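As a concrete sketch with toy numbers (not taken from any of the posts above): cross-entropy, a common loss for classification, is the negative log probability the model assigns to the correct class, so a more confident correct prediction gives a lower loss.

```python
import math

def cross_entropy(probs, true_class):
    # negative log likelihood of the correct class
    return -math.log(probs[true_class])

# Two models predicting class 0; the more confident one gets the lower loss.
loss_ok     = cross_entropy([0.70, 0.20, 0.10], 0)  # ~0.357
loss_better = cross_entropy([0.95, 0.03, 0.02], 0)  # ~0.051
print(loss_ok, loss_better)
```

Note the non-linearity: probability 0.95 versus 0.70 on the correct class cuts the loss by roughly a factor of seven, which is why loss keeps improving even after accuracy has plateaued.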