Blog Posts

March 21st, 2023
  • Machine Learning
  • Uncategorised

Loss Function/Training Loss

In machine learning, the loss function measures how far the model's predictions are from the true targets, so it acts as a proxy for accuracy. The lower the loss, the better the model tends to predict; that is, loss and accuracy generally move in opposite directions.
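
As a rough illustration (not from the post itself), here is a small PyTorch sketch, with made-up logits, showing that confident correct predictions produce a lower cross-entropy loss than unsure ones:

    import torch
    import torch.nn.functional as F

    # Hypothetical 3-class example: the true class for both samples is index 0.
    targets = torch.tensor([0, 0])

    # Confident, correct logits versus unsure logits (illustrative values only).
    confident_logits = torch.tensor([[4.0, -2.0, -2.0],
                                     [5.0, -1.0, -3.0]])
    unsure_logits = torch.tensor([[0.5, 0.4, 0.3],
                                  [0.2, 0.1, 0.0]])

    print(F.cross_entropy(confident_logits, targets))  # low loss (roughly 0.004)
    print(F.cross_entropy(unsure_logits, targets))     # higher loss (roughly 1.0)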

March 10th, 2023
  • Code Fun
  • Machine Learning
  • Pytorch
  • Speculation
  • Tesla

Training Tesla Stock

Credit to Rodolfo Saldanha’s Stock Price Prediction with PyTorch. I just changed this from Amazon to Tesla. In [2]: # This Python 3 environment comes with many helpful analytics libraries installed # It is defined by the kaggle/python Docker image: https://github.com/kaggle/docker-python # For example, here’s several helpful packages to load import numpy as np # linear […]
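
The linked notebook (not reproduced here) windows a closing-price series into fixed-length sequences before training a PyTorch model; the snippet below is only a hypothetical sketch of that windowing step, with made-up file and column names (TSLA.csv, Close) and an arbitrary lookback of 20 days:

    import numpy as np
    import pandas as pd
    import torch

    # Hypothetical CSV with a "Close" column of Tesla closing prices.
    prices = pd.read_csv("TSLA.csv")["Close"].values.astype(np.float32)

    def make_windows(series, lookback=20):
        # Slice a 1-D price series into (lookback, 1) inputs and next-day targets.
        xs, ys = [], []
        for i in range(len(series) - lookback):
            xs.append(series[i:i + lookback])
            ys.append(series[i + lookback])
        return torch.tensor(np.array(xs)).unsqueeze(-1), torch.tensor(ys)

    X, y = make_windows(prices)  # X: (N, 20, 1) sequences, y: (N,) next-day closes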

March 7th, 2023
  • Chat GPT
  • Creation
  • First Principles
  • Midjourney
  • Video

Optimizing Music Video Creation

Music videos tell a story through art. Whether it is the lyrics themselves or the visual imagery, humans can connect with music videos emotionally. First Principles Problem: A music video is a sequence of frames synchronized to some minutes and seconds of audio. Assuming a music video could be instantly created by a neural net, […]
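
To make the frames-synchronized-to-audio framing concrete, here is a trivial calculation sketch; the 24 fps frame rate and 3:30 track length are example numbers, not from the post:

    # Example only: a 3 minute 30 second track rendered at 24 frames per second.
    fps = 24
    duration_seconds = 3 * 60 + 30

    total_frames = fps * duration_seconds
    print(total_frames)  # 5040 frames a generator would have to produce and keep in sync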

March 7th, 2023
  • Chat GPT
  • Creation
  • Machine Learning

10 Underserved Disruptive Product Ideas

Certainly! Here are ten examples of underserved business plans that could benefit from a path of disruption through continuous engineering improvements:
  • Affordable renewable energy solutions for developing countries
  • Sustainable and eco-friendly packaging for food and consumer goods
  • Autonomous vehicles for public transportation and ride-sharing services
  • Low-cost and sustainable housing solutions for low-income families
  • Electric and […]

March 2nd, 2023
  • Midjourney
  • Racarslin

Midjourney Fun

There once was a baby raccoon who loved snuggling his mommy. His name was Racarslin.

February 27th, 2023
  • Machine Learning
  • Pytorch

Notes for – Building makemore Part 3: Activations & Gradients, BatchNorm

Building makemore Part 3 Jupyter Notebook. Recurrent Neural Networks: RNNs are not easily optimizable with the first-order gradient techniques we have available, and the key to understanding why is to understand the activations and their gradients and how they behave during training. In [2]: import […]
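
As a rough illustration of the activation statistics the lecture focuses on (not Karpathy's actual notebook code), here is a sketch that standardizes a hidden pre-activation by hand, batch-norm style, and checks how saturated the tanh units are; the layer sizes are arbitrary:

    import torch

    torch.manual_seed(42)

    # Arbitrary sizes: a batch of 32 examples, 30 input features, 200 hidden units.
    x = torch.randn(32, 30)           # stand-in for embedded inputs
    W = torch.randn(30, 200) * 0.1    # hidden layer weights
    hpreact = x @ W                   # pre-activations before the nonlinearity

    # Batch norm by hand: standardize over the batch dimension so tanh
    # does not saturate and gradients keep flowing.
    hpreact = (hpreact - hpreact.mean(0, keepdim=True)) / hpreact.std(0, keepdim=True)
    h = torch.tanh(hpreact)

    print(hpreact.mean().item(), hpreact.std().item())  # roughly 0 and 1
    print((h.abs() > 0.99).float().mean().item())       # fraction of saturated tanh units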

February 21st, 2023
  • Model 3 Improvements
  • Self Driving
  • Speculation
  • Tesla

Tesla Full Self Driving

Currently in the USA it costs ~$684/mo to drive 1,000 miles with the FSD subscription, insurance and energy costs included. This works out to $0.684 (about $0.69) per mile. How odd. Most cars sit around doing not that much, most of the time. As you decrease that $684 number, you broaden the available market. Even starting with a couple buying […]

February 16th, 2023
  • Bad Defaults
  • The Gimp

Gimp’s default screen grab

This is totally useless: it immediately takes a picture of the active window, which is Gimp itself. The “Select a region to grab” option would be a better default.

January 23rd, 2023
  • Machine Learning
  • Pytorch

Notes for – Building makemore Part 2: MLP

Attempting to scale a table of normalized probability counts grows exponentially with context length. You can use Multi-Layer Perceptrons (MLPs) instead, trained to maximize the log-likelihood of the training data. MLPs make predictions by embedding similar words close together in a space, so that knowledge can transfer between interchangeable words with good confidence. With a vocabulary of 17000 […]
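
A minimal sketch of the embedding idea described above, loosely following the makemore MLP setup but with arbitrary sizes (27 characters, 2-dimensional embeddings, a block size of 3 and a fake batch); this is not the notebook's exact code:

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)

    vocab_size, emb_dim, block_size, hidden = 27, 2, 3, 100

    C = torch.randn(vocab_size, emb_dim)            # embedding table
    W1 = torch.randn(block_size * emb_dim, hidden)  # hidden layer
    b1 = torch.randn(hidden)
    W2 = torch.randn(hidden, vocab_size)            # output layer
    b2 = torch.randn(vocab_size)

    # A fake batch of 4 three-character contexts (integer indices) and their targets.
    Xb = torch.randint(0, vocab_size, (4, block_size))
    Yb = torch.randint(0, vocab_size, (4,))

    emb = C[Xb]                                     # (4, 3, 2): look up each character
    h = torch.tanh(emb.view(4, -1) @ W1 + b1)       # (4, 100)
    logits = h @ W2 + b2                            # (4, 27)
    loss = F.cross_entropy(logits, Yb)              # negative log-likelihood to minimize
    print(loss.item())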


