Semester-long graduate school class, taken at Johns Hopkins University - weekly assignment descriptions below.
Week 1 - Introduction to Machine Learning and Neural Networks (no corresponding files)
Received an overview of the structure of the course as well as an introduction to machine learning and neural networks.
Week 2 - Python + Pandas + Numpy Basics
Became comfortable with using Python, Jupyter Notebooks, Pandas, and Numpy.
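A minimal sketch of the kind of Pandas/NumPy exercise this week covered; the column names and values are made up for illustration, not from the assignment:

```python
import numpy as np
import pandas as pd

# Build a small DataFrame from NumPy-generated data (hypothetical example values)
rng = np.random.default_rng(seed=0)
df = pd.DataFrame({
    "height_cm": rng.normal(170, 10, size=5),
    "weight_kg": rng.normal(70, 8, size=5),
})

# Basic inspection and vectorized column math
print(df.describe())
df["bmi"] = df["weight_kg"] / (df["height_cm"] / 100) ** 2
print(df.head())
```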
Week 3 - Calculus and Linear Algebra's Roles in Machine Learning
In this module I explored the role of Calculus in Machine Learning, namely in optimization to find the best-fit model. Additionally, I looked at how equations extend from single-variable inputs to multivariable inputs through Linear Algebra. I explored how Linear Regression uses Calculus to find the best-fit model through optimization, in both one dimension and multiple dimensions.
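As a rough illustration of how Calculus (gradients) drives the optimization, here is a hedged sketch of one-dimensional Linear Regression fit by gradient descent; the synthetic data and learning rate are my own assumptions, not from the assignment:

```python
import numpy as np

# Synthetic 1-D data: y ≈ 3x + 2 plus noise (assumed for illustration)
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 3 * x + 2 + rng.normal(0, 1, size=100)

# Minimize mean squared error J(w, b) = mean((w*x + b - y)^2) via its gradients
w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    err = w * x + b - y
    w -= lr * 2 * np.mean(err * x)   # dJ/dw
    b -= lr * 2 * np.mean(err)       # dJ/db

print(f"fitted w ≈ {w:.2f}, b ≈ {b:.2f}")  # should approach 3 and 2
```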
Week 4 - Practical Linear Regression and Cost Functions
I focused on Linear Regression in a more rigorous and practical manner. I explored different ways to use Linear Regression with a real dataset. I looked at the cost function and how the choice of function can improve or hurt results.
Introduced to train/test/validation splits for developing robust models; a workflow sketch follows below.
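A hedged sketch of that practical workflow (train/test split, fit, evaluate a cost such as mean squared error) using scikit-learn; the CSV path and column names are placeholders, not the actual course dataset:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Placeholder dataset and column names, assumed for illustration
df = pd.read_csv("housing.csv")
X = df[["sqft", "bedrooms"]]
y = df["price"]

# Hold out a test set so the reported error reflects unseen data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
print("test MSE:", mean_squared_error(y_test, model.predict(X_test)))
```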
Week 6 - From Regression to Classification: Logistic Regression
Introduced to the concept of classification, with emphasis on the Logistic Function.
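A brief sketch of the logistic (sigmoid) function and a scikit-learn classifier built on it; the toy data here is assumed for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# The logistic function squashes any real number into (0, 1),
# which lets a linear model output class probabilities.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # ≈ [0.12, 0.5, 0.88]

# Toy binary classification data (assumed): label is 1 when the feature exceeds 5
X = np.linspace(0, 10, 50).reshape(-1, 1)
y = (X.ravel() > 5).astype(int)

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([[2.0], [8.0]]))  # probability of each class
```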
Week 7 - Garbage In, Garbage Out (Data Munging): Extracting Relevant Features from the Data
Exposed to techniques to help clean data so that it can be fed properly into an algorithm to generate the most effective model.
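A sketch of typical cleaning and feature-extraction steps with Pandas; the column names, values, and imputation rules are invented for illustration:

```python
import pandas as pd

# Hypothetical raw data with missing values and a categorical column
raw = pd.DataFrame({
    "age": [25, None, 47, 33],
    "income": ["50k", "62k", None, "41k"],
    "city": ["Baltimore", "Boston", "Baltimore", None],
})

clean = raw.copy()
clean["age"] = clean["age"].fillna(clean["age"].median())          # impute missing ages
clean["income"] = (clean["income"].str.rstrip("k").astype(float)   # "50k" -> 50.0
                   .fillna(0) * 1000)
clean = pd.get_dummies(clean, columns=["city"], dummy_na=True)     # one-hot encode category
print(clean)
```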
Week 9 - Unsupervised Learning Part 1
Exposed to algorithms that learn from unlabeled data, i.e., without supervision.
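A minimal clustering sketch (k-means from scikit-learn) as one example of an algorithm that needs no labels; the blob data is synthetic and assumed for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic, unlabeled points grouped around 3 centers (assumed for illustration)
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# k-means discovers the grouping without ever seeing labels
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.cluster_centers_)
print(kmeans.labels_[:10])
```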
Week 10 - Natural Language Processing (NLP)
In this module I covered working with text data and representations such as TF-IDF and Word2Vec. These techniques convert text into meaningful numerical features, allowing the data to be modeled.
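A short sketch of converting text into numeric features with TF-IDF via scikit-learn; the sentences are made up, and Word2Vec would follow a similar pattern with a library such as gensim:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Tiny made-up corpus for illustration
docs = [
    "machine learning with text data",
    "text data requires numerical features",
    "neural networks model numerical data",
]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(docs)   # sparse matrix: documents x vocabulary
print(vectorizer.get_feature_names_out())
print(tfidf.toarray().round(2))          # each row is now a numeric vector
```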
Week 11 - Neural Networks: From Support Vector Machines to Logistic Regression
In this module I made the jump to neural networks. Introduced to the diagrams and graphs that represent these networks and some of the math behind them. Shown how adding layers or neurons can improve performance relative to a Support Vector Machine.
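A hedged sketch comparing a Support Vector Machine to a small multi-layer network on data that is not linearly separable (two moons); the dataset choice and layer sizes are my own assumptions, not the course's:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Non-linearly-separable toy data (assumed for illustration)
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

svm = SVC(kernel="linear").fit(X_train, y_train)
mlp = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                    random_state=0).fit(X_train, y_train)

print("linear SVM accuracy:", svm.score(X_test, y_test))
print("2-hidden-layer MLP accuracy:", mlp.score(X_test, y_test))
```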
Week 12 - Neural Networks (Deep Learning)
Introduced to different Neural Network architectures. I explored when to use each architecture and which models fit specific situations.
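As one example of a deep learning architecture, here is a hedged sketch of a small fully connected network in Keras; the framework, layer sizes, and dataset are my assumptions, not necessarily what the course used:

```python
import tensorflow as tf

# MNIST digits as a stand-in dataset (assumed for illustration)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Simple feed-forward (dense) architecture; CNNs or RNNs would swap in other layer types
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))
```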