Notes on “Calculus Blue” Volume 1, Chapter 1

These notes are on the Calculus Blue videos by Ghrist on YouTube.  He emphasizes that the math will involve substantial (and worthwhile) work, which I really appreciate.

01 (0:51) “Vectors & matrices: Intro”  “Your journey is not a short one”.  To learn “calculus, the mathematics of the nonlinear”, prepare with “the mathematics of the linear”.

01 (3:25) “Prologue” definition of multivariate (multiple inputs and multiple outputs).  He asks: why do we care?  (graphs of surfaces, arbitrary dimensions).  Linear algebra will “help us do calculus”.  “Calculus involves approximating nonlinear functions with linear functions”, so start with “the mathematics of linear multivariable functions”.

Why learn about vectors and matrices?  Machine learning, statistics, extracting information from data, geometry (distance, area, volume); determinants will help calculate areas and volumes.

algebra + work + fun.

01.01.00 (0:35) “Lines & planes: intro”.

01.01.01 (3:50) “Formulae for lines & planes”.  Lines in the plane: y = mx + b, (y-y0) = m(x-x0) (point slope form), x/a + y/b = 1 (intercept form).

Examples: a line passing through a point with a particular slope; a line passing through two points.

Orthogonal: the slope of a line orthogonal to a line of slope m is the negative reciprocal, -1/m.
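The line formulas above can be sketched in code.  A minimal example (my own, not from the videos): build y = mx + b through two points via point-slope form, then take the negative reciprocal for the orthogonal slope.

```python
def line_through(p, q):
    """Return (m, b) for y = m*x + b through points p and q (non-vertical)."""
    (x0, y0), (x1, y1) = p, q
    m = (y1 - y0) / (x1 - x0)        # slope from two points
    b = y0 - m * x0                  # point-slope form rearranged: y = m*(x - x0) + y0
    return m, b

m, b = line_through((1, 2), (3, 6))  # slope 2, intercept 0
m_perp = -1 / m                      # orthogonal slope: negative reciprocal
```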

01.01.02 (3:33)  “Implicit planes in 3d”.  These are analogous to lines in the plane.  n1(x-x0) + n2(y-y0) + n3(z-z0) = 0 (point slope form).  x/a + y/b + z/c = 1 (intercept form); he says intercept form shows up in economics.

example: equation of a plane passing through a point and parallel to another plane.
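A quick sketch of that example (names are my own): parallel planes share a normal vector, so the point-slope form n1(x-x0) + n2(y-y0) + n3(z-z0) = 0 pins down the new plane.

```python
def parallel_plane(normal, point):
    """Return (n1, n2, n3, d) for the plane n1*x + n2*y + n3*z = d
    through `point` and parallel to any plane with the given normal."""
    n1, n2, n3 = normal
    x0, y0, z0 = point
    d = n1 * x0 + n2 * y0 + n3 * z0   # expand the point-slope form
    return n1, n2, n3, d

# plane parallel to 2x - y + 3z = 7 through (1, 1, 1): gives 2x - y + 3z = 4
plane = parallel_plane((2, -1, 3), (1, 1, 1))
```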

01.01.03 (5:21) “parameterized lines in 3d”.  Add an “auxiliary variable” (a parameter): x(r) = 3r-5, y(r) = r+3, z(r) = -4r+1.  The name of the parameter doesn’t matter, and shifts or changes to the parameter applied to all three equations don’t change the line.

Examples: find a line through two points in 3-space; find a line orthogonal to a plane and through a specific point.
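Both examples fit one pattern, sketched below with illustrative names of my own: a parameterized line is a base point plus t times a direction vector; through two points the direction is q - p, and for a line orthogonal to a plane the direction is the plane's normal.

```python
def line_through_points(p, q):
    """Parameterization r(t) = p + t*(q - p); r(0) == p, r(1) == q."""
    def r(t):
        return tuple(pi + t * (qi - pi) for pi, qi in zip(p, q))
    return r

def line_orthogonal_to_plane(normal, point):
    """A line orthogonal to a plane uses the plane's normal as its direction."""
    def r(t):
        return tuple(xi + t * ni for xi, ni in zip(point, normal))
    return r

r = line_through_points((3, -5, 1), (0, 3, -4))
# r(0) recovers the first point; r(1) recovers the second
```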

What will happen in higher dimensions?  “hyperplanes”, “subspaces”.

01.01.04 (1:52) “Bonus! Machine learning”.  hyperplanes come up in analyzing data.  A space of images.  A “support vector machine is a hyperplane that optimally separates two types of data points”.  The video illustrates how flat planes usually won’t cut it to separate two datasets, so we’ll need nonlinear ideas (i.e. calculus).
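A toy sketch of the separating-hyperplane idea (hand-picked weights for illustration, nothing here is learned or from the video): a hyperplane w·x = b splits space into two half-spaces, and a point is classified by which side it lands on.

```python
def side(w, b, x):
    """+1 if x is on the positive side of the hyperplane w.x = b, else -1."""
    s = sum(wi * xi for wi, xi in zip(w, x)) - b
    return 1 if s > 0 else -1

w, b = (1.0, 1.0), 3.0       # hyperplane x + y = 3 in the plane
# (4, 2) lands on the +1 side; (0, 1) on the -1 side
```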

01.01 (0:25) “The big picture”: lines and planes are the start of the story!