## NFL Point Spread Model

After completing the DevPost project I decided to take what I learned and apply it to the NFL. I went in knowing that I wouldn’t do very well, but that didn’t stop me from trying to get close.

### Data Collection
My first step was to figure out how to collect data. For the college basketball project I just scraped the NCAA website, and since my inputs were based only on scoring, it was pretty straightforward. With the NFL, I wanted to use more than just the score; I wanted to grab some offensive and defensive stats as well.

To grab the data, I was able to use nflscrapR from Maksim Horowitz, which collects the stats for every game since 2009 into a JSON file. I imported that into a C# project I had created, and a few hundred formatted lines later I had the data.

But I wanted more. I wanted data from 2000 forward to increase my training data, and for those years I used Pro Football Reference. While the JSON was nice and easy, this was not: I had to download the HTML and then use some string manipulation to pull the data out.
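The actual parsing lived in the C# project, but the string-manipulation approach looks roughly like this sketch in Python. The HTML fragment, stat names, and values here are made up for illustration (Pro Football Reference marks table cells with a `data-stat` attribute, which is what this keys off):

```python
# Hypothetical fragment of a Pro Football Reference-style stats row.
html = (
    '<tr><td data-stat="team">DAL</td>'
    '<td data-stat="pts">27</td>'
    '<td data-stat="total_yds">398</td></tr>'
)

def grab(html, stat):
    """Pull the text of the <td> whose data-stat attribute matches."""
    marker = f'data-stat="{stat}">'
    start = html.index(marker) + len(marker)   # first char after the tag
    end = html.index("</td>", start)           # up to the closing tag
    return html[start:end]

points = int(grab(html, "pts"))
yards = int(grab(html, "total_yds"))
```

Fragile compared to a real HTML parser, but for a one-time download of fixed-format pages, indexing into the string gets the job done.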

I now have 4,848 games to use. Breaking that down into an 80/20 split gives me 3,878 games for training and 970 for testing.
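The split itself is simple. A sketch with placeholder records standing in for the parsed games (the shuffle-then-slice approach is an assumption; any random 80/20 partition works the same way):

```python
import random

games = list(range(4848))   # stand-ins for the 4,848 parsed game records
random.seed(42)             # fixed seed so the split is reproducible
random.shuffle(games)       # shuffle so train/test aren't split by season

split = int(len(games) * 0.8)            # 80% of 4,848 -> 3,878
train, test = games[:split], games[split:]
```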

### Model Creation
For the model, I again used TensorFlow v2 and Keras. Since this is a regression project I am using MSE (mean squared error) as my loss function and MAE (mean absolute error) as my accuracy metric.
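To see why MSE makes a good loss while MAE makes the more readable metric, here is a toy calculation with made-up point spreads: squaring the errors makes big misses dominate the loss, while MAE is simply the average number of points the prediction is off by.

```python
# Made-up predicted vs. actual point spreads for four games.
actual    = [-3.0, 7.0, 1.5, -10.0]
predicted = [-1.0, 3.0, 4.5,  -4.0]

errors = [p - a for p, a in zip(predicted, actual)]   # [2, -4, 3, 6]

# MSE (the loss): squaring punishes the 6-point miss far more than the 2.
mse = sum(e ** 2 for e in errors) / len(errors)

# MAE (the metric): "on average, how many points off am I?"
mae = sum(abs(e) for e in errors) / len(errors)
```

For these numbers MSE is 16.25 while MAE is 3.75, so "around 8 points away" below means the MAE, not the loss value.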

I will most likely be tweaking the network layers and nodes for a while to see if I can increase the accuracy. Right now, the model is off by around 8 points on average. I would like to get that under a touchdown (7).

## Master Theorem

Since I started tutoring college-level computer science I have had to relearn a lot of things that I haven’t used since college (both undergrad and masters). One of them is the Master Theorem, which is used to analyze the running time of divide-and-conquer algorithms.

Every time I look at these I have to take a minute to remind myself how to determine the run times. So, this post is to handle just that.

First, the general form of the equations:

$T(n) = aT(\frac{n}{b}) + f(n)$

This has two parts. The first, $aT(\frac{n}{b})$, is the cost of the subproblems created by the divide-and-conquer recursion: $a$ subproblems, each of size $\frac{n}{b}$. The second part, $f(n)$, is the time it takes to divide the problem and combine the subproblem results.

There are 3 cases for determining the running time of this algorithm. Each of them is determined by which part of the equation dominates.

Determine the “cost” of each part of the equation:
To determine the subproblem cost you solve $log_b a$.
To determine the combine cost you look at the exponent ‘c’ in $f(n)$, which is compared against $log_b a$ in each case.

Case 1: $f(n) = O(n^c)$ where $c < log_b a$
Case 2: $f(n) = \Theta(n^c log^k n)$ where $c = log_b a$
Case 3: $f(n) = \Omega(n^c)$ where $c > log_b a$

Case 1: When the work to combine the results is dwarfed by the subproblem work.

$aT(\frac{n}{b}) > f(n)$

Case 2: When the work to combine the results is comparable to the subproblem work.

$aT(\frac{n}{b}) \approx f(n)$

Case 3: When the work to combine the results dominates the subproblem work.

$aT(\frac{n}{b}) < f(n)$

To start, I am going to work through a single instance of each (from Wikipedia) and then give multiple examples of each.

Case 1 Example:
$T(n) = 8T(\frac{n}{2}) + 1000n^2$
First, we need to determine the variables a, b, and c from f(n).
Here, a = 8 and b = 2.
For c: since $f(n) = 1000n^2$ and case 1 says $f(n) = O(n^c)$, we get c=2.

For case 1 we need to have $log_b a > c$.
Doing the math, $log_2 8 = 3$, which is greater than c=2. This confirms we are in case 1.
Using the formula $T(n) = O(n^{log_b a})$ we get $O(n^3)$.

Case 2 Example:
$T(n) = 2T(\frac{n}{2}) + 10n$
First, we need to determine the variables a, b, and c from f(n).
Here, a = 2 and b = 2.
For c and k: since $f(n) = 10n$ and case 2 says $f(n) = O(n^c log^k n)$, we get c=1 and k=0.

For case 2 we need to have $log_b a = c$.
Doing the math, $log_2 2 = 1$, which equals c=1. This confirms we are in case 2.
Using the formula $T(n) = O(n^c log^{k+1} n)$ we get $O(n log n)$.

Case 3 Example:
$T(n) = 2T(\frac{n}{2}) + n^2$
First, we need to determine the variables a, b, and c from f(n).
Here, a = 2 and b = 2.
For c: since $f(n) = n^2$ and case 3 says $f(n) = \Omega(n^c)$, we get c=2.

For case 3 we need to have $log_b a < c$.
Doing the math, $log_2 2 = 1$, which is less than c=2. This confirms we are in case 3.
Using the formula $T(n) = O(f(n))$ we get $O(n^2)$. (Case 3 also requires the regularity condition $a f(\frac{n}{b}) \le k f(n)$ for some constant $k < 1$, which holds here since $2 \cdot \frac{n^2}{4} = \frac{n^2}{2}$.)

## Samples:

Now, I am going to bang through a few examples of each case. Cases 1 and 3 are pretty straightforward, but we start getting some interesting cases in 2.

Each of these will be broken into 4 columns. The first column is the formula, the second is the result of $log_b a$, the third is the value of c, and the last is the notation.

These were pulled from Abdul Bari’s YouTube channel.

Case 1 Samples:
$T(n) = 2T(\frac{n}{2}) + 1$ -> $log_b a$ = 1 -> c=0 -> $O(n^1)$
$T(n) = 4T(\frac{n}{2}) + 1$ -> $log_b a$ = 2 -> c=0 -> $O(n^2)$
$T(n) = 4T(\frac{n}{2}) + n$ -> $log_b a$ = 2 -> c=1 -> $O(n^2)$
$T(n) = 8T(\frac{n}{2}) + n^2$ -> $log_b a$ = 3 -> c=2 -> $O(n^3)$
$T(n) = 16T(\frac{n}{2}) + n^2$ -> $log_b a$ = 4 -> c=2 -> $O(n^4)$

Case 3 Samples:
$T(n) = T(\frac{n}{2}) + n$ -> $log_b a$ = 0 -> c=1 -> $O(n)$
$T(n) = 2T(\frac{n}{2}) + n^2$ -> $log_b a$ = 1 -> c=2 -> $O(n^2)$
$T(n) = 2T(\frac{n}{2}) + n^2 log n$ -> $log_b a$ = 1 -> c=2 -> $O(n^2 log n)$
$T(n) = 4T(\frac{n}{2}) + n^3 log n$ -> $log_b a$ = 2 -> c=3 -> $O(n^3 log n)$
$T(n) = 2T(\frac{n}{2}) + \frac{n^2}{log n}$ -> $log_b a$ = 1 -> c=2 -> $O(\frac{n^2}{log n})$

Case 2 Samples:
Remember, for case 2 you multiply f(n) by one extra $log n$: the exponent goes from k to k+1.
$T(n) = T(\frac{n}{2}) + 1$ -> $log_b a$ = 0 -> c=0 -> $O(log n)$
$T(n) = 2T(\frac{n}{2}) + n$ -> $log_b a$ = 1 -> c=1 -> $O(n log n)$
$T(n) = 2T(\frac{n}{2}) + n log n$ -> $log_b a$ = 1 -> c=1 -> $O(n log^2 n)$
$T(n) = 4T(\frac{n}{2}) + n^2$ -> $log_b a$ = 2 -> c=2 -> $O(n^2 log n)$
$T(n) = 4T(\frac{n}{2}) + (n log n)^2$ -> $log_b a$ = 2 -> c=2 -> $O(n^2 log^3 n)$
$T(n) = 2T(\frac{n}{2}) + \frac{n}{log n}$ -> $log_b a$ = 1 -> c=1 -> $O(n log log n)$
$T(n) = 2T(\frac{n}{2}) + \frac{n}{log^2 n}$ -> $log_b a$ = 1 -> c=1 -> $O(n)$
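The whole case analysis above can be sketched as a small Python helper. This is a simplified version under the usual assumptions: case 3 assumes the regularity condition holds, and the negative-k rows use the extended case 2 rules. It takes a, b, and the c and k from $f(n) = n^c log^k n$:

```python
import math

def master_theorem(a, b, c, k=0):
    """Classify T(n) = a*T(n/b) + Theta(n^c * log^k n) and return the
    tight bound as a string. Case 3 assumes the regularity condition;
    k = -1 and k < -1 follow the extended case 2 rules."""
    crit = round(math.log(a, b), 9)  # log_b a, the subproblem "cost"
    if c < crit:                     # Case 1: subproblems dominate
        return f"Theta(n^{crit:g})"
    if c > crit:                     # Case 3: combine step dominates
        # log^-1 n in the output means n^c / log n
        return f"Theta(n^{c:g} log^{k:g} n)" if k else f"Theta(n^{c:g})"
    # c == crit: Case 2, extended for negative k
    if k > -1:
        return f"Theta(n^{c:g} log^{k + 1:g} n)"
    if k == -1:
        return f"Theta(n^{c:g} log log n)"
    return f"Theta(n^{c:g})"         # k < -1: the log term washes out
```

For example, `master_theorem(8, 2, 2)` reproduces the first worked example, and `master_theorem(2, 2, 1, k=-1)` reproduces the $\frac{n}{log n}$ row.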

## Conclusion:

I hope this helps anyone who is struggling through this stuff. Writing it all down with pen and paper cleared it up for me.

## 500lb Deadlift PR

### Workout

I changed up my workout on January 7th of this year to something called Greasing the Groove, from Soviet strength coach Pavel Tsatsouline. The idea is that you train as often as possible while staying as fresh as possible. You don’t do a typical workout where you wipe out the muscle and your central nervous system.

For me, that means training every weekday but only doing 2 sets of 5 at a weight I can do 10 times: for deadlift that is 315, and for bench press it is 225. I get a little stress from the sets but I am never sore, so I can pick back up the next day. One nice side effect: I don’t have great grip strength, and doing 2×5 at 315 pushes my grip, so that is getting better as well.

This workout is perfect for my age (38) and goals. I don’t need some crazy CrossFit-style workout. First, I don’t need that intensity for what I want to accomplish; more power to those people, but it isn’t for me. Second, I need a workout that can go with me through life.