GDG Denver: RL Talk

I am giving a Reinforcement Learning talk at the GDG Denver group. I decided to upgrade my RL notebooks to TF2 and then add some of the TF-Agents stuff that was announced at Google I/O. As always, this is hosted on my GitHub page https://github.com/ehennis/ReinforcementLearning.

Here is a quick rundown of how I set up the environment to convert my Jupyter Notebooks to TF v2.

Using my old post, I created an environment with a few changes since TF v2 is in beta now.

Commands to setup the environment:

conda create -n gdg_denver python=3.6
activate gdg_denver
pip install tensorflow==2.0.0-beta1
pip install pandas
pip install seaborn
pip install gym
conda install nb_conda
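
To double check that the install grabbed the beta, you can print the version from the command line:

python -c "import tensorflow as tf; print(tf.__version__)"

This should print 2.0.0-beta1.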

Commands to launch the notebooks:

jupyter notebook

Since I am pretty straightforward in my usage of TF and Keras, there wasn’t much to change. Nothing changes as far as ‘import tensorflow as tf’ goes, but we do have to change where we get Keras. That is now ‘from tensorflow import keras’.
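
As a minimal sketch of what a notebook cell looks like after the change (the little model here is hypothetical, just to show the new import path):

import tensorflow as tf
from tensorflow import keras  # previously: import keras

# Throwaway model just to confirm the tf.keras path works
model = keras.Sequential([
    keras.layers.Dense(24, activation='relu', input_shape=(4,)),
    keras.layers.Dense(2)
])
model.compile(optimizer='adam', loss='mse')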

Introducing FishButler!

At the start of the summer I wanted to do something that would keep my nephew (13) and daughter (11) productive and off of YouTube or Fortnite. My plan was to create a fairly simple Android application that they could help me build and get published before they went back to school. FishButler is that application.

I threw them into the deep end with full test coverage (unit and user) and source control. For testing, I used the built-in testing frameworks in Android Studio. I had them create branches, tags, and pull requests in GitHub to make sure they got to experience source control. I wanted them to get the full development life-cycle.
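
A typical task flow looked roughly like this (the branch name and commit message are made up for illustration):

git checkout -b feature/feeding-screen
git add .
git commit -m "Add feeding screen layout"
git push origin feature/feeding-screen

From there they would open a pull request on GitHub, and once it was merged we would tag the release:

git tag v0.1
git push origin v0.1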

To get started, I took a day off of work and had them sit around a table with their computers for a mini hackathon. After that, they went their separate ways and handled the GitHub tasks I assigned them.

My nephew had done some intro programming but nothing at this level. My daughter had done nothing more than watch me. To help, I created 10 documents that covered coding standards as well as Git commands. For each task I did a fairly detailed write-up so they could see a high-level overview of the step and then the commands to do it.

In the future, I would like to add some image recognition as well as other external items like Maps and reports. I will also have them do some more open-ended research tasks to get a better feel for Android as a whole.