Day 2 TensorFlow voter data project environment setup

Things are moving along. Right now I have 23 days left to submit my current project to the #PoweredByTF 2.0 Challenge. That should not be a problem, but you never know what is going to happen. I am fully registered, and I set up a new GitHub project just for this effort.

  • My environment setup is going to be twofold for this project. My primary efforts will occur on the Google Cloud Platform, and my secondary effort will occur on Colaboratory. If you have not heard of Colaboratory, you are missing out on something awesome. It is a free, very easy-to-use Jupyter notebook environment run by Google.
  • My initial build-out included the following efforts on GCP:
    • GCP –> Artificial Intelligence –> AI Platform –> Notebooks –> TensorFlow –> with GPU (us-west1-b, 4 vCPUs, 15 GB RAM, 1 NVIDIA Tesla K80, 100 GB Disk)
    • That yielded a strange error: “Quota ‘GPUS_ALL_REGIONS’ exceeded. Limit: 0.0”.
    • The solution was to head out to the IAM & admin section of GCP, look up the quota for GPUS_ALL_REGIONS, and request access to another GPU.
    • I found the right global quota to change: GLOBAL Attribute | GPUS_ALL_REGIONS, and asked for a quota increase using the handy dandy edit quotas button at the top of the IAM & admin –> Quotas screen.
    • I got help from Julia over at Google support within a couple of hours, then went back and made the notebook environment setup request again.
    • Keep in mind that this setup requires committing funds to the project, but I wanted a dedicated GPU all to myself for the stretch run, and it seemed like something that would bring me joy.
      • The alternative here would be to use Colaboratory instead of paying for a dedicated GPU on GCP
    • Everything is now set up to run TensorFlow in a notebook on my own VM with a sweet NVIDIA Tesla K80 GPU
    • You can grab the instructions I used to true it up to 2.0.0-alpha0 here
  • Setting up on Colaboratory took a matter of seconds
    • Go check out https://colab.research.google.com/
    • I’m going to be running a parallel set of notebooks throughout the whole process. My primary development will occur on GCP, and I’ll test each notebook on Colaboratory and use its export-to-GitHub feature to save the project off for submission
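For anyone who prefers the command line, the GCP steps above can also be sketched with the gcloud SDK instead of the console UI. This is a hedged sketch, not exactly what the AI Platform Notebooks wizard does behind the scenes: the instance name is made up, and the Deep Learning VM image family is an assumption about what is available in your project.

```shell
# Check the project's GPU quota (the GPUS_ALL_REGIONS limit that blocked me)
gcloud compute project-info describe --format=json | grep -A 1 GPUS_ALL_REGIONS

# After the quota increase is approved, create a GPU VM matching the setup above.
# The name "tf2-voter-notebook" is illustrative; GPU instances require
# --maintenance-policy=TERMINATE because they cannot live-migrate.
gcloud compute instances create tf2-voter-notebook \
    --zone=us-west1-b \
    --machine-type=n1-standard-4 \
    --accelerator=type=nvidia-tesla-k80,count=1 \
    --boot-disk-size=100GB \
    --maintenance-policy=TERMINATE \
    --image-family=tf-latest-gpu \
    --image-project=deeplearning-platform-release

# On the VM (or in a Colab cell), pin TensorFlow to the 2.0 alpha and verify
pip install tensorflow-gpu==2.0.0-alpha0
python -c "import tensorflow as tf; print(tf.__version__)"
```

These commands incur charges as soon as the instance is running, so stop or delete it between work sessions.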

Day 1 TensorFlow voter data project research

Today I started the process of conducting some literature searches and began digging into what work exists related to people using TensorFlow on voter data. The following observations are my notes entered in real time for later review. Previously I used EndNote to capture all my citations, but this time around I thought it might be better to just write it all down during this first pass at digging into what exists publicly.

Today I signed up for the #PoweredByTF 2.0 Challenge

Earlier today it seemed like a good idea to sign up for a TensorFlow contest on DevPost that ends in about 27 days. My efforts are going to focus on using the TensorFlow 2.0 Alpha release to model changes in voting patterns. I’m going to train the model to look for irregularities. My best guess at the moment is that within the next 27 days I can set up my Google Cloud Platform instance and get the model running. The #PoweredByTF 2.0 Challenge on DevPost will be a good test of my skills. Maybe jumping into a few coding challenges is a good way to push myself forward and enhance my programming toolkit.
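To make the plan a little more concrete, here is a minimal sketch of one way to "look for irregularities" in TensorFlow 2.0: a small tf.keras autoencoder trained to reconstruct typical voting-pattern features, where rows with unusually high reconstruction error get flagged. Everything here is an assumption for illustration, not the final design: the data is random stand-in numbers, and the feature count, layer sizes, and 99th-percentile cut-off are all placeholders.

```python
import numpy as np
import tensorflow as tf

# Illustrative stand-in for per-precinct features (turnout, party shares, etc.)
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(1000, 8)).astype("float32")

# A small autoencoder: compress the 8 features to 3 dims, then reconstruct.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8),
])
model.compile(optimizer="adam", loss="mse")
model.fit(normal, normal, epochs=5, batch_size=64, verbose=0)

# Rows the model reconstructs poorly are candidate irregularities.
errors = np.mean((model.predict(normal, verbose=0) - normal) ** 2, axis=1)
threshold = np.percentile(errors, 99)  # arbitrary cut-off for this sketch
print("flagged rows:", int(np.sum(errors > threshold)))
```

On real voter data the interesting work is in the feature engineering and in choosing a defensible threshold; the autoencoder itself stays this simple.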

Maybe this will be a good reason for me to go on a video diet for the next 27 days. I have already stopped consuming most social media, and avoiding video for the next 27 days seems like a logical next step. It could be a good method to really dig in and do some deep learning about TensorFlow. I have done a bunch of certifications and used TensorFlow for a variety of reasons, but none of them were related to a GitHub project. This will be my first effort to post a set of notebooks on GitHub so people can replicate my work. Maybe this will be a good test of my coding skills and my paper-writing skills at the same time.

Finishing up a few advanced machine learning courses

Throughout the year I have been focusing on some professional development related to advanced machine learning. Today I finished the Advanced Machine Learning with TensorFlow on Google Cloud Platform five-course specialization from Coursera and the Google Cloud team. It was fun to really dig in with TensorFlow and learn what the right tools can help you accomplish. It is an amazing time: you can check out the Advanced Machine Learning with TensorFlow on Google Cloud Platform specialization with just a web browser and a small fee, and that gives you access to a ton of videos, labs, and quizzes.

I really enjoy learning and digging into things. Part of being a true lifelong learner is just jumping in and learning new things. Some of that can be done by reading books and enjoying the written word in other forms, but more complex things like advanced machine learning call for a hands-on approach. That is where the labs provided by Coursera really help me dig into actually using and running advanced machine learning models. My technical knowledge of the subject was stronger than my applied skills; that was something I needed to work on and have been attacking for a couple of hours a night.