Getting a new Chromebook is probably going to happen

My ASUS Flip Chromebook has seen better days. A few of the keys have failed, which is a real problem: I have to keep switching between the touchscreen keyboard and the physical keyboard depending on what I am trying to type. I keep thinking that waiting until the next generation of Pixelbooks comes out is the right move. Tonight, I started looking around the Best Buy website for touchscreen Chromebooks. Every time I see the price of a Google Pixelbook, it makes me wonder how they came up with the price on that build and how it still carries such an outlandish price tag. My attention has pretty much settled on buying the ASUS Flip C434 as a replacement. That would be my third purchase of an ASUS Chromebook, and that seems to be ok. The only thing I ever seem to break on these devices is the keyboard.

My notes from AWSome Day Denver 2019

It seemed like a good idea to take the day off and learn something new, and that is exactly what I ended up doing today. Here are my notes from AWSome Day Denver 2019, hosted on April 16, 2019 at the Denver Convention Center. They are a mix of my thoughts and reactions to the things I heard throughout the day.

  • Local communities in Denver that do AWS
  • Large footprint: 20 regions, 61 availability zones, 166 points of presence (PoPs)
  • Lots of machine learning service growth in Denver (chatbots)
  • 10 minutes on what is the cloud… being elastic… acquire resources when you need them
  • Talked a bit about IaaS, PaaS, and SaaS (Infrastructure, Platform, and Software as a Service)
  • Cloud deployment models: private, hybrid, cloud (no multi-cloud was mentioned)
  • Funny tshirt mention: “Friends don’t let friends build data centers…”
  • Every region has at least two availability zones (most have three or more). Each additional zone increases the availability…
  • Spent a few minutes on Amazon virtual private clouds. Isolated from all networks…
  • People seem very excited about EC2. Most of the talking points centered on being able to change hardware or configurations and not being locked into an initial build
  • “You always own the data; the data is yours” was a central message that kept getting anchored throughout the presentation
  • Containers vs. virtual machines (more people are using containers); talked about orchestration platforms and how they fit into the puzzle, including Amazon EKS and Kubernetes
  • They did a live demo of how to set up a box…
  • Talked about S3 and how storage solutions work and are used in practice
  • Reviewed the tools of auto scaling and how to build or manage elastic elements

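One point worth unpacking from the availability-zone notes above: if each zone fails independently, adding zones rapidly shrinks the odds that everything is down at once. Here is a back-of-the-envelope sketch; the 99% per-zone figure is just an assumed example for illustration, not an AWS number.

```python
def combined_availability(per_zone: float, zones: int) -> float:
    """Probability that at least one of `zones` independent
    availability zones is up, given each is up with
    probability `per_zone`."""
    return 1 - (1 - per_zone) ** zones

# With an assumed 99% availability per zone:
print(combined_availability(0.99, 1))  # ~0.99
print(combined_availability(0.99, 2))  # ~0.9999
print(combined_availability(0.99, 3))  # ~0.999999
```

Each added zone multiplies the failure probability by another factor of (1 − per_zone), which is why the talk kept stressing multi-zone deployments.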
Day 2 TensorFlow voter data project environment setup

Things are moving along. Right now I have 23 days left to submit my current project to the ⚡#PoweredByTF 2.0 Challenge. That should not be a problem, but you never know what is going to happen. I am fully registered, and I set up a new GitHub project just for this effort.

  • My environment setup is going to be twofold for this project. My primary efforts will occur on the Google Cloud Platform and my secondary effort will occur on Colaboratory. If you have not heard of Colaboratory, you are missing out on something awesome: it is a free, easy-to-use Jupyter notebook environment run by Google.
  • My initial build out included the following efforts on GCP:
    • GCP –> Artificial Intelligence –> AI Platform –> Notebooks –> TensorFlow –> with GPU (us-west1-b, 4 vCPUs, 15 GB RAM, 1 NVIDIA Tesla K80, 100 GB disk)
    • That yielded a strange error: “Quota ‘GPUS_ALL_REGIONS’ exceeded. Limit: 0.0”.
    • The solution was to head to the IAM & admin section of GCP, look up the GPUS_ALL_REGIONS quota, and request access to another GPU
    • I found the right global quota to change (GLOBAL Attribute | GPUS_ALL_REGIONS) and asked for a quota increase using the handy dandy edit quotas button at the top of the IAM & admin –> Quotas screen
    • I got help from Julia over at Google support in a matter of a couple of hours, and then went back and made the notebook environment setup request again
    • Keep in mind that this setup requires committing funds to the project, but I wanted a dedicated GPU all to myself for the stretch run, and it seemed like something that would bring me joy
      • The alternative here would be to use Colaboratory instead of paying for a dedicated GPU on GCP
    • Everything is now set up to run TensorFlow in a notebook on my own VM with a sweet NVIDIA Tesla K80 GPU
    • You can grab the instructions I used to true it up to 2.0.0-alpha0 here
  • Setting up on Colaboratory took a matter of seconds
    • Go check out https://colab.research.google.com/
    • I’m going to run a parallel set of notebooks throughout the whole process. My primary development will occur over on GCP. I’ll test every one of the notebooks on Colaboratory and use its export-to-GitHub feature to save the project off for submission
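For anyone who hits the same quota wall, the error message itself tells you everything you need: the quota name and the current limit are embedded right in the text. A tiny illustrative Python sketch of pulling those fields out; the function and regex are my own, not part of any Google API, and I am assuming the error uses plain straight quotes as emitted by GCP.

```python
import re

def parse_quota_error(message: str):
    """Extract the quota name and limit from a GCP quota-exceeded
    message such as: Quota 'GPUS_ALL_REGIONS' exceeded. Limit: 0.0"""
    match = re.search(r"Quota '(?P<name>\w+)' exceeded\. Limit: (?P<limit>\d+(?:\.\d+)?)", message)
    if match is None:
        return None
    return match.group("name"), float(match.group("limit"))

error = "Quota 'GPUS_ALL_REGIONS' exceeded. Limit: 0.0"
print(parse_quota_error(error))  # ('GPUS_ALL_REGIONS', 0.0)
```

A limit of 0.0 means the project starts with no GPUs allowed anywhere, which is why the quota increase request was the fix.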

Day 1 TensorFlow voter data project research

Today I started conducting some literature searches and began digging into what work exists on using TensorFlow with voter data. The following observations are my notes, entered in real time for later review. In the past, I used EndNote to capture all my citations, but this time around I thought it might be better to just write everything down during this first pass at digging into what exists publicly.

Today I signed up for a #PoweredByTF 2.0 challenge contest

Earlier today it seemed like a good idea to sign up for a TensorFlow contest on DevPost that ends in about 27 days. My efforts are going to focus on using the TensorFlow 2.0 Alpha release to model changes in voting patterns; I’m going to train the model to look for irregularities. My best guess at the moment is that within the next 27 days I can set up my Google Cloud Platform instance and get the model running. The #PoweredByTF 2.0 Challenge on DevPost will be a good test of my skills. Maybe jumping into a few coding challenges is a good way to push myself forward and enhance my programming toolkit.

Maybe this will be a good reason to go on a video diet for the next 27 days. For the most part, I have stopped consuming most social media, so trying to avoid video for the next 27 days seems like a logical next step. It could be a good way to really dig in and do some deep learning about TensorFlow. I have done a bunch of certifications and used TensorFlow for a variety of reasons, but none of them were tied to a GitHub project. This will be my first effort to post a set of notebooks on GitHub so people can replicate my work. Maybe this will be a good test of my coding skills and my paper-writing skills at the same time.