Getting Ubuntu and my GTX 1060 Set Up for TensorFlow

Things on the TensorFlow installation front did not go as planned yesterday. The Ubuntu 16.04.1 installation worked at first. I was able to grab an ISO and use Rufus to load it onto a USB drive. Things loaded without any hiccups, but my GeForce GTX 1060 graphics card was not recognized. I tried several different methods to get it to work. All of them resulted in what some folks in the online forums call an NVIDIA login loop. After the PPA and driver installation, Ubuntu would load, but nothing happened after successfully logging into the system. I gave up on installing Ubuntu 16.04.1 and moved to the Ubuntu Studio flavor. That change in course made all the difference.

Installation Steps Taken

1. Downloaded Ubuntu Studio 16.04.1 ISO using my Windows 10 base machine
2. Dropped the ISO on a USB via Rufus
3. Set up drivers for the GeForce GTX 1060 via sudo add-apt-repository ppa:graphics-drivers/ppa
4. Ran sudo apt-get update && sudo apt-get dist-upgrade
5. Rebooted the entire computer
6. Logged back into the system and opened Additional Drivers
7. Picked 367.44 from Nvidia-367
8. Clicked Apply Changes
9. Rebooted
10. Loaded with no NVIDIA login loop 😉
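For reference, the terminal side of the steps above can be sketched as the commands below. Installing the nvidia-367 package directly from the command line is a hypothetical alternative to the Additional Drivers GUI used in steps 6 through 8; everything else mirrors the list above.

```shell
# Add the community graphics drivers PPA (step 3)
sudo add-apt-repository ppa:graphics-drivers/ppa

# Refresh package lists and upgrade the system (step 4)
sudo apt-get update && sudo apt-get dist-upgrade

# Hypothetical CLI alternative to the Additional Drivers GUI:
# install the 367-series NVIDIA driver package directly
sudo apt-get install nvidia-367

# Reboot so the new driver loads (steps 5 and 9)
sudo reboot
```

After rebooting, running nvidia-smi should list the GTX 1060 if the driver loaded correctly.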

My next course of action was to get Google Chrome installed from https://www.google.com/chrome/browser/desktop/index.html.

Saturday Learning TensorFlow Fun Day

Our friends at Google released TensorFlow as an open source library last year. That effort is greatly appreciated. It gives folks throughout the world the opportunity to get in and get going without starting from scratch. In the world of machine learning that is an important nudge. Normally, I like to be first in the pool for new technology. This time around I am about a year late to the TensorFlow party. Today that party is about to get started with some tutorials and maybe even some research about completing a nanodegree later this year. For the next five days I will be focused on learning how to utilize TensorFlow. Yep. That is right. Three days from this upcoming week were scheduled for vacation just to focus on learning how to use TensorFlow. Some of that time will also be spent cleaning the house. I cannot program for 72 hours straight anymore. That type of sustained single-minded purpose may not happen anytime soon. I can work on something every day, but marathon efforts are probably a thing of the past.

Thinking about headphones

People were hustling at the airport this morning. The whole place was buzzing, and it seemed like everyone was in a hurry. That happens sometimes. The Einstein Bros. Bagels store did not have a very long line. I was able to purchase a power protein bagel and some espresso without any real challenges.

Over the last year, I have been pretty satisfied with my Klipsch R6i earbuds. They replaced my classic Bose noise-cancelling headphones. I have been eyeing the Bose wireless Bluetooth headphones, but I have not wanted to carry them around. Traveling all over the country involves careful packing. My switch to earbuds was purely about ease of use. Having a microphone for phone calls was also a huge driving factor. The switch has worked out well enough, though on longer flights it would be nice to have the headphones again. They really do limit the amount of background noise.

I snag earbud wires on things from time to time. The last pair I destroyed got caught on a door handle. I could feel the wire stretch. That is pretty much the hallmark of doom for earbuds. My current pair of Klipsch earbuds have survived my travels.

I’m about ready to renew my research interests. Now seems as good a time as any to pick up where I left off. My interests had focused on microtargeting via automated survey research. It started off as a pretty simple research project. I built a basic web crawler that pulled content into either a single file or a series of structured daily files. Those files could be crawled for certain cuts of data. It is pretty easy to build a summary of the content. You can even bump the files up against positive and negative scores for each word. I never could get a daily top 10 list script to work. I had wanted it to deliver a list of the top 10 topics along with a degree of positive or negative context for each.

That is pretty much where I left off. I’m considering rebuilding that script. It could be an interesting thing to figure out. It would be a challenge, but it is something that I believe could be finished before my birthday this year.
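The daily top 10 idea could be sketched along these lines. The word sets and the sample text below are hypothetical stand-ins for the crawler's structured daily files and the positive/negative word score lists mentioned above; a real version would load those from disk.

```python
from collections import Counter
import re

# Hypothetical sentiment word lists; the real project bumped the
# crawled files against scored lists of positive and negative words.
POSITIVE = {"great", "win", "good", "love"}
NEGATIVE = {"bad", "loss", "poor", "hate"}

# Throwaway words to ignore when counting candidate topics.
STOPWORDS = {"the", "a", "an", "is", "was", "and", "to", "of"}

def daily_top_topics(text, n=10):
    """Return the n most common topic words with a net sentiment score.

    The score here is simply positive-word count minus negative-word
    count over the whole document -- a simplification for the sketch.
    """
    words = re.findall(r"[a-z']+", text.lower())
    net = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    topics = Counter(
        w for w in words
        if w not in STOPWORDS and w not in POSITIVE and w not in NEGATIVE
    )
    return [(topic, count, net) for topic, count in topics.most_common(n)]

sample = "The tournament was great and the crowd was great. The game was good."
for topic, count, score in daily_top_topics(sample, n=3):
    print(topic, count, score)
```

Swapping the document-wide score for a per-topic window (only counting sentiment words near each topic word) would get closer to the "degree of positive or negative context" the original script was aiming for.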

An airport view

More adventures in coding

Yesterday was a productive day. Some coding happened. A few Java errors were even created and resolved. Some real problem solving occurred. For better or worse, it was a fun experience to focus on coding for a few hours. Unfortunately, around lunchtime I lost focus and started thinking about both food and the University of Kansas first-round NCAA tournament game.

  • 7:30 AM – Made a cup of Starbucks house blend coffee
  • 8:00 AM – Started some laundry
  • 8:23 AM – Fed Peppercorn the dog breakfast
  • 8:30 AM – Grabbed a bottle of water and a slice of cold pizza from the kitchen
  • 10:10 AM – Still ramping up to full productivity

A coding adventure

Taking the day off work required some planning. Today is the first real day of March Madness. That spark was enough to create the opportunity. Getting set up to spend the entire day listening to music and coding was the hard part.

  • 6:30 AM — Dropped off the kids at school
  • 6:45 AM — Acquired a power protein bagel
  • 7:02 AM — Made two shots of Starbucks Verismo Guatemala Antigua espresso
  • 7:14 AM — Booted up the Pandora streaming service and turned up the volume on my Audio Engine A5+ speakers
  • 7:15 AM — Opened a bottle of lemonade-flavored Powerade and a 1-liter bottle of Smartwater
  • 7:21 AM — Started coding the first level of my new puzzle game
  • 9:08 AM — Started coding the second level of my new puzzle game. The first level seems to work
  • 10:44 AM — Started coding the third level of the game. My progress has slowed down. I have already started thinking about lunch.