Joining YNAB

A few months ago, I was catching up on new articles in Feedly and stumbled across a developer job posting at YNAB. Go read that job posting because it is one of the best I've ever seen, and believe me when I say I've read a lot of job postings! It's hard not to be interested in a job after reading a description like that. Besides all the fringe benefits of working for YNAB (5 weeks vacation, annual meet-up, 401k, working remotely, very flexible schedule, etc.), the role itself sounded like a perfect fit for me: working on a SaaS web budgeting application, working with a stack I have grown to love in recent years (Ruby/Rails/Postgres/JavaScript), getting to help with interesting architectural challenges, and getting to do some DevOps work once in a while. Also, I have used the YNAB product in the past and knew very well it was of top-notch quality. Did I mention I'm a budgeting junkie? As I read through the description I kept thinking, "I want this job."

So, I applied!

Not surprisingly, they had a lot of interest. They have a passionate user base and are smart about how they market opportunities to join their team, so I had to sit back and wait a bit while they sorted through all the applications. I didn't realize it at the time, but there were to be 9 rounds (9!) of interviews, so patience was my friend during the wait. It was a straightforward, friendly interview process, however. Most of the rounds consisted of casual chats with different members of the team, and as the process progressed, there was more technical assessment involved. Each step along the way confirmed this was a good fit from my side, and I continued to hope the feeling was mutual.

So, I got the job!

I was thrilled when I got the news because I really did want to join the YNAB team.  They have a super sharp team in place that I wanted to be a part of; to grow, learn and employ my experience on an application that changes lives and saves marriages.  YNAB is going places and I want to be on the ride!

Google Calendar API from Ruby

I have a Rails app that needs to sync events to a Google Calendar, and I had been using the gcal4ruby gem to do this. However, back in November 2014, Google deprecated v1 and v2 of the Google Calendar API. This gem does not use the newer v3 API, so I had to make a change.

I decided to use the official Google API Client for Ruby.  It took a bit of time to get up to speed with using this client, particularly because of the OAuth authentication requirement in v3 of the API.

The steps to get setup are:

  1. Go to the Google Developers Console.
  2. Create a new Project.
  3. Go to Project > APIs & Auth > Credentials.
  4. Create a New Client ID (OAuth). Pay attention to the Authorized Redirect URIs; these are the URIs allowed as redirect targets when requesting authorization via OAuth. Each should be a page that can receive a GET request after a Google Account owner grants authorization.
  5. Click Download JSON. This downloads a JSON file with all the app credentials (token / secret) needed to authenticate when connecting with the client.
  6. Rename the downloaded file to client_secrets.json and move it to the directory your app runs from.
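For reference, a downloaded client_secrets.json has roughly this shape (every value below is a placeholder, not a real credential):

```json
{
  "web": {
    "client_id": "1234567890.apps.googleusercontent.com",
    "client_secret": "your-client-secret",
    "redirect_uris": ["https://example.com/oauth2callback"],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token"
  }
}
```

The client library reads this file to drive the OAuth flow, so keep it out of source control.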

Google Developer Console

Once you create the app and get your client secrets file in place, you are ready to start using the client.  Here is some example Ruby code that demonstrates how to connect, authenticate, and perform basic Google Calendar API operations like querying for a list of events, adding, and updating.
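The original gist embed hasn't survived here. Rather than reproduce the client gem's wrappers from memory, here is a stdlib-only sketch of the same v3 REST calls the official client wraps (listing and adding events); method names and event fields are illustrative, and the access token comes from the OAuth flow above:

```ruby
require 'json'
require 'net/http'
require 'uri'

BASE = 'https://www.googleapis.com/calendar/v3'.freeze

# Build the JSON body for inserting or updating an event.
# Times are RFC 3339 strings, e.g. "2015-06-01T09:00:00-05:00".
def event_body(summary, starts_at, ends_at)
  {
    'summary' => summary,
    'start'   => { 'dateTime' => starts_at },
    'end'     => { 'dateTime' => ends_at }
  }
end

# The events collection URL for a calendar (e.g. 'primary').
def events_url(calendar_id)
  URI("#{BASE}/calendars/#{URI.encode_www_form_component(calendar_id)}/events")
end

# List a calendar's events, authenticating with an OAuth access token.
def list_events(calendar_id, access_token)
  uri = events_url(calendar_id)
  request = Net::HTTP::Get.new(uri)
  request['Authorization'] = "Bearer #{access_token}"
  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
  JSON.parse(response.body)
end

# Add an event with a POST; an update is the same shape,
# PUT to .../events/{eventId} instead.
def add_event(calendar_id, access_token, body)
  uri = events_url(calendar_id)
  request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
  request['Authorization'] = "Bearer #{access_token}"
  request.body = JSON.generate(body)
  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
  JSON.parse(response.body)
end
```

The official client exposes the same operations through generated Ruby methods, but seeing the raw endpoints makes it clearer what the OAuth token and client_secrets.json are actually for.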

SQL Server Migrations Done Right

When working in the .NET / SQL Server world, I’ve always been envious of Rails Active Record Migrations.  It seems there is no great way to handle migrations with SQL Server.  Sure, there are some tools out there but it seems all of them leave something to be desired.  Specifically:

  • SQL Server Database Projects in Visual Studio are a full-featured database solution but unfortunately take a “state” approach to keeping databases up to date. For one-off sync scenarios they are a perfect solution, but for incremental database changes they are less than ideal. In my opinion (and in my experience) this is a problematic approach because many changes to databases are transitional rather than stateful in nature.
  • Entity Framework Code First Migrations is a fairly solid migration framework but unfortunately only works with objects that have an Entity Framework DbContext defined for them. If you are working with an existing, large database where EF is not used throughout, you are out of luck. Also, it does not work with Stored Procedures.
  • RoundhousE is another full featured solution that is well thought out but I find it a bit cumbersome to work with and think it is overly complex in nature.  It lacks Package Manager Console scripts and is a little tricky to integrate into a deployment process.

Then I came across DbUp.  DbUp is great because it’s simple, uses the transitional approach and is easy to integrate into a deployment tool like Octopus Deploy.  What it lacks in features it more than makes up for in doing 95% of the things I care about well.  However, there were a couple of things it lacked which I thought could really make it perfect: (1) Package Manager Console scripts and (2) Object Scripting.

Package Manager Console scripts

I thought the DbUp process would be much more streamlined with a couple of simple Package Manager Console scripts: (1) one to create a new timestamped migration script, mark it as an Embedded Resource, and add it to the project, and (2) one to run the DbUp console application and migrate the database.

Object Scripting

When I presented DbUp to my team at work and pitched it as a viable option for our database migrations, the biggest piece of feedback I got was the need to have object definitions scripted at the time the migrations run. With Active Record Migrations in Rails, when you run db:migrate, db:schema:dump is automatically called after the migrations are applied, which updates your db/schema.rb file. This file holds the object definitions for your database. As you run migrations, your schema.rb file gets updated, which enables versioning of your schema, clear diffs for pull request reviews, and a collection of baseline scripts to build a new database from scratch. I realized that if DbUp could script changed objects alongside running migrations, it would really be a great solution.


So, I wrote some PowerShell scripts and leveraged SQL Server SMO to script out object definitions when DbUp migrations are run. I cleaned it up and packaged it as a NuGet package for others to use. I have been using this solution for a few months and have to say it is really ideal. The workflow is:

  1. Run New-Migration from the Package Manager Console.  A new .sql script is added to your DbUp project in the \Migrations folder.
  2. Edit the new migration script and write the SQL to migrate your database.
  3. Run Start-Migrations from the Package Manager Console. The output will show that your new migration(s) have been run. Also, you will notice that any objects changed by your migration scripts are scripted out and saved to the \Definitions folder at the root of your project.
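For illustration, a timestamped migration script created by New-Migration might look like this (table and procedure names are made up for the example):

```sql
-- Migrations\20150315103000_add-email-to-customers.sql
ALTER TABLE dbo.Customers ADD Email NVARCHAR(255) NULL;
GO

-- Because this procedure is altered here, its fresh definition will be
-- scripted to the \Definitions folder when Start-Migrations runs.
ALTER PROCEDURE dbo.GetCustomer @Id INT
AS
BEGIN
    SELECT Id, Name, Email FROM dbo.Customers WHERE Id = @Id;
END
GO
```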

That’s it!  With the above 3 steps you are able to migrate a database and update a local definition representing your database object state.  This is very similar to (if not exactly the same as) Active Record Migrations.  When it comes time to run migrations in another environment, you only need to run the DbUp console application, just as the DbUp documentation describes; it’s just a console application after all.


This demo video shows just how easy it is:

dbup-sqlserver-scripting Demo


More Information

The NuGet package can be installed by running Install-Package dbup-sqlserver-scripting.

The GitHub repo contains more detailed setup and usage information.


Share Razor Partial View between WebForms and MVC

ASP.NET MVC is a breath of fresh air for anyone with a background in ASP.NET WebForms. It’s cleaner, supports the Razor view engine, is much(!) easier to test, doesn’t have the nasty viewstate baggage and generally just feels better. It’s pretty great that you can use it in older webapps as well by Integrating ASP.NET MVC into an existing ASP.NET Webforms application. That’s pretty cool but when you do this, you’ll inevitably have view content that you need to share between WebForms and MVC pages. A perfect example of this is a navigation bar or page footer in MasterPages / MVC Layouts. It’s tempting to assume these two ASP.NET paradigms don’t play together and to just have a Razor version and a separate WebForms (ASPX) version that are synced up manually.

However, they can play together and you can share view content between them if you use a Razor partial view and some bridge code to shim out a wrapper around the WebForms/ASPX HttpContext. The technique was posted on Stack Overflow and I have been using it successfully in some projects. I consider it a pretty big deal that you can do this as it makes migrating to ASP.NET MVC much more feasible in a legacy web application. I’ve gisted an example to show how it’s done:
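The gist embed hasn't survived here, but the shim is essentially the Stack Overflow approach: a throwaway controller gives MVC a ControllerContext built from HttpContext.Current, and a static helper asks the Razor view engine to render the partial into the WebForms response. A rough sketch (class and helper names are illustrative, not the gist's exact code):

```csharp
using System.Web;
using System.Web.Mvc;
using System.Web.Routing;

// Empty controller that exists only to give MVC a ControllerContext.
public class WebFormController : Controller { }

public static class WebFormMvcUtil
{
    // Render a Razor partial view from a WebForms page or MasterPage.
    public static void RenderPartial(string partialName, object model)
    {
        var httpContext = new HttpContextWrapper(HttpContext.Current);
        var routeData = new RouteData();
        routeData.Values.Add("controller", "WebForm");
        var controllerContext = new ControllerContext(
            new RequestContext(httpContext, routeData), new WebFormController());
        var result = ViewEngines.Engines.FindPartialView(controllerContext, partialName);
        var viewContext = new ViewContext(
            controllerContext, result.View, new ViewDataDictionary(model),
            new TempDataDictionary(), httpContext.Response.Output);
        result.View.Render(viewContext, httpContext.Response.Output);
    }
}
```

Then an .aspx page or master can call `<% WebFormMvcUtil.RenderPartial("_NavBar", null); %>` to render the same partial your MVC layouts use.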

My Free HDTV with DVR Setup

For the last 9 years I have been cable free, using OTA (over-the-air) HDTV in conjunction with a free DVR solution.  I originally decided to go this route because I thought it would be a fun process to get everything set up, but also because I wanted to save money.  If the average cable bill is $80 a month, I have saved almost $9,000 over the last 9 years ($80 x 12 months x 9 years = $8,640).  That’s a lot of money, folks.

Not only does OTA HDTV save lots of money, I would argue that it gives you the best quality picture you can get for television.  When I compare my picture to a friend’s cable or satellite picture, I think mine is easily superior.  Another compelling reason to go this route today is that more and more quality content is available through streaming services like Netflix, Hulu, and Vudu, so you don’t need cable to watch that must-see show.  And for the sports fanatic who can’t live without ESPN, you can now get Sling TV to scratch that live-sports itch without cable.

I’ve had a handful of friends ask over the years what my setup is so I thought I would document it here.  It has changed over the 9 years but I feel my current setup is the best and the one I can recommend.

The Antenna

Audiovox RCA ANT751


First of all, you need a good antenna.  This is the foundational piece, and you’ll want to make sure it is pointed in the best direction.  There are lots of options here: bunny ears, slimline wall mounts, discreet table-top, directional, omnidirectional, indoor, outdoor, etc.  Lots of options!  There are many factors to consider for an antenna, such as the direction of the TV stations relative to your house, obstacles like trees, and signal strength.  I’ve tried about 5 different options over the years and will hands down recommend an outdoor (or attic) directional antenna.  If you are serious about an OTA setup, go with an outdoor directional antenna; it gives a great, stable picture and you won’t have to worry about constantly adjusting it to tune in a station.  The one I have is the Audiovox RCA ANT751 from Fry’s.  It’s $79 and well worth it.

This is how my mounted antenna looks:


It’s pretty discreet, especially when mounted near other wires and such coming into my house from the utility poles.  It’s less discreet than my multi-colored gate anyway.

Once you have your antenna, you need to mount it and point it in the right direction.  A great resource for determining which direction is a website called AntennaWeb.  You simply type in your address and it gives you a map of all the TV station signals in your area and the compass degrees of each.  My map looks like this:

AntennaWeb 77018


Compass App on Google Play Store

As you can see, most of the stations are between 195° and 196°.  This is where I need to point my antenna.  Grab your smartphone and get a compass app from your app store.  In my case, I use Android, so I downloaded “Compass” from the Play Store.  It’s a simple app and will do the job of helping us point in the right direction.

When I mounted my antenna, I took my phone up with me, calibrated the compass, and placed it right on top of the antenna.  I was then easily able to rotate the antenna left and right until the compass degree marking matched exactly what I wanted.  Then I tightened the clamps and was good to go!


Obviously, there is a bit more work to do, such as running the coaxial cable to your TV.  I like this kind of stuff and if you do too, it shouldn’t be too hard to tackle.  However, if you don’t feel that mounting the antenna is something you can do, just pay someone else to do it.  It shouldn’t be that expensive.


The DVR

Free OTA HDTV is great, but the year is 2015 and a DVR is required, my friends.  Not being able to record or pause live TV would be a deal breaker and would fail the WAF (wife-approval-factor).

Again, just as there are many antenna options, there are many DVR options.  The Cadillac, best-of-breed option here is the TiVo Roamio, which costs about $200, depending on the version.  It’s got a very nice user interface and works well, but unfortunately it requires a monthly subscription fee.  It will cost between $12 and $15 a month unless you shell out a whopping $500 for a lifetime subscription.  Sorry, that’s a deal breaker for me.

My initial setup was a Windows Media Center PC.  This is a version of Windows that can receive HDTV signals through one or more TV tuner expansion cards.  It has a slick user interface and can do all the things you would expect a DVR to do (and much more!).  Best of all, there is no monthly subscription charge.  Windows Media Center was my solution for the last 8 years and it worked well, I must say.  However, there is quite an investment in getting it set up.  Not only do you have to purchase a full-blown computer and a licensed copy of Windows, you have to buy and install one or more TV tuner cards.  There is lots of tweaking of settings to get it just right.  And then there is the video/audio connection to your TV to work out.  Most video cards these days will support an HDMI out connection to your TV, but many will not support a high-quality 5.1 digital audio stream through the same cable; at least not in my experience.  I was able to get mine working (video out over HDMI, audio out through an optical cable from an expansion 5.1 sound card), but let me tell you, it was a lot of work and required babysitting over the years.  It was this part of my setup that made me hesitate when recommending it to friends.  I know this was the biggest obstacle to adoption.

However, there is a new option out there which I’ve been using for the last 6 months.  It’s called the Channel Master DVR+.  It’s a very small black box that is as simple as it is nice to look at.  Just look at it:



It’s nice to look at and will look great next to your HDTV.

You place it next to your TV, plug in the AC adaptor, hook your coaxial cable from your antenna in the back, connect it to your TV with an HDMI cable, connect to your router via ethernet or wifi adapter, and attach an external USB hard-drive.  That sounds like a lot but it’s very simple to setup.


It costs $249 and you need to purchase an external USB hard-drive to hook up to it which will set you back another $60.  Beyond this initial cost, there is no monthly subscription fee!  It’s simple to setup, simple to use, fairly cheap (all things considered), and has no monthly cost.  This is the solution I’ve been waiting for.

My one non-deal-breaker beef with it is the remote control.  It’s thin, feels flimsy, and just feels cheap in my hand.


This picture paints it in a nice light, but trust me, it’s not the best.  I’ve even considered buying a programmable remote to replace it, but again, it’s not a deal breaker.

Rather than exhaustively reviewing the Channel Master DVR+ here, I’ll point you to some great reviews elsewhere that some fine folks spent a lot of time on:

I should point out that some of the reviews mention the inability to record only “new” episodes rather than all episodes of a show.  Channel Master released a software update in the last couple of months that enables this, and it works great!


There you have it.  That’s my free HDTV with DVR setup.  There is a bit of an upfront investment in time and cost with getting a good antenna installed and purchasing the DVR but I feel the payoff is huge.  Remember how I estimate I’ve saved almost $9,000 over the last 9 years?

This setup will allow you to watch over-the-air television, and there are some great shows out there.  If you have cable, it’s probably hard to remember life before it, but I can tell you that the major networks still produce some fine shows; enough to keep you entertained.  And after that, you can load up Netflix, Hulu, Vudu, Google Play, iTunes, etc. (etc.!) to be entertained even more.

I hope you’ll give free OTA HDTV a try because I don’t think you will be disappointed.