Category Archives: Hardware

How to Create a Simple Backup Solution That You Can Trust

Backing up your data is really important.

We’ve all heard too many stories of hard drives crashing or computers getting lost or stolen without a backup, and their owners suffering a horrible loss of irreplaceable data.


Why didn’t I back up my data?

So, if we all know that backing up data is so important, why don’t we do it?

Well, some of us do, but I know that a majority of the software developers I talk to are either not really doing good backups at all or are doing what I would call a half-assed job.

The reason for this is simple: Coming up with a good backup solution is difficult–or at least it can appear that way.

That’s why I am writing this blog post. I want to make it as simple as possible.

I’m going to show you a simple approach to create a backup strategy–not just a solution–that you can easily implement.

The basic approach will be as follows:

  • Reduce the amount of “stuff” that needs to be backed up
  • Divide backed up data into two categories: critical and non-critical
  • Have three copies of data, two local and one offsite
  • Make everything automated

Step 1: Reducing the amount of “stuff” that needs to be backed up

The easiest way to simplify your backup solution is to start by reducing the amount of stuff that you need to back up. The less you have to back up, the easier it will be to manage those backups and to actually do them.


If you don’t need it, don’t store it

What we want to do is go through all the data that we think we need to back up and try to get rid of as much of it as possible.

Most people I know, especially software developers and IT people, are storing all kinds of stuff that they will never need.

I used to rip all kinds of movies, video games and music to my computer and save them in a huge library so that I would have access to all this stuff digitally if I ever needed it.

Guess how often I actually needed some random movie I already watched, a video game I already beat or a book I already read?

Just about never.

Now, I realize that some people actually do use their huge libraries of media. For example, kid movies often get watched multiple times, but you have to admit that there is probably a lot of stuff that you will never ever touch again.

I tried to purge as much of the stuff I’ll likely never touch again as possible.

I know you might be resistant to doing this, but let me try and convince you that this is a good idea, then you can ignore me and back it all up if you want to.

First of all, think about how easy it is to rent digital content on demand today or buy something off of Amazon or eBay. Do you really need to store a copy of “A Night at the Roxbury?” Probably not. If you ever want to watch it again, just pay a few bucks and rent it on demand.

If you watch a lot of movies, you’ll be much better off having a subscription to Netflix than you will trying to store a copy of every movie you’ve ever had your hands on. Think about how many hours you are wasting ripping movies to disk and meticulously organizing them. How many of those movies do you actually watch?

The same goes for music, books and video games. Most things only get consumed once. Get rid of as much of this stuff as possible. You’ll not only stop wasting time storing all this stuff, but your backups will be easier and you will find that a huge mental load is lifted from you.

Even huge music collections are mostly a waste of time. There are multiple music services you can subscribe to that will give you access to just about any music you want for a low monthly fee.

Plus, this trend is only going to increase. More and more stuff is going to be available from the cloud, on demand, for a small rental fee or monthly charge.

Stop saving all that crap.

Step 2: Divide backed up data into two categories: critical and non-critical

Backing up one terabyte of data to the cloud takes a long time and it can be expensive–that is why most people don’t do it.

So, what do you end up with?

Well, if you are like most people, you end up having some kind of local backup and you don’t really have a good cloud or offsite backup in place. It’s just too much trouble to try to back up all that data to the cloud.

If you followed my first step, you should be well on your way to reducing the total amount of data you need to back up, but we can do much better and shrink that amount even further.

Instead of trying to back up everything to the cloud or offsite, focus on backing up just what is critical; you’ll find that it is much more manageable, and you won’t need gigabit internet to do it.

Take all the data that you want to back up and sort it into two categories: critical and non-critical.

Critical things are things that, if you ever lost them, would make you very sad, because they couldn’t be replaced or their loss would cause you some great harm.

A good example of critical data for me is my wife’s photos. If I lost my wife’s photos, I would probably need to find a new wife.


If you loved me, you would have had a cloud backup

Other critical data for me is current projects I am working on and past projects that I may need to access again at some point in the future.

My Pluralsight courses and other training courses are critical data. My source code for my applications is critical data.

The opposite of critical data is non-critical data–duh.

But what is non-critical data?

It’s data that would suck to lose, but whose loss would not be the end of the world. Perhaps data that would be a small inconvenience to lose, but that could be replaced.

Collections of movies, video games and music fall into this category. Yes, you’ll be disappointed if you lose this data, but you can replace that data even if it might cost you some money.

Now, before you get all uppity about your movie collection, remember, I’m not saying we aren’t going to back up your non-critical data–we will–it’s just that we aren’t going to back this data up to the cloud or offsite.

Other non-critical data might be an image of your computer or development workstation. If you lose that backup, you might have to re-install your operating system or waste some time re-installing other programs, but it won’t be that big of a deal.

Most of your data should be non-critical. Unless, of course, you got rid of a large amount of that data, because you realized the futility of storing digital copies of stuff you won’t ever use again. But, if I haven’t convinced you by now, I probably never will, so we’ll just call your “The Complete Matrix Trilogy (The Matrix / The Matrix Reloaded / The Matrix Revolutions) [Blu-ray]” non-critical data.

Step 3: Have three copies of data, two local and one offsite

Ok, now we are actually ready to back things up.

Most likely, you’ll have a small amount of critical data and either a larger amount of non-critical data or almost none.

The critical data we don’t ever want to lose. So, we need to make sure that there are three copies of that data at all times.


Three copies of data, two local, one offsite or cloud

The easiest way to do this is to have:

  • one working copy
  • one local backup
  • and one cloud backup

Today, this is actually quite simple to achieve.

For a while I was doing this by using a service called CrashPlan. CrashPlan allows you to specify folders on a computer to back up to another location, and to specify some folders to be backed up to the CrashPlan cloud servers as well.

I created two backup sets: one that backed up my critical data to another hard drive in my computer, and another that backed it up to CrashPlan’s servers.

This worked well for a while, but then I realized that I didn’t really need to pay CrashPlan’s monthly fee when I was already paying for extra storage space with Dropbox. I also wanted a central place to back up data locally, not just from my one PC–my wife has data she needs to back up, and I have a laptop and other devices as well.

Now, don’t get me wrong, CrashPlan is great. I highly recommend it, but if you have a NAS (Network Attached Storage) and an account with either Dropbox or OneDrive, you might not really need to utilize a service like CrashPlan.

But, before I get into the specifics of what I am doing, let’s talk about the strategy one more time.

We want to have three copies of our data. One working copy, one local backup and one offsite backup.

There are many ways to accomplish this; the easiest way is to have a cloud storage solution like Dropbox or OneDrive as an offsite backup and then to figure out some way to have a local backup as well.

The reason why we want an offsite backup and a local backup is so that we are covered in two possible scenarios:

  1. Your local backup fails, and you only discover the problem when you go to recover your data. In that case, you can just get the data from the cloud.
  2. Your cloud backup fails, goes out of business or loses your data. In this case, you can use your local backup to recover your data and move your offsite backup to another service.

Just putting your data in the cloud isn’t good enough, because you are relying completely on someone else’s service that you can’t control.

Just backing up your data locally isn’t good enough either, because your entire house could burn down in a fire, you could be robbed, or your backup could simply fail without you knowing it.

How I’m backing up my critical data

So, how am I actually backing up my critical data now that I’ve gotten rid of my CrashPlan subscription?

Well, I invested in a Synology NAS, or network attached storage.

I bought a Synology DiskStation 2-Bay (Diskless) Network Attached Storage (DS213j) which I have attached directly to my network.

It allows any computer on my network to use it as a file server, and it runs its own little operating system that can do all kinds of neat things, like back up my data to Dropbox or even Amazon Glacier.

There are two big advantages of this kind of device versus having some hard drives in my computer that I use to create a copy of data:

  1. The data is easily accessible by all the computers and devices in my house. I don’t have to have my main PC on and connected to the network.
  2. The Synology devices very easily do RAID-style storage. So, I can have two hard disks in there, and if one of them fails, the other one still has all the data.

A distant third, for me, would be that the Synology box can act as a full media server. I don’t use that functionality that much, but I know some people with huge movie collections do.

The big key point for my solution is that the Synology device creates a very good redundant RAID. In my book, that counts as two copies of local data.

I just attach my Synology box as a network drive on my computers and devices and I store any data that I want to make sure is backed up there.

Synology has a service you can install that hooks your device up to your Dropbox account (OneDrive support is coming soon). So, I just share that Dropbox folder on my Synology drive out to my network, and I can drop any files I want backed up into that share. Those files will not only be stored redundantly on the Synology drive but also backed up to my Dropbox account in the cloud.

My wife can also do the same, so this is very convenient.

I also don’t worry about having a local backup copy on my computer anymore, because I know that the data on the Synology drive is stored on two hard drives and backed up in the cloud.

How I’m backing up my non-critical data

What about the non-critical data?

Simple. I just copy that to a share on the Synology device that isn’t backed up to the Dropbox account.

For example, right now, the only thing I really have that I am considering non-critical data is images of my computers. I just put those on a share on the Synology and I don’t worry about them.

Why not just use a backup service like CrashPlan or Mozy?

Again, this whole thing could be done with a backup service like CrashPlan, but if I am already going to have a Dropbox account that I use, I don’t see the point of paying for and managing another backup system.

In fact, I am probably going to switch over to OneDrive exclusively, because Microsoft just recently announced that Office 365 users get unlimited OneDrive storage.

If you want to use a backup service though, go ahead. Just make sure you have a local backup and a cloud based backup for your critical data.

Step 4: Make everything automated (and test it)

If you followed my backup plan or you are using a service like CrashPlan, then you probably don’t need to do much here, because everything is already automated.

CrashPlan automatically backs up the data on your computer as it changes, so you don’t really have to worry there.

And, if you are using a NAS and a service like Dropbox, that is automated as well, because you basically just drop files into the Dropbox folder on your NAS and it automatically syncs with Dropbox.

But, if you are doing something else, just make sure the entire process is automated. You might, for example, want to automate creating images of your computer, or automate getting photos off of your phone or camera and dropping them into a backup location.
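To give you an idea of how little code this takes, here is a minimal sketch in C# that copies new or changed files from a local folder to a NAS share. The paths are hypothetical placeholders, and you’d schedule something like this with Windows Task Scheduler rather than running it by hand.

```csharp
// A minimal one-way sync sketch: copy new or changed files to a backup share.
// The source and target paths below are hypothetical; point them at your own
// working folder and NAS share.
using System;
using System.IO;

class SimpleBackup
{
    static void Main()
    {
        const string source = @"C:\Users\Me\Pictures";        // working copy
        const string target = @"\\synology\Dropbox\Pictures"; // share synced to the cloud

        foreach (var file in Directory.EnumerateFiles(source, "*", SearchOption.AllDirectories))
        {
            var destination = Path.Combine(target, file.Substring(source.Length + 1));
            Directory.CreateDirectory(Path.GetDirectoryName(destination));

            // Copy only files that are new or have changed since the last run.
            if (!File.Exists(destination) ||
                File.GetLastWriteTimeUtc(file) > File.GetLastWriteTimeUtc(destination))
            {
                File.Copy(file, destination, overwrite: true);
            }
        }
    }
}
```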

Finally, make sure you test everything out.

A backup that has never been tested is worthless.

If you use a service like CrashPlan, try restoring data from it.

If your backup is going to be your Dropbox account and your NAS, test it out. Make sure you can retrieve any data that you need. If you are backing up databases or snapshotting PCs, make sure you can restore all of those backups; otherwise, don’t bother backing them up in the first place.
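If you want to automate part of that testing, here is a rough sketch that sanity-checks a backup by comparing file hashes between the working copy and the backup share. The paths are hypothetical placeholders again; it won’t prove a database dump is restorable, but it will catch missing or corrupt files.

```csharp
// Sanity-check a backup: flag files that are missing from the backup share
// or whose contents differ from the working copy. Paths are hypothetical.
using System;
using System.IO;
using System.Security.Cryptography;

class BackupVerifier
{
    static string HashFile(string path)
    {
        using var sha = SHA256.Create();
        using var stream = File.OpenRead(path);
        return Convert.ToHexString(sha.ComputeHash(stream));
    }

    static void Main()
    {
        const string source = @"C:\Users\Me\Pictures";
        const string backup = @"\\synology\Dropbox\Pictures";

        foreach (var file in Directory.EnumerateFiles(source, "*", SearchOption.AllDirectories))
        {
            var mirrored = Path.Combine(backup, file.Substring(source.Length + 1));
            if (!File.Exists(mirrored) || HashFile(file) != HashFile(mirrored))
                Console.WriteLine($"Backup missing or out of date: {file}");
        }
    }
}
```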

Nothing is worse than trying to restore a backup and finding out that it was no good.

Are you backing up your data?

Let me know in the comments below.

And, if you liked this post and found it helpful, join the Simple Programmer community so that I can stay in touch and let you know when I have new posts or other free content you might be interested in.

Also, if you have some other suggestions or think there is something I missed or didn’t consider, leave a comment and let me know.


Seiki SE39UY04 39-Inch 4K Ultra HD Review (3840 x 2160)

Update: Apparently the monitor is now $400 on Amazon! Holy crap that is a hell of a deal!

A couple of weeks ago I tweeted about how I bought a 3840 x 2160 resolution monitor for $700 (since dropped to $400), and I promised that I would provide a review for it, so here it is.

If you know me, you know that I am a huge fan of having as many pixels as possible. I actually replaced my four 1920 x 1200 monitors with this single Seiki 4K display–and, although I hate to ruin the surprise–I couldn’t be happier.

It is a huge relief to go from a bulky four monitor setup to a single monitor that has almost the same resolution as all four of those monitors combined, and actually makes me more productive. (I’ll explain why this is the case a little later on.)


The stats please

Before I get into my actual experience with the Seiki 4K display, let me start off by giving you the technical details of what we are dealing with here. This is no ordinary monitor.

The Seiki SE39UY04 is actually called an LED TV, but you and I both know that this thing is really meant to be a monitor.

It actually comes in 3 sizes.

  • 39 Inch
  • 50 Inch
  • 65 Inch (looks like this one isn’t available yet.)

I opted for the 39 inch for both economic and space reasons, but I’d imagine the 50″ or 65″ would be nice as well—if you have the budget.

So, the big thing about this monitor is the resolution, which is 3840 x 2160 pixels. This means that it is basically the same resolution as four 1080p displays.


You can imagine this monitor as equivalent to having a 2 by 2 grid of 1080p resolution monitors.

It also comes with built-in speakers, which aren’t all that bad, three HDMI inputs, a component input and a VGA input – although I can’t imagine why you’d hook up a VGA source to this monitor.


This monitor seemed too good to be true–8 million pixels for $700 (now $400)–but it is indeed the real deal.

Well, with one catch.

The monitor has a 120Hz refresh rate, but only when hooked up to a 1080p source. If you hook it up to a 4K, or Ultra HD, source (same thing–it just means 3840 x 2160 resolution), it only runs at 30Hz.

This is mainly due to a bandwidth limitation of HDMI 1.4: it can’t push 8 million pixels down a wire that fast.

What this practically means for you is that you will see some slight ghosting when images move fairly rapidly across the screen. So, if you are planning on playing 3D games on the monitor, you’ll probably feel the difference between 120Hz and 30Hz, but if you are using it to run an IDE, open two Chrome windows and write a blog post all at the same time, you probably won’t feel a thing.

If you want to see exactly what the difference between different refresh rates is and how they are affected by motion, check out this site that lets you simulate different scenarios.

My experience with the SE39UY04

So, you are probably wondering what I thought of the monitor overall.

I’m very impressed with it so far. So much so that I am considering getting another one and having dual 4k displays for my desktop machine. (Ok, that might be a bit crazy, but hey I never claimed to be sane.)

In all seriousness, this monitor delivers exactly what I wanted and more for the $700 (now $400) price tag. Equivalent monitors easily go for $3k or more and are still saddled with some of the same HDMI bandwidth restrictions as this monitor (although some of them get around it by utilizing two inputs, which in my opinion isn’t really a good solution). With HDMI 2.0 coming out, and through the use of DisplayPort, I imagine this kind of problem won’t exist in the future. But, for now, for $700 (now $400), you can’t really get a better deal on a monitor, in my opinion.

I found the monitor to be super bright, but the color reproduction and contrast were not quite as good as on some of the high-end 1080p displays that I had seen and used before.

Regardless though, it is pretty amazing to see a 4k monitor in action. I tested out some 4k videos on YouTube and Vimeo and I was blown away. Remember the first time you saw an HD TV? Seeing a 4k TV for the first time is about the same experience, but even better. Now HD looks like crap to me. I am ruined.

I also found that running my PC at the max resolution of 3840 x 2160 didn’t make the text too small. It seemed to be just about the right size where I could still read everything and make use of the extra space. Although, I will admit that I think the extra size of the 50″ version would make this experience even better.

One quick tip I’d recommend to get the most out of this monitor is to use a tool like Display Fusion to divide your screen into four logical screens. I use Display Fusion to create four 1080p displays out of my monitor so that I can maximize windows in those four quadrants as if I had four 1080p monitors. But, unlike having four actual 1080p monitors, I can choose to maximize a window across two of the four quadrants or even all four if I wish. So, I have the ultimate flexibility with this monitor and no annoying seams between the displays.
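Display Fusion handles all of this for you, but if you’re curious, “maximizing into a quadrant” boils down to something like the following C# sketch against the raw Win32 API. The 3840 x 2160 dimensions are hard-coded here for this monitor; a real tool would query the display instead.

```csharp
// A rough sketch of snapping the active window into one quadrant of a
// 3840 x 2160 display using the Win32 API.
using System;
using System.Runtime.InteropServices;

class QuadrantSnapper
{
    [DllImport("user32.dll")]
    static extern IntPtr GetForegroundWindow();

    [DllImport("user32.dll")]
    static extern bool MoveWindow(IntPtr hWnd, int x, int y, int width, int height, bool repaint);

    // quadrant: 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right
    static void Snap(int quadrant)
    {
        const int width = 3840 / 2, height = 2160 / 2;
        MoveWindow(GetForegroundWindow(),
                   (quadrant % 2) * width, (quadrant / 2) * height,
                   width, height, repaint: true);
    }

    static void Main() => Snap(0); // park the current window in the top-left quadrant
}
```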

Oh, and if you do any kind of video editing, like I do, it is pretty amazing to be able to edit full 1080p video and have it only take up a fourth of your display. You can actually see the full resolution video and the timeline at the same time. I don’t think I’ll ever be able to go back after this experience.

Video cards

One thing you should be aware of when purchasing this monitor, or any 4k monitor, is that you will need to have a video card that will support that resolution.

The general rule is:

  • Any AMD ATI card in the 7000 series or above will support 4k output.
  • Any NVIDIA card in the GTX 600 and GTX 700 series should support 4k output.

If you want an ATI card that isn’t too expensive, start with the HD 7700.

For NVIDIA, check out the GTX 660.

Final thoughts

Overall, for me, buying this monitor was a no-brainer. My biggest fear was that there had been some kind of mistake–that the resolution really wasn’t 3840 x 2160, or that no graphics card would be able to power that kind of display–but I found that not only could most modern graphics cards power the display, the monitor was indeed the real deal.

Now, I realize that not everyone has a $700 budget to spend on a monitor. But, did I mention this thing is actually a 4K TV as well? $700 (now $400) for a 4K television is not a bad deal at all, especially if you can convince your spouse to let you put it in the living room. I also did the math and found that if I wanted to purchase four 1080p displays, the mounting arm to hold them, and the extra video card to power all four monitors, it would cost quite a bit more than the single Seiki SE39UY04 and take up quite a bit more space.

So, if you’ve got the cash and you are looking to be drowned in pixels, go for it–you won’t be disappointed, unless you are planning on doing some serious heavy-duty gaming with it (and even then, I thought it looked plenty good for my tastes).

Leaky Abstractions Are Holding Us Back

Let’s just get right into it, shall we?

What is an abstraction?

Before we can talk about leaky abstractions, and why they are bad, let’s define what exactly an abstraction is.

An abstraction is when we take a complicated thing and we make it simpler by generalizing it based on the way we are using or talking about that thing.

We do this all the time. Our brains work by creating abstractions so that we can deal with vast amounts of information.


We don’t say “my Dell Inspiron 2676 with 2 GB of RAM and a 512 GB hard drive, serial number 12398ASD.” Instead we say simply “my computer.”

In programming, abstractions let us think at higher levels. We create abstractions when we give a name–login–to a bunch of lines of code that get the username and password out of a textbox and log the user into our system. And we use abstractions when we utilize APIs or write to the “file system.”
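To make that login example concrete, here is a minimal sketch; the repository and user types are hypothetical stand-ins for your real UI and data access code.

```csharp
// A minimal sketch of the "login" abstraction: several low-level steps
// (hashing, lookup, session state) get a single name. The types here are
// hypothetical stand-ins.
using System;
using System.Security.Cryptography;
using System.Text;

record User(string Username);

interface IUserRepository
{
    User FindByCredentials(string username, string passwordHash);
}

class LoginService
{
    private readonly IUserRepository _users;
    public User CurrentUser { get; private set; }

    public LoginService(IUserRepository users) => _users = users;

    // Callers just say "log in"; they never see hashing or database lookups.
    public bool Login(string username, string password)
    {
        CurrentUser = _users.FindByCredentials(username, Hash(password));
        return CurrentUser != null;
    }

    private static string Hash(string password) =>
        Convert.ToBase64String(SHA256.HashData(Encoding.UTF8.GetBytes(password)));
}
```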

Without abstractions we’d have a really hard time managing the complexity of any software system. Without abstractions, it would be pretty difficult to write any kind of non-trivial application.

If you want to feel what it is like to program with fewer abstractions, try serving up a web page with assembly code. If you want to feel what it is like to program with almost no abstractions, do the same exercise, but write your code in machine code… 0s and 1s.

How do abstractions “leak?”

Abstractions leak when you have to understand the lower level concept in order to use the higher level concept.

This means the details of the lower level concept that are being abstracted away are leaking up through the higher level concept.

Ever get a cryptic error message when an application you are using crashes? That is a leaky abstraction.

“Memory location 00FFE133 could not be written to” is just bad news. It means that in order to understand what is going wrong with your application, you now have to understand its inner workings. Your abstraction has effectively “leaked.”

Abstractions leak whenever you are forced to pull back the curtains and see the gnomes running in hamster wheels spinning the cogs.
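In code, sealing a leak usually means translating the low-level failure into the caller’s vocabulary instead of letting it bubble up raw. A small sketch:

```csharp
// Two versions of the same operation: one leaks the low-level details,
// one translates them into terms the caller actually understands.
using System;
using System.IO;

class SettingsStore
{
    // Leaky: the caller gets a raw IOException full of file-system details
    // they can do nothing about.
    public string LoadLeaky(string path) => File.ReadAllText(path);

    // Sealed: the failure is reported in the abstraction's own vocabulary,
    // with the original exception preserved for diagnostics.
    public string Load(string path)
    {
        try
        {
            return File.ReadAllText(path);
        }
        catch (IOException ex)
        {
            throw new InvalidOperationException(
                "Your settings could not be loaded. Try restoring the defaults.", ex);
        }
    }
}
```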

Why “leaking” is bad

I’m often surprised when I hear someone say “I like how you didn’t hide the details of what is really going on” about some code or some technology.

Not being able to hide the details of what is really going on points to an inability to create the proper abstraction, which, in my opinion, should be watertight.

It is not a good thing that we can see what is really going on. No, it is actually a very bad thing, because it forces us to think at the higher level some of the time but switch gears and drop down to the lower level at other times.

Leaky abstractions prevent us from comfortably driving the kids to school; we have to constantly look in the rear view mirror to check whether those rowdy kids who insist on sitting in the back are causing trouble again.

A good abstraction hides the details so well that we don’t ever have to think about them. A good abstraction is like a sturdy step on a ladder: it lets us build another step above us and climb higher. A leaky abstraction is like a wobbly, rotted wooden rung; put too much weight on it and you’ll fall through. It traps you at the current step–you can’t go any higher.

Shouldn’t we understand how things really work?

No! Stop already! Enough with the guilt trips and unnecessary burdens. You are defeating the purpose of abstractions and stunting further growth.

You only “need” to know the details if you want or need to know the details.

I know that many people want to know how things really work, myself included for many things, but in many cases I don’t need to understand how things really work.

For example, I can buy and use a watch without understanding how the watch works. It really is better that way. I don’t need those details floating around in my head. I get no benefit from knowing what is going on inside my watch and frankly, I don’t want to know.

Personally, I have a similar view on cars, although I know that many other people don’t, so that is why I didn’t use it as my first example.

When I buy a car, I just want to drive it. I don’t want to remember when to rotate tires, check fluids, etc.–that’s a waste of my time. I don’t want to know how the engine works or anything about it beyond the fact that I have to put gas in, turn a key, and press a few pedals to make the thing go. I know I could save money servicing my own car, but you know what? I’d rather pay money than have to do it myself. And since that is the case, as long as I can find someone I trust to work on my car, I don’t want that abstraction to leak.

The same goes for machine code and compilers. Yes, it is interesting to understand how memory mapping and CPUs and logic gates work, and how my C# code gets compiled into MSIL, which the CLR then executes, and so forth and so on, until it ends up as electrons operating on registers in my CPU. I learned how much of that works because I was interested, but let’s be completely honest here: that abstraction is really good. It is so good and so air-tight that I don’t need to know any of that in order to write a web app. If I did, I’d probably find another profession, because my brain would explode every time I tried to write the simplest piece of code.

By the way, if you do want to understand what is happening behind many of those abstractions, I highly recommend Charles Petzold’s book: “Code: The Hidden Language of Computer Hardware and Software”.  (This is the kind of book you read for “fun,” not because you need to know this information.)

Leaking abstractions in technology

Too many technologies and frameworks today are leaky, and this is bad–really bad.

Want an example? Whatever happened to ORM (Object Relational Mapping)?

Nothing. Nothing happened. It was invented, oh, like a thousand years ago, and while lots of other technologies and frameworks grew up, got jobs and left the nest, ORM is still a fat, unemployed, balding 30-year-old playing WoW all day in his mum and dad’s basement.

Don’t get me wrong. I use ORMs; I sometimes like to think I like ORMs. But I can’t use them well, because I have to understand SQL and what is going on in my relational databases anyway. What good is that?!

Seriously, this is a completely failed abstraction. Its leakiness actually makes it more work to learn an ORM framework, because in learning one you must also learn the hairy details of SQL and understand what N+1 means and so on. You can’t just fire up your ORM and put in the Konami code; you have to blow on the cartridge and fiddle with the reset button half a dozen times. Useless.
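If you haven’t run into N+1 before, here is a minimal sketch of the problem, assuming an Entity Framework-style ORM with lazy loading; the Customer and Order model is made up for illustration.

```csharp
// The classic N+1 leak, sketched with a hypothetical Entity Framework-style
// model: one query loads the customers, then lazy loading quietly fires one
// more query per customer the moment Orders is touched.
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual List<Order> Orders { get; set; } // lazy-loaded navigation
}

public class Order { public int Id { get; set; } }

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

class Demo
{
    static void Report(ShopContext context)
    {
        // Looks like one loop; actually 1 + N queries with lazy loading on.
        foreach (var customer in context.Customers.ToList())
            Console.WriteLine($"{customer.Name}: {customer.Orders.Count} orders");

        // The fix only makes sense once you understand the SQL underneath:
        // eager-load the join so everything comes back in one round trip.
        foreach (var customer in context.Customers.Include(c => c.Orders).ToList())
            Console.WriteLine($"{customer.Name}: {customer.Orders.Count} orders");
    }
}
```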

The other major example that is holding us back today is MV* in JavaScript. Look, I’m not trying to bash JavaScript this time. I’m actually starting to like JavaScript, but it is total bullcrap that in order to write a single page web application today, you have to understand 500 other technologies and how JavaScript works under the covers. If we are going to do MV* in JavaScript–and I am actually starting to become a believer in this–then we’ve got to seal up the abstraction and put what we need into the language, the DOM or the technology stack itself. Bolting things on to create abstractions that are ultimately leaky works for now, but it is like that rotten, creaky step in the ladder: eventually we are going to fall through it, and we aren’t going to be able to climb any higher until we replace that step.

I used to be convinced replacing that step meant getting rid of JavaScript and using something else superior that does what we want, but maybe it just means making JavaScript and browser APIs themselves evolve. I don’t care how we do it, but we aren’t going to advance until we do.

I could go on and on with examples of leaky abstractions in technology and how they are holding us back, but I think you get the point by now.

Leaky abstractions are bad. They don’t do anything good for us; they just require us to know more and add more points where things can go wrong. We’ve got to seal up these abstractions if we want to keep climbing the software development ladder. You can’t keep building a tower if you have to worry about every single block beneath you. You eventually get to a point where you can’t keep it all straight, and your tower comes crashing to the floor.

Leaky abstractions and you

So, what can you do about it?

Not much as far as ORMs and JavaScript. But, next time you write some code, seriously think about whether or not you are creating a leaky abstraction.

You really have to make a choice. Am I going to hide the details so that no one needs to know about them, or am I going to expose the whole thing? Either make a really good abstraction that doesn’t require someone using your code to understand what is going on underneath that abstraction or don’t attempt to make an abstraction at all.

It takes more effort to create good abstractions. Sometimes the effort required is orders of magnitude greater. But, if other people are going to use your abstraction, saving them the mental burden of understanding what is happening beneath it will go a long way toward making up for your extra effort.

10 Steps to learn anything quickly course

By the way, I just released my 10 Steps to Learn Anything course.  If you want free access to it, just sign up for my newsletter here.  It is a totally free video course, with an optional workbook, that reveals the process I use to quickly learn technologies.

Developing For Leap Motion in C#

I know I just posted about my last course.  But, I’ve got another one out that I did jointly with my Get Up and CODE cohost and Swedish coder, Iris Classon.

Developing for Leap Motion in C#

This course was lots of fun.  Iris and I actually pair programmed over Skype using Join.me to share our screens for part of the course.

I really think devices like the Leap Motion have a large amount of potential, and it is just a pretty cool technology.  Not too hard to program for, either.


Here is the official description:

In this course, you’ll learn how to create a complete WPF application with the Leap Motion controller. The Leap Motion is a new type of user interface device that allows for very precise tracking of up close motions. The Leap Motion opens up the possibility for creating completely different kinds of applications that are able to be controlled by fingers, hand gestures and even tools like a pencil.

This course will teach you everything you need to know to get started developing applications for Leap Motion in C#. First, you’ll learn a bit about motion tracking in general and how the Leap Motion works.

We’ll discuss how the Leap Motion device differs from many other motion tracking technologies.

Then, we’ll go through the basics of the Leap Motion itself, and you’ll learn how to get started and set up your development environment for developing a Leap Motion application.

After that, we’ll take you through the process of creating a real WPF application that uses the Leap Motion controller for tracking movement.

You’ll learn how to use the Leap Motion SDK to create code to track individual finger movements and gestures and how to map those movements to screen coordinates to control an object on the screen.

Finally, we’ll take you through the process of bringing your Leap Motion application to the masses as we show you how to deploy your application to the Leap Motion Airspace store.

By the end of this course, you’ll have built a complete application that can be controlled with a Leap Motion controller.
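As a taste of what that tracking code looks like, here is a rough sketch against the version-1 Leap C# SDK (type and member names may differ in later SDK releases); it maps the frontmost fingertip to 2D screen coordinates, as the course description mentions.

```csharp
// A rough sketch of mapping a fingertip to screen coordinates with the
// Leap C# SDK (v1-era API; names may differ in your SDK version).
using Leap;

class PointerMapper
{
    private readonly Controller _controller = new Controller();

    // Returns the frontmost fingertip as (x, y) in pixels for a display
    // of the given size, or null when no finger is in view.
    public (float X, float Y)? GetScreenPoint(int screenWidth, int screenHeight)
    {
        Frame frame = _controller.Frame();
        if (frame.Pointables.IsEmpty) return null;

        Pointable pointable = frame.Pointables.Frontmost;

        // The interaction box normalizes the raw 3D coordinates into a
        // 0..1 cube, which is easy to project onto a 2D screen.
        InteractionBox box = frame.InteractionBox;
        Vector normalized = box.NormalizePoint(pointable.StabilizedTipPosition);

        float x = normalized.x * screenWidth;
        float y = (1 - normalized.y) * screenHeight; // screen y grows downward
        return (x, y);
    }
}
```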

Time Traveling To The Future Of User Interfaces

I really dislike using a keyboard and a mouse to interact with a computer.


Using a mouse is a universal skill–once you learn to use one mouse, you can use any mouse.  But keyboards are often very different, and it can be frustrating to try to use an unfamiliar one.

When I switch between my laptop and my desktop keyboard, it is a jarring experience.  I feel like I am learning to type all over again.  (Of course, I never really learned to type, but that is beside the point–my three finger typing style seems to work for me.)


When I switch to a laptop, I also have to contend with using a touchpad instead of a mouse most of the time.  Sure, you can plug in a mouse, but it isn’t very convenient, and you can’t do that everywhere.

I also find that no matter how awesome I get at keyboard shortcuts, I still have to pick up that mouse or use the touchpad.  Switching between the two interfaces makes it seem like computers were designed for three-armed beings, not humans.

Even when I look at a laptop, it is clear that half of the entire design is dedicated to the keyboard and touchpad—that is a large amount of wasted space.

I’m not going to say touch is the answer

You may think I am going in the direction of suggesting that tablets solve all our problems by giving us a touch interface, but that is not correct.

Touch is pretty awesome.  I use my iPad much more than I ever thought I would.  Not having the burden of the keyboard and mouse or touchpad is great.

But, when I go to do some text entry on my tablet or my phone, things break down quite a bit.

On-screen keyboards are pretty decent, but they end up taking up half of the screen and the lack of tactile feedback makes it difficult to type without looking directly at the keyboard itself.  Some people are able to rely on autocorrect and just let their fingers fly, but somehow that seems dirty and wrong to me, as if I am training bad habits into my fingers.


Touch itself is not a great interface for interacting with computers.  Computer visual surfaces are flat and lack texture, so there is no advantage to using our touch sensation on them.  We also have big fingers compared to screen resolution technology, so precision is also thrown out the window when we relegate ourselves to touch interfaces.

It is completely silly that touch technology actually blocks us from viewing the part of the screen we want to touch.  If we had greaseless pointy transparent digits, perhaps touch would make the most sense.

Why did everything move to touch then?  What is the big thing that touch does for us?

It is pretty simple: the only real value of touch is to eliminate the mouse or touchpad and the keyboard.

Not convinced?

I wasn’t either, till I thought about it a bit more.

But, consider this… If you were given the option of either having a touch interface for your tablet, or keeping the mouse-like interface, but you could control the mouse cursor with your mind, which would you prefer?

And that is exactly why touch is not the future; it is a solution to a specific problem: the mouse.

The real future

The good news is that many entrepreneurs and inventors agree with me, and they are currently building new and better ways for us to interact with computers.

Eye control

This technology has some great potential.  As the camera technology in hardware devices improves, along with their processing power, the possibility of tracking eye movement to essentially replace a mouse is becoming more and more real.

There are two companies that I know of pioneering this technology, and they have some pretty impressive demos.

TheEyeTribe has an “EyeDock” that allows for controlling a tablet with just your eyes.


They have a pretty impressive Windows 8 tablet demo which shows some precise cursor control using just your eyes.

Tobii is another company that is developing some pretty cool eye tracking technology.  They seem to be more focused on the disability market right now, but you can actually buy one of their devices on Amazon.

The video demo for PCEye freaks me the hell out though.  I don’t recommend watching it before bed.

But Tobii also has a consumer device that appears to be coming out pretty soon, the Tobii REX.


Subvocal recognition (SVR)

This technology is based on detecting the internal speech that you are generating in your mind right now as you are reading these words.

The basic idea is that when you subvocalize, you send electrical signals that can be picked up and interpreted.  Combined with speech recognition, this would allow a person to control a computer just by thinking the words.  Once the technology improves, it would be a great way to do text entry and replace a keyboard, on screen or off.

NASA has been working on technology related to this idea.

And a company called Ambient has a product called Audeo that is already in production.  (The demo is a bit rough, though.) You can actually buy the basic kit for $2,000.


Gesture control

You’ve probably already heard of the Kinect, unless you are living under a rock.  And while that technology is pretty amazing, it isn’t exactly the best tool for controlling a PC.

But, there are several other new technologies based off gesture control that seem promising.

There are two basic ways of doing gesture control.  One is using cameras to figure out exactly where a person is and track their movements.  The other is to use accelerometers to detect when a user is moving a device (an example would be the Wii Remote for Nintendo’s Wii).

A company called Leap is very close to releasing a consumer targeted product called the Leap Motion, which they are pricing at only $79.  They already have plans to sell it in Best Buy stores, and it looks very promising.

Another awesome technology, which I already pre-ordered because I always wanted an excuse to wear bracers, is the MYO: a gesture controlled armband that works through a combination of accelerometers and sensors that pick up the electrical impulses in your arm.


What is cool about the MYO is that you don’t have to be right in front of the PC, and it can detect gestures like a finger snap.  Plus, like I said, it is a pretty sweet looking armband–Conan meets Blade Runner!

Obviously, video based gesture controls won’t work well for mobile devices, but wearable devices like the MYO that use accelerometers and electrical impulses could be used anywhere.  You could control your phone while it is in your pocket.

Augmented reality and heads up displays

One burden of modern computing that I haven’t mentioned so far is the need to carry around a physical display.

A user interface is a two-way street, the computer communicates to the user and the user communicates to the computer.

Steve Mann developed a technology called EyeTap all the way back in 1981.  The EyeTap was basically a wearable computer that projected a computer generated image onto your eye, on top of what you were viewing.

Lately, Google Glass has been getting all the attention in this area, as Google is pretty close to releasing its augmented reality eyewear that will let a user record video, see augmented reality, and access the internet using voice commands.

Another company you may not have heard of is Vuzix; they have a product that is pretty close to release as well, the Smart Glasses M100.

Brain-computer Interface (BCI)

Why not skip everything else and go directly to the brain?

There are a few companies that are putting together technology to do just that.

I actually bought a device called the MindWave from NeuroSky, and while it is pretty impressive, it is still more of a toy than a serious way to control a computer.  It is basically able to detect different brain wave patterns: it can detect concentration or relaxation.  As you can imagine, this doesn’t give you a huge amount of control, but it is still pretty fascinating.

I haven’t tried the EPOC neuroheadset yet, but it has even more promise.  It has 14 sensors, which is a bit more intrusive, but it can supposedly detect your thoughts regarding 12 different movement directions, as well as emotions, facial expressions, and head rotation.

So where are we headed?

It is hard to say exactly what technology will win out in the end.

I think we are likely to see aspects of all these technologies eventually combined, to the point where they are such a ubiquitous part of computer interaction that we forget they even exist.

I can easily imagine a future where we don’t need screens, because we have glasses or implants that directly project images on our retinas or directly interface with the imaging system in our brains.

I can easily see us controlling computers by speech, thought, eye movement and gesture, seamlessly, as we transition between different devices and environments.

There is no reason why eye tracking technology couldn’t detect where our focus is and we could interact with the object of our focus by thinking, saying a command or making a gesture.

What I am sure of though is that the tablet and phone technology of today and the use of touch interfaces is not the future.  It is a great transition step to get us away from the millstone around our necks that is the keyboard and mouse, but it is far from the optimal solution.  Exciting times are ahead indeed.

If you like this post, don’t forget to follow @jsonmez or subscribe to my RSS feed.

My Virtualization Saga Continues

When I last talked about my virtualization strategy for my dev PC, I had built a new PC with new hardware, but didn’t have much of a need for portability.

Lately my need for portability has increased a bit, so I got an opportunity to put my virtualization strategy to the test.

The problem

My big problem at this point was how to get my virtualized dev PC to be portable.

I recently purchased a 15” MacBook Pro with Retina display so that I could have one portable machine to use for my workshops and speaking on cross platform mobile development, but I wanted to be able to run my primary dev machine on it as well.

When I first got the MacBook, I thought I would just copy my VMWare image to the MacBook and go, but I soon found out that copying 100 gigs over my wireless connection was not exactly a quick task.

I also wanted to be able to create a few other virtual machines with different setups that I could transfer easily back and forth between my desktop PC and my MacBook.

If it was going to take me several hours to copy the VMs back and forth, this was going to be a real pain.

I also realized that my MacBook had pretty limited disk space since I had only opted for the 256GB storage.

USB 3.0 changes things

Now that USB 3.0 is fairly prevalent, the idea of using an external hard drive as a virtualized PC is much more feasible.

I decided that the best way to be portable would be to buy a decently sized SSD drive and put it in a USB 3.0 hard drive enclosure.


With SSD prices coming down so much, it only cost me about $80 to get a latest generation 128GB SSD drive.

I put this SSD drive in a USB 3.0 enclosure and suddenly I have the ability to take my development PC anywhere I want.  All I need to run my dev PC is a computer with a USB 3.0 connection and a copy of VMWare player.

This ended up working out great.  Now I could have multiple VMs on different USB 3.0 hard drives that I could just swap out as I needed.

I could also very easily copy VMs back to my desktop if I wanted to be able to run them from there.

The test

I had my first true test of the concept this last week when I went to Austin for the .NET Rocks Road Trip show.

I was presenting there and doing a live recording for The Tablet Show, but I still wanted to be able to work during my downtime.

I brought my MacBook and my external hard drives with me and I was able to successfully run my full development workstation right off my USB 3.0 hard drive just like I was sitting at my home office.

I had a few wireless connectivity issues, so I will probably add an Ethernet adapter to my bag for use with my MacBook in the future, but other than that, it worked out perfectly.

I finally have a really good reason to virtualize my development environment.

The future

Even though I’m pretty happy with this setup, I still want something better.

In the future, I’d like to be able to pull down the differences in my virtualized environment from the cloud.  There is no real reason to be carrying around and copying around multiple copies of the same old operating system on a disk image.

I should be able to just store what is different in my workstation and I should be able to get those differences from the cloud instead of a USB drive.

I’d also like to not have to install my apps, but rather be able to stream them to my PC as I use them.

I think we’ll eventually get there, but for now I’m happy to carry around my USB 3.0 hard drives and have a completely portable workstation for whatever task I need.

Why My Kid is an iKid

Sophia got her first introduction to the iPad at about 3 months old.

As soon as she could sit in a rocker chair my wife and I let her start playing on the iPad. 


We started off with just one game, Interactive Alphabet by Piikea.  It is basically a game that goes through the alphabet and lets the baby interact with some of the pictures.

We added a few more ABC-type games as she got a bit older, but we mainly just let her play with that one game, because we figured it would be great for her to start seeing letters and learning the alphabet as early as possible.

Right from the get-go she would swat at the screen.  She didn’t immediately understand the cause and effect, but she quickly grasped the idea that when she hit the screen, something would happen.

After a while, she became pretty good at doing the simple things in the ABC game.  She would still swat the screen, but she would purposefully swat certain areas in order to do something, like build a sandcastle.

Around 12 months, we started adding a bunch more apps.  We added some interactive books and a couple of simple games.


Sophia was learning how to do many more things in the apps.  She could point with a couple of fingers and very purposefully touch certain areas of the screen. 

She really didn’t have any concept of touching and dragging, though, and would often lean one hand on the iPad, which kept the other hand’s touches from registering.

She’s now 18 months and she is an iPad master.


Sophia can now:

  • Turn on the iPad
  • Unlock the iPad
  • Pick which app she wants to play out of her folders
  • Use the home button to exit an app
  • Double press the home button to switch to a recent app
  • Navigate through menus in apps and get back to the app
  • Use the table of contents in books to pick the page she wants

She also asks for the iPad by name.  She has about 40 apps on the iPad that she commandeered from my wife.  It seems like she is learning something new every day now.

The world is changing

Our children, especially the youngest ones, are growing up in an entirely different world than has existed ever before.


I know this has been said many times before and it could be argued that my generation also grew up in an entirely different world than my parents, but I think the change we are seeing now is much more substantial.

I predict that this generation will be known as the tablet generation.  With Windows 8 now released, we are going to see a rapid decline of non-touch devices.  In a few years, all laptops will be touch screen retina displays.

There are some fundamental changes going on in how we interact with computers and even what defines a computer.

Yes, I know you’ve heard all this before, but why is this important?

It is important because the real shift I see is the shift between a primarily analog focused world view to a primarily digital focused world view.

For me the iPad or the computer is an attempt to replicate some process or experience in the real world.  No matter how long I work with computers or use these devices, I cannot escape my world view.  Analog always comes first.

For our children things are different.

I can’t say for sure that picking up a pencil and being able to write will even be a necessary skill.

It is very likely that this coming generation will view things through the digital lens first and the analog world will be secondary.

I don’t mean they’ll be jacked into a computer all day and live in a virtual world, but I do think that while we try to relate software to tangible things, the coming generation is likely to view software as primary and tangible objects as secondary.

Think about music.  Ever had an 8-track?  How about a cassette tape?  A CD, anyone?

How do we think of music today?  One word comes to mind—MP3.

What started out as a physical record eventually lost its purpose and is now so heavily digital that we tend to think in terms of the digital and don’t even consider the tangible anymore.

The same thing is currently happening with books, movies and to some degree money.

Why we let Sophia be an iKid

With the changing world, computer literacy is more important than ever before.

Even in the world we live in now, it is just about impossible to get any kind of non-labor intensive job without being able to use a computer.

If computer literacy is arguably going to be the most important skill for anyone to have in the future, why not start as young as they start to show an interest?

I think it is a huge asset to develop in our children the ability to use a computer as easily and mindlessly as the ability to eat with a fork and a spoon.

I wish I had that ability. I could be so much more efficient if I would stop writing down lists on pieces of paper and instead pull up my iPad or other tablet to jot down ideas and completely replace paper in my life.

And sure I could learn to wean myself off of the analog world, but I want my daughter to be able to think first in the digital world.  She’ll be way more efficient and see things from a better perspective than I ever will.

Aside from that, my wife and I find that the iPad is an excellent learning tool to help Sophia learn to learn.

There are so many things she is able to teach herself using that iPad.

She already:

  • Has a vocabulary of over 100 words
  • Can count to 4 in order and count actual objects
  • Can say most of her ABCs
  • Can recognize most letters
  • Can name many animals and objects


Much of what she knows she learned at her own pace based on what she was interested in playing on the iPad.

For example, one week she’ll be playing many of the numbers apps.  For a whole month she just wanted to do alphabets.

The iPad gives her the freedom to be able to choose what she wants to learn and to do it effortlessly.  She is developing the skills to be able to self-educate.  Sure, we still read books to her and try to teach her, but she seems to get a large amount of her knowledge from what she learns playing on the iPad.  (At least the reinforcement of what she has learned.)

Overall I don’t think there is any reason to stop her from playing on the iPad.  I know some people equate it to TV, but I think it is fundamentally different.  The apps she plays on the iPad are interactive.  You can’t mindlessly sit and watch the iPad.  Instead, there is a constant feedback loop that is not present with TV.

Also, we can carefully monitor the apps she uses.  The TV is an open system that brings unknown content into your house, whereas the iPad can be used as more of a closed system.

To summarize, I think we are preparing her for the future and giving her a huge head start in life.

How to get started

So you may be wondering how to best go about getting your baby or toddler started with the iPad.

While I’m not a child development expert, I can give you some advice from what my wife and I have learned in this process.

I’d start by picking up a used first generation iPad and a good case.

You can, of course, get a newer iPad, another tablet, or the iPad mini–just be aware of two things.

  1. Babies don’t have very precise coordination with their hands, so small screens are going to be hard for them to use.
  2. Babies tend to throw things, especially when they get frustrated.

The next thing you need is apps.  My wife, Heather, wrote up this section for me.  So, if you notice the grammar is perfect and it is written with a much higher skill level than my usual writing, that is why.

(Please let me know if you have some other ones appropriate for the ages.  I’d like to make a nice resource for other iKid believers.)

3 Months – 12 Months

  • Interactive Alphabet by Piikea.  This is by far the best app I’ve seen for the youngest of kids. It has a baby mode which prevents babies from exiting by accidentally batting a menu button and most of the items respond to simple taps or swipes.
  • Juno’s Musical ABCs by Juno Baby.  This app also goes through the alphabet but with a musical theme. The interactions aren’t as neat as the Piikea app and the button to return to the menu is prominent and easily pressed.
  • Peekaboo Baby. This is my app.  Warning, it is very simple.  I was learning MonoTouch and wrote it in a day as an experiment.

12 Months to 18 Months

  • Dr. Seuss’s ABC and Green Eggs and Ham. These stories have autoplay, read-to-me, or self-reading features and will say the word of anything the child touches on the screen. There is actually an entire line of Dr. Seuss books, but I prefer these two. The ABC app is great because each letter is said multiple times. The Green Eggs app is my daughter’s favorite, and I suspect this is because so many of the words in this story (eggs, boat, house, mouse, car, train, etc.) are ones most 18-month-olds know. These books are a little long, so if you’re more interested in the stories, go with the Bright and Early Board Books instead of these apps. The Mercer Mayer Little Critter books are also available and tend to be shorter in length.
  • I Hear Ewe. This neat little app has three screens of picture tiles: two of animals, one of vehicles. When a tile is touched, it says: "This is the sound a [insert animal or vehicle here] makes." I like this because it doesn’t require page navigation. A child can sit and do this for a short period, and when they get bored, you can switch the screen for them. Sophia plays this occasionally at 18 months, but it doesn’t hold her interest as much, so I suggest trying it at a little younger age.
  • Pat the Bunny by Random House. There are both paint and interactive options in this app. The paint option seems to crash constantly, most likely due to the mad tapping of a toddler, so I avoid it. The read option has a bunch of items on the screen that kids can interact with (turn off a light, put shave gel on daddy’s face, wave bye-bye, play peekaboo, etc.). I’ve never seen the real book, but I wouldn’t be surprised if this app is better than the book. Changing screens is manual and may require adult help. There is an obnoxious Easter egg on every page that brings up the bunny.
  • Princess Baby by Random House. I was actually disappointed there wasn’t more to this app, but Sophia has played it enough that it makes the list. It begins by having you “Choose your favorite princess.” Each princess has 3 toys that can be interacted with in a very limited way: wand, drum, ball, flower, blocks, cat. The princess can be put to bed, which Sophia likes doing over and over and over again.

18 Months +

  • The Monster at the End of This Book.  Starring your lovable, furry pal Grover from Sesame Street, this app has a very cute storyline. In order to advance through the book, certain tasks must be performed, such as touching knots to untie the page or knocking down bricks. This is another one where the app may be better than the book itself. One bonus: the pages are locked while Grover is talking, which keeps an eager toddler from advancing too quickly. My daughter loved this book early on, but I had to help her with some of the action pages; it was only recently that she started doing it all on her own.
  • Another Monster at the End of This Book.  Starring Grover and Elmo, this one has tasks that are a little trickier than in the first book (matching colors, wiping away glue), but did I mention it has Elmo?
  • Little Fox by GoodBeans. This is one of my favorite apps. It has 3 different songs to choose from, and each has its own scene: London Bridge Is Falling Down, Old MacDonald, and The Evening Song. Each scene is cleverly interactive and entertaining. Old MacDonald has 4 seasons to select from, and the interactions change based on the season. There is also a little "fox studio" with a ton of interactive objects used to make music.
  • Nighty Night by GoodBeans.  Adorable. The animals at the farm house need to go to sleep. This is done by clicking on the area each animal resides in and turning off the light. The animals respond to touch. Additional animals can be purchased (2 sets of 3 animals each).
  • Itsy Bitsy Spider by Duck Duck Moose.  Another fantastic app; this may be the one Sophia has clocked the most time with. In order to progress through this app, you must tap on the spider. Each time the spider is touched, one line of the song is sung and the spider moves. There is a lot to interact with at each spot, and on the second time through the song there are decorated eggs the child can collect on the spider’s back. There is a cute little narrator fly that teaches the child about items the child taps on (e.g., clouds, the sun, rainbows).
  • Ewe Can Count.  This is a cute counting game where you count a random number of sheep, horses, apples, etc. There is a learning and a quiz mode.
  • Logic Lite.  This app is great because it teaches the complicated touch-and-drag gesture. The full version has three additional tile sets: Numbers (match dots to the written number), Pictures (match a picture that contains a shape to the shape it contains), and Letters. The letters are great at 18 months, but the other two are too complex.

Your mileage may vary

Having your little one use an iPad might not work out as well as it has for us, so I think it is only fair to disclose some of the circumstances of our life that may have helped make our experience successful.

  1. My wife is a stay-at-home mom.  She used to be a techie, but she left the digital world to raise our daughter.  I only bring this up because she interacts with Sophia all day.  If we were putting Sophia in day care, I would be more hesitant to give her the iPad during our interactive time with her.  (But I would probably try to get the day care to let her use it.)
  2. We have almost zero TV in our house.  I don’t watch any TV or movies at all.  My wife very rarely watches TV, and Sophia never does.  I think this is important, because if she were watching TV, I would also be a bit more hesitant to let her play with the iPad as much.
  3. We do LOTS of other activities.  Just about every day of the week she has swimming, gym class, a play date, or something else going on.  My point here is that she gets plenty of outside time, social interaction, and physical activity.
  4. Sophia took to the iPad right away.  We didn’t have to force it on her or even encourage her to use it.  I don’t know if other kids are like this or not, although I suspect most would be.

So doing the same thing my wife and I are doing might not be the best for your family (you’ll have to decide for yourself), but as far as our daughter is concerned, the experience has been overall positive and beneficial.

Virtualization Experiment Reloaded

Well, I tried the virtualization experiment again.

It worked out much better, but it still isn’t quite there.

This time though, I am going to stick with it.

The hardware

It was time for me to get new development PC hardware.  I went with a desktop again instead of a laptop, simply because I can get so much more power for the money and I like to have 4 monitors.

Yes, there are laptop configurations that can get you four monitors, but none of them are very nice, and portability is only a concern 1% of my time.  Even half of that time, I can just use any crappy laptop to remote desktop into my PC, so it just doesn’t make much sense to use a laptop as my primary PC.

Before I get much hate mail about this, let me say “your needs may be different, and I can respect that.”  Still, every developer should ask whether spending the same budget on a desktop PC plus a crappy laptop, instead of on a single laptop, would net more horsepower and meet the general case better.

My basic hardware for the new PC was this:

  • Intel Core i7-3930K CPU @ 3.2 GHz (6 cores, 12 threads)
  • 32 GB RAM
  • 2 x AMD Radeon HD 7700
  • 2 x Kingston HyperX 3K 128 GB SSDs in RAID 0

This was the first PC I did not build myself; it was built by iBUYPOWER.  (It turns out they can build a PC just as cheaply as I can, and they test everything to make sure it is all compatible.)

I just put the drives in myself after it was shipped to me, since I was able to get a better deal on those.

The plan

What I wanted for this development machine was for it to be easy to back up, portable, and isolated.

I essentially wanted to create a clean environment where all of my work stuff lived, without installing other things on it.  These were much smaller ambitions than my previous plan to create a virtual machine for each kind of development task and a separate virtual machine for my databases.

One neat thing about this plan was that I should technically be able to put my work PC on an external drive for transport and load it up on a laptop if needed, or on another PC if my hardware failed.

One additional part of the plan was to be able to seamlessly transfer from my old PC to the new hardware without even a day of downtime. Just create the virtual work PC on the old hardware, work on it for a few days, and when the new hardware arrives, transfer the VM.

I had also planned to use VMware’s Unity mode to let me seamlessly run apps from the VM outside of the VM.

The execution

It pretty much went flawlessly.

  1. I created the new VM on the old hardware.
  2. I worked on that VM for about 3 days.
  3. When the new hardware arrived, I set it up on one monitor and installed VMware.
  4. I then converted the old PC into a VM and backed it up, in case something went wrong or I had forgotten some data.
  5. Finally, I copied the VM to the new hardware and fired it up.

I didn’t end up having to waste a day of work switching hardware.

The reality

So, I am pretty sure I am sticking with this setup. It isn’t perfect, but it works well enough.

The good

  • Speed seems fine, minimal hit since I am on a RAID 0 with 2 SSDs.
  • Reboot work PC without going offline.
  • Not having to install crap apps that bog down work PC.
  • Clear separation of work and non-work. (Helps for staying focused.)
  • Easy backup of the work PC, just copy a file.  (CrashPlan does this for me automatically; see the sketch after this list.)
  • Easy file transfer to work PC and back.
  • VPN without going offline.
  • Chrome syncing is awesome.  This is the real use case for it.
  • Every USB device seems to work fine.
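
Since the whole work PC is just a set of files, the backup really is nothing more than a timestamped file copy.  Below is a minimal sketch of that idea in Python; the paths and the “WorkPC” folder name are made-up assumptions, and in practice CrashPlan just watches the folder and does the equivalent automatically.

    #!/usr/bin/env python3
    # Minimal sketch: back up a powered-off VM by copying its directory.
    # The paths and the "WorkPC" name are hypothetical, for illustration only.
    import shutil
    from datetime import datetime
    from pathlib import Path

    VM_DIR = Path(r"D:\VMs\WorkPC")    # where the VM's files live (assumed)
    BACKUP_ROOT = Path(r"E:\Backups")  # backup drive (assumed)

    # Timestamp each copy so backups from various points in time are kept.
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = BACKUP_ROOT / ("WorkPC-" + stamp)

    # Copy the whole VM directory (VMX, VMDKs, etc.). Do this only while
    # the VM is shut down, or the copy may be inconsistent.
    shutil.copytree(VM_DIR, dest)
    print("Backed up", VM_DIR, "->", dest)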

The bad

  • Unity is kind of crap.  It really needs some work.  It works sometimes, but never really well.  There are lots of issues: windows disappear, it gets stuck in Unity mode, and it is very hard to launch and find apps in the VM.  In my opinion, this whole feature needs to be reimagined.
  • Full-screening the VM to 4 monitors is a pain.  A ridiculous pain.  I have to hit the cycle-monitors button about 8 times before it cycles to the mode where the VM takes up all four monitors.  There is just plain no excuse for this.  VMware should have a setting to make a VM default to full screen across all 4 monitors.
  • The little status bar for controlling the VM in full-screen mode is nearly impossible to make appear.  I absolutely HATE this paradigm.  Remote desktop does this as well, except it has the opposite problem.  There is an easy fix here: hide the bar until I hit some key combination.  That would be much better than a mouse-position trigger.
  • Some slight video problems.  Applications like Join.me get pretty pissed off about me running a VM and using Unity mode.
  • I am only able to give the VM 8 virtual CPUs.  (It doesn’t seem to support all 12 threads.)
  • The VM takes up a lot of space on my SSDs.  It is a bit of a crunch with only about 20 GB left over, but it seems like I can manage.

Final thoughts

So virtualization has come a long way, but for the developer case at least, it still has a way to go.

I do think desktop hardware has come far enough along that you can get sufficient horsepower to host a virtual dev machine.  It is easy and cheap to get 32 GB of RAM and fast SSDs.  Hardware still isn’t quite fast enough to make running a VM completely seamless, but it is close enough that I don’t feel like I am taking much of a performance hit at all.

I would like to see technology like Spoon.net become much more mainstream.  This would reduce the need to install the same apps on both the non-virtualized PC and the virtualized PC.  The whole OS really needs to move to the cloud; applications shouldn’t need to sit on local drives (for the most part).

Virtualization software vendors still have quite a bit of work to do.  The software has gotten pretty good, but it still handles multiple monitors and seamless application integration badly.  When I am working on a virtual machine, or running apps that live on one, the process should be painless and easy.  I should not even realize I am using virtualization.

Just having one dev VM instead of 3 or 4 seems to have solved most of my complaints from my previous trial at virtualization.  There is something that just feels really good about actually shutting down your dev PC at the end of a work day and knowing that you have a good dependable backup of it from various points in time.

I’d also really like the ability to just boot from a VMware image.  I know you can boot from a VHD, but Virtual PC seems to be lacking in too many other areas to use it instead of VMware.  (Although perhaps I am wrong on this.  It might be worth an experiment.)

At this point, though, I don’t see myself going back to bare metal for my dev PC.  There are a few mild annoyances, but most of them are fixed by my simply clicking “cycle monitors” 8 times first thing in the morning to make my dev PC take up all 4 screens.  I also predict that I will eventually move to two retina displays.

Programmer Fitness Journey: A Lifestyle Change not a Diet

This is a little off topic from my usual posts, but enough tech people struggle with the same problems I do that I thought it would make an interesting post.

I’ve had an interesting fitness life.  I always seem to be swinging from one extreme to another when it comes to physical fitness.

In high school I ended up deciding to start lifting weights and play sports.  I gained about 50 lbs of muscle over the summer of my sophomore year.

When I was in college, I started an acting and modeling career on the side, eventually moving to Santa Monica when I signed on with an agency down there.  I even competed in a bodybuilding competition.

But over time, perhaps due to stress, long hours, and so many life changes, I ended up gaining weight.  I would continually hit a point where I would want to get back in shape and I would do some extreme form of dieting and exercise program.

I would bounce from a peak of 300 lbs to a low of about 210 lbs.

It seems every time I would lose the weight, I would eventually gain it back.

Lifestyle vs extreme diet

The problem really is a matter of lifestyle versus extreme diet.  One of the things I have always been good at is extreme discipline.  I can do the most extreme thing for a long time, but eventually that will wear anyone down.

I would do diets where I ate zero carbs for 2-3 months, eliminating anything that contained sugar, including fruit.  Or I would lift weights for 2+ hours each day.  Something like that is not maintainable over the long run, and when I finally burned out, I would swing the other way, undoing all that hard work.

I have finally gotten to the point where I’ve figured out that I need a healthy lifestyle instead of some kind of extreme diet.

What I am doing now

I am currently in the process of dropping from 270 lbs (around February of this year) to probably around 210 lbs or so, depending on body fat versus lean mass levels.

I’m currently down to 235 lbs, and this is how I’m doing it.

Walking while working

I bought a used treadmill off Craigslist.  I have a cheap laptop that I can use to remote into my main workstation, and I set up a nice high-resolution computer monitor on a wall mount by the treadmill.

I work from home, so it is easier for me to do something like this.  But I bet many people end up doing some work from home, checking email or something else, even if they have a regular office job.

I set a pretty simple goal of walking about 45 minutes each day while I am working.  I just set the treadmill to 2 MPH and an incline that has been steadily rising and is now at 7.

At this pace it is pretty easy to type and control the trackpad, but I usually try to time my treadmill sessions for when I am in meetings.  Think about all those wasted hours sitting in a chair during a meeting when you could be burning calories.  If you are in a meeting with me, chances are I am walking on the treadmill.

Walking while reading

I also set a goal for myself to read a technical book for 30 minutes a day.  No reason I can’t be walking on the treadmill while I do that.  So I grab my iPad and walk and read.  I end up burning a bunch more calories with really minimal effort.

Nutrition and portion size

I don’t do anything extreme anymore.  Basically, I just eat healthy most of the time.  I am also always aware of my portion sizes and limit them.  This is a major change from what I used to do, but it works so much better.  I am not grumpy all the time, and I am not cooking all the time.

Here is a typical day of meals for me.

Breakfast: An Egg McMuffin-style sandwich, which I make using a whole wheat English muffin, 1 egg, and a piece of light pepper-jack cheese.

Mid-morning snack: An apple or a piece of fruit.

Lunch: A turkey sandwich on light bread or whole wheat bread with light cheese and light mayo.  Some baby carrots.  Some fruit.

Afternoon snack: A handful of peanuts or a piece of fruit.

Dinner: Usually a portion of a whole roasted chicken, some frozen vegetables, and brown rice.  Sometimes I’ll grab a 6” Subway sandwich or something else instead; I just limit the portion size.

Dessert: A small, 100-calorie-or-so ice cream pack or pudding.

I’m not starving, and if I am hungry during the day, I grab something; I just make sure it is healthy.  Most of my nutrition information comes from articles I read on bodybuilding.com.  It is a great resource for learning about nutrition.

I’ll usually go out to dinner once a week, but I split an entrée with my wife.  Most restaurants serve portions that are way too big for one person.  I’ll also cook something on the weekends when I have more time.  If I am going to eat something that I know isn’t going to be healthy, I make sure that I eat less at other times during the day, and I make sure the portion size is small.

I am amazed how easy it is to control yourself when you are not starving from some extreme diet.

Running

I started running using a Couch to 5K program that has you running a 5K (3.1 mi) in 9 weeks.  I used an app on my Android phone to get started.

Since completing the program, I now run 3.1 miles three times a week.  To be honest, this is probably the hardest part of my routine, but getting a good cardio workout is pretty important, and three times a week isn’t all that bad.  It is something I will try to continue for the rest of my life.  It is a good habit and doesn’t take up much time.

Lifting

I just hit the gym at the clubhouse in my subdivision 3 times a week and lift for only about 20 minutes.  I used to do prolonged workout sessions of several hours a day, but I have found that 20 minutes is enough to maintain the muscle I have.  If I were trying to put on mass, I would probably boost that up a bit, but what I am doing now is something I know I could continue even after losing the weight.

Goals and philosophy

I set a goal that every two weeks I need to be 5 lbs lighter.  I check my weight every morning when I get up in order to keep myself on track.  Having a short-term goal like that, and knowing exactly what counts as success, makes it easier to see whether I am on track or not.

If I am not losing weight fast enough, I cut back a bit and increase some cardio to make sure I make it.  If I am losing weight too fast, I relax a bit.

Each small goal keeps me in check and propels me forward.  I also decided that if I missed my weight, then for every 1/3 lb I was over I would have to walk an extra mile each day until I made the weight, but so far I’ve never been over.
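
Just for fun, here is that penalty rule written down as a tiny function.  This is only a toy sketch of the arithmetic described above (one extra mile per 1/3 lb over the goal), not a real tracking tool, and the weights in the example are hypothetical.

    def penalty_miles(weight_lbs: float, goal_lbs: float) -> float:
        """Extra daily miles owed: one mile for every 1/3 lb over goal."""
        overage = max(0.0, weight_lbs - goal_lbs)
        return overage * 3.0  # 1 mile per 1/3 lb = 3 miles per lb

    # Example: weighing in 1 lb over a 235 lb checkpoint means 3 extra miles.
    print(penalty_miles(236.0, 235.0))  # -> 3.0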

The basic idea when I started this program was to make lifestyle changes that I could maintain even when I am not trying to lose weight.  I know exactly what I need to do each day, and it really isn’t that hard.  I don’t feel like I am on a diet program; I feel like I am just living my life and being healthy.

Oh, and I never eat fast food anymore.  Never.  It just isn’t worth it.

Merge Code In… Merge Code Out…

Merging is source control Kung-Fu.

I’ve seen many people get taken to the mat when trying to merge code.  Today, I’m going to give you a simple technique that can help save you the embarrassment of your favorite source control program kicking you right in the head.

Bring the plate to the food

Often as a kid, the table would be set and dinner would be ready.  I would try to take some food from the kitchen over to my plate on the table.  (Grab the hamburger and carry it over to the plate.)

My dad would often tell me,  “Bring the plate to the food.”

Which would mean that I would have to take the plate from the table.  Bring it to the kitchen.  Put the food on the plate.  Bring the plate back to the table.  Oh, what a hassle.

Less food ended up on the floor that way.  Now it seems obvious to me.  But, back then it didn’t.

So it is with merging

It is exactly the same way with merging.

Wise-man once say:

If you want to merge code to a location, you must first merge code from the location

Anytime you are about to merge code to some branch, always merge code from that branch first.

Let’s say that you created a branch off of your trunk.  You started working in that branch and you are done with whatever you are doing there.  You are ready to merge code back up to trunk.

  • First merge trunk to your branch
  • Resolve any conflicts
  • Test on your branch
  • Then merge your branch (which already has the trunk changes) into trunk; see the sketch below
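
To make the sequence concrete, here is a sketch of those steps driving git from Python; the same idea applies to Subversion or any other system, and the branch names “main” and “feature/widget” are made up for illustration.

    #!/usr/bin/env python3
    # Sketch of the "merge in, then merge out" sequence using git.
    # Branch names are hypothetical; adapt them to your repository.
    import subprocess

    def git(*args):
        """Run a git command, stopping the script if it fails."""
        subprocess.run(["git", *args], check=True)

    # Step 1: merge trunk INTO the feature branch first.
    git("checkout", "feature/widget")
    git("merge", "main")  # resolve any conflicts here, on your branch

    # Step 2: build and run your tests on the branch before touching trunk.

    # Step 3: merge OUT to trunk. Trunk's history is already in the branch,
    # so this merge is trivial and can't surprise anyone.
    git("checkout", "main")
    git("merge", "feature/widget")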

Why can’t I just merge to the destination?  Why merge in first?

It may seem like a bunch of overhead, but if you’ve ever merged to trunk or a release branch and broken it for everyone, then scrambled to try to fix it, you’ll probably see the benefit of making sure that all merges to release branches or trunk are trivial.

A trivial merge is a merge that your source control system can do automatically.  It doesn’t require human interaction.

If you merge in and then merge out, the merge out will always be a trivial merge.  So in reality you’re not adding any overhead at all.  You are just handling the possibly difficult merge on your branch as opposed to on the trunk or release branch.

Another important reason is that you want to be able to test your changes with the other changes that have happened in the system since you branched off.  Most of the time other changes will be happening at the same time you are making changes.

The only way to know what the interactions will be is to test them.  The best place to test them is on your local branch so that you don’t interfere with everyone else.

Derick Bailey provides an excellent detailed description of what I am talking about in his post on merging.  He calls it the merge dance.

My dev cave

Without further ado, here are the pictures of the dev cave I set up for my new job.

[Photo 1: my dev cave]

[Photo 2: my dev cave]