Retina display for Apple: Awesome for everyone

The internet is rife with gossip that the new iPad will have a retina display. According to MacRumors, who apparently found an LCD that fell off the truck, it’s going to have four times the resolution of the iPad 2, at 2048×1536, also known as QXGA.

I’ve gotta tell you, I’m really excited about this, and not just because I’m a big Mac fanboy. And you should be too.

Have you been monitor shopping lately? Yes? Then you know how much it sucks. Why? Because of the curse of 1080p.

A while back, someone decided that the “right” resolution for TVs was 1920×1080, and the best way to draw the picture was progressive scan (as opposed to interlaced), so 1080p became the prevailing TV resolution. Which was OK, but still not great. Somewhere along the way, though, things went off the rails…

Companies making monitors apparently decided, somewhere in the 2005-2010 range, that everyone wanted to watch movies on their computers, which is fair enough. I don’t only want to watch movies on my computer, but I would like to be able to. And apparently the monitor makers decided that’s ALL I would like to do, because over the past few years, pretty much every monitor you can buy at a decent price is 1080p.

Don’t believe me? Check out Pricewatch’s list of 27 inch LCD monitors. Lowest price is $264 as I write this, and it’s 1080p. The next nine are, too, until you get to this $900 beast, which runs 2560×1440. Say, that sounds pretty close to the iPad’s 2048×1536…

Yes, OK, I’ll admit it. The reason that I want the iPad to have a retina display is so that every other LCD maker in existence is shamed into making better screens. The ability to hold a $1,000 tablet that includes a 9.7″ screen pushing roughly 50 percent more pixels than any screen you can buy in the whole big box store down the road should be sufficient embarrassment to get them off of their butts and their laurels.
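
If you want the arithmetic behind that comparison, here’s a minimal Python sketch using the resolutions mentioned above (the labels are just mine):

    # Pixel counts for the displays mentioned in this post.
    displays = {
        "iPad 2 (1024x768)": (1024, 768),
        "1080p monitor (1920x1080)": (1920, 1080),
        "rumored retina iPad (2048x1536)": (2048, 1536),
        "$900 27-inch (2560x1440)": (2560, 1440),
    }

    for name, (width, height) in displays.items():
        print(f"{name}: {width * height:,} pixels")

    # Prints:
    # iPad 2 (1024x768): 786,432 pixels
    # 1080p monitor (1920x1080): 2,073,600 pixels
    # rumored retina iPad (2048x1536): 3,145,728 pixels
    # $900 27-inch (2560x1440): 3,686,400 pixels

That’s exactly four times the iPad 2’s pixels, and a bit over 50 percent more than 1080p.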

There’s absolutely no technical reason for this kind of stagnation, as Apple will hopefully show everyone. If it takes Apple to accomplish what NEC or ViewSonic should have been doing years ago, then so be it, but we need better displays, both in terms of resolution and acreage. There are only so many terminal windows that fit on 1080p, and I’m tired of having to add more screens to get the space I need.

Remedial thoughts make you question things…

I’m in the middle of writing a networking primer (for the 3rd time. Sigh.), and working through the “teaching binary” section has got me thinking about an old joke…

There are 10 kinds of people in the world.

Those who understand binary, and those who don’t.

Yes, hahaha, very amusing. You can even buy it on a shirt. But here’s what I just realized…it doesn’t work as a spoken joke at all.

While writing about binary and decimal, I’ve had to be very careful to pay attention in my sentences to whether I’m writing the word for a value or I’m writing a numeral. The numeral 10 is very different from the value ten. The joke works with “There are 10 types…”, but you can’t say “There are ten types…”, because 10 isn’t ten in binary. In binary, 10 is two.
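
Here’s a quick Python illustration of the point: the same written numeral “10” stands for a different value depending on the base you read it in.

    # The numeral "10" names a different value in every base.
    for base in (2, 8, 10, 16):
        print(f'"10" in base {base} is the value {int("10", base)}')

    # Prints:
    # "10" in base 2 is the value 2
    # "10" in base 8 is the value 8
    # "10" in base 10 is the value 10
    # "10" in base 16 is the value 16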

I may be the slowest horse here, but the distinction just hit me: a written-out word names a value, while a numeral like 10 only means something once you know the base. I am now retroactively aggravated at everyone I’ve ever talked to who pronounced “10” as “ten” when they meant anything except the decimal number 10.

Am I wrong?