Netflix Player by Roku – Internet TV done right

I just want to say that if the future of Internet TV is anything like the Netflix Player by Roku, we are going to be just fine. I was one of the lucky ones who ordered the Netflix Player by Roku right away and have had the opportunity to play with it for the last few weeks. I absolutely love my Netflix player box – unequivocally 🙂 If you haven't heard anything about the Netflix player, it is a little hardware device (box) that allows instant streaming directly to your TV over the Internet.

The box, made by Roku, is a $99.99 one-time purchase that connects to your existing broadband connection (wired or wireless) and lets you instantly watch content from the Netflix web site. It plugs into the same infrastructure at Netflix that lets you watch streaming movies and TV shows on your PC. The nice thing is that this is part of your standard Netflix membership and there are no extra monthly charges – instant streaming doesn't count against the flat-fee DVDs you receive. The Netflix/Roku box connects to any TV using HDMI, component, S-Video or good old composite (RCA), and you get full DVD video quality if your bandwidth permits.

I've had the pleasure of using this box and I have been completely and totally impressed with the design of the box, the software and the actual quality of the content being streamed. Setup/installation was incredibly easy, and I was able to get the box to connect to my WPA-secured wireless network in seconds. The first thing the box did was download an update from Netflix and automatically update itself – nice feature. Once the box was up and running, I linked it to my online Netflix account, and anything in my 'Watch Instantly' queue was available for viewing on my TV. I started watching Blade Runner, and it was almost an hour before I realized that I wasn't watching a DVD – it was actually being streamed live over the Internet. The picture and sound quality is unbelievable, and rewind/fast-forward is decent, with a little time-series strip of scene snapshots to help gauge how far forward or back you're going. The box supports HD, but Netflix doesn't offer HD streams at the moment; I fully anticipate Netflix enabling that feature as it builds up a bigger library of HD-quality on-demand material.

I only have two complaints about the box, and I think one of them will probably be handled in a software update. The first is the lack of a power button – once the box is plugged in, you cannot turn it off. There is no OFF button on the box or the remote, and that's just annoying. There is a little light that's always on; it's not blindingly bright or anything, but I would like to be able to turn it off. The second missing feature is Closed Captioning – I think this is a big miss and a must for me, as I'm often watching movies late at night while my wife and daughter are sleeping. I can live without the power button, but I really want Closed Captioning enabled in the next software release – please!!

In closing, I cannot stress enough how good the picture quality is, and I haven't had a single issue with video glitches, slowness or pauses for buffering. I've watched several long movies along with most of the first season of Heroes without a single problem. I do have a nice broadband connection with 15 Mbps down and 1 Mbps up, but that's fairly standard these days, and Netflix recommends about 3-4 Mbps for the service. The other nice thing about this box is that its use of flash memory means it doesn't need a fan, so it's whisper quiet. I am also excited about the future, as this box runs an embedded Linux OS and Roku has released a lot (if not all) of the code under the GPL. I can't wait for all the mods/patched kernels and apps that are going to surface in the coming weeks and months.


Road Runner vs. U-verse

I am one of those people who hate Time Warner (because of the crappy and recently unreliable service) and can't wait for Verizon FiOS or AT&T U-verse or anything faster to show up in my neighborhood. My dream Internet provider for home would be what people in Europe have – a 50 Mbps connection – but I'll settle for Verizon's 'Faster Plus' service, which claims to provide 15 Mbps download and 15 Mbps upload. But Verizon is rolling this out slowly, and I am not going to get it for a couple of years. AT&T U-verse is my only salvation, as they are gradually offering service in my neighborhood and their Max plan would work for me. U-verse Max offers 10 Mbps download and 1.5 Mbps upload, and that would just rock. Meanwhile, Time Warner has been upping its game in terms of broadband speed (not service or reliability, mind you) and I am currently getting 15 Mbps download and 1 Mbps upload.

Bandwidth Test

I just ran a bandwidth test and discovered that I am truly getting close to 1 Mbps upstream, and that's pretty awesome, as I use Mozy for my remote backup and I also use rsync and Subversion to back up my code and other essential files to my remote (Linux) server. My current thought is to get the AT&T U-verse service and run it side by side with my Road Runner connection for a while to see which one is consistently more reliable and faster. I sure hope it's AT&T, as I would like nothing better than to dump Time Warner.
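For the curious, the rsync half of that backup is a one-liner – a sketch with a hypothetical hostname and paths:

```sh
# Mirror the local code directory to the remote Linux server over SSH.
# -a preserves permissions/timestamps, -z compresses in transit,
# --delete keeps the remote copy an exact mirror of the local one.
rsync -avz --delete -e ssh ~/projects/ backup@myserver.example.com:/backups/projects/
```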

If you’re not using Mozy (or another online backup provider), you should consider getting one!

Website Performance and Optimization

A couple of months ago, I noticed that I was getting pretty close to using up my monthly bandwidth allocation for my server, and that was a surprise. I run several blogs that get quite a few hits, but I didn't think I was anywhere near going over my 250 GB allotment. So I decided to spend a little time figuring out how best to utilize what I had and tune the server to get the most performance out of my little box. Jeff Atwood's wonderful blog entry about Reducing Your Website's Bandwidth Usage inspired me to write about my experience and what I ended up doing to squeeze the most out of my server.

I had already done some of the obvious things that people typically do to minimize traffic to their site. First and foremost was outsourcing my RSS feeds to FeedBurner. I've been using FeedBurner for several years now, after I learned the hard way how badly programmed a lot of the RSS readers out there were. I had to ban several IP addresses that were fetching my full feed every 2 seconds – hopefully that was just a bad configuration on their side, but who knows. Maybe it was an RSS DoS attack :). After taking a little time to see what was using up the bandwidth, I discovered several things that needed immediate attention. The biggest was missing HTTP compression: an Apache or PHP upgrade I did in the past few months had ended up disabling the Apache module for GZIP compression, so all my traffic was going out as uncompressed text. HTTP compression delivers amazing speed enhancements via file-size reduction, and most if not all browsers support it, so I enabled compression for all content of type text/html and for all CSS and JS files.
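For anyone who runs into the same thing, re-enabling gzip is just a couple of lines in httpd.conf – a minimal sketch, assuming Apache 2.x with mod_deflate (the module path varies by distribution):

```apache
# Load the deflate module and gzip HTML, plain text, XML, CSS and JS.
# application/x-javascript was the common JS MIME type at the time.
LoadModule deflate_module modules/mod_deflate.so
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/x-javascript
```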

Some older browsers don't handle compressed JS and CSS files, but anything IE6 or newer seemed to handle JS/CSS compression just fine, and my usage tracking (pictured above) indicated that most of my IE users were on IE6 and above.
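The canonical mod_deflate recipe from the Apache documentation covers those older browsers – a sketch of the usual exclusions:

```apache
# Netscape 4.x only gets compressed HTML; Netscape 4.06-4.08 get nothing compressed;
# MSIE identifies itself as Mozilla/4 but handles compression fine, so undo both rules.
BrowserMatch ^Mozilla/4         gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE             !no-gzip !gzip-only-text/html
```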

Enabling HTTP compression shrank my blog index page by 78% – in other words, only about 22% of the bytes go over the wire, an almost 4.4x reduction. While your mileage may vary, the resulting performance improvement got me onto the Top 20 column at GrabPERF almost every single day.

Another issue I had was the number of images being loaded from my web server. As most of you already know, browsers typically limit themselves to 2 connections per hostname, so if a webpage has 4 CSS files, 2 JS files and 10 images, you are loading a lot of content over those 2 connections. So I used a simple CNAME trick to create image.j2eegeek.com to complement http://www.j2eegeek.com and started serving images from image.j2eegeek.com. That did help, and I considered doing something similar for CSS and JS files, but decided instead to outsource image handling to Amazon's S3.
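The 'trick' is literally one record in the zone file – a sketch, assuming a BIND-style zone for j2eegeek.com (the alias simply points a second hostname at the same server, which doubles the browser's connection budget):

```dns
; In the j2eegeek.com zone: serve images from a second hostname.
image   IN  CNAME   www.j2eegeek.com.
```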

Amazon's S3 or Simple Storage Service is a highly scalable, reliable, fast, inexpensive data storage infrastructure that is fast and relatively inexpensive. S3 allows you to create a 'bucket', which is essentially a folder that must have a globally unique name and cannot have any sub-buckets or directories and so it's basically emulates a flat directory structure. Everything you put in your bucket and make publically available is accessible via http using the URL http://s3.amazonaws.com/bucketname/itemname.png. Amazon's S3 Web Service also allows you to call it using the HTTP Host header and so the URL above would become http://bucketname.s3.amazonaws.com/itemname.png. You can take this further if you have access to your DNS server. In my case, I created a bucket in S3 called s3.j2eegeek.com. I then created a CNAME in my DNS for s3.j2eegeek.com and pointed it to s3.amazonaws.com. And presto – s3.j2eegeek.com resolves to essentially http://s3.amazonaws.com/s3.j2eegeek.com/. I then used John Spurlock's NS3 Manager to get my content onto S3. NS3 Manager is a simple tool (windows only) to transfer files to/from an Amazon S3 storage account, as well as manage existing data. It is an attempt to provide a useful interface for some of the most basic S3 operations: uploading/downloading, managing ACLs, system metadata (e.g. content-type) and user metadata (custom name-value pairs). In my opinion, NS3 Manager is the best tool out there for getting data in and out of S3 and I have used close to 20 web based, browser plug-in and desktop applications.

In addition, I decided to try out a couple of PHP accelerators to see if I could squeeze a little more performance out of my web server. Compile caches are a no-brainer, and I saw a decent performance improvement in my PHP applications. I blogged about this topic in a little more detail (see 'PHP Acceleration – Pick Your Poison' below) if you care about PHP performance.

The last thing I did probably had the biggest impact after enabling HTTP compression: moving my Tomcat application server off my Linux box and onto Amazon's EC2. Amazon's EC2, or Elastic Compute Cloud, is a virtualized computing cloud available for $0.10 per instance-hour. I've been playing around with EC2 for a while now and just started using it for something real. I have tons of notes that I've taken during my experimentation, where I took the stock Fedora Core 4 image from Amazon and turned that server into my Java application server running Tomcat and GlassFish. I also created my own Fedora Core 6 and CentOS 4.4 images and deployed them as my servers. My current AMI running my Java applications is a Fedora Core 6 image, and I am hoping to get RHEL 5.0 deployed in the next few weeks – but all of that will be a topic for another blog.
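For the curious, launching an instance with Amazon's EC2 command-line API tools looks roughly like this – a sketch, with a placeholder AMI ID and key-pair name:

```sh
# Browse Amazon's public AMIs, boot one, then find its public DNS name to ssh in.
ec2-describe-images -o amazon
ec2-run-instances ami-00000000 -k my-keypair
ec2-describe-instances
```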

In conclusion, HTTP compression offered me the biggest reduction in bandwidth utilization, and it is so easy to set up on Apache, IIS or virtually any Java application server that it is almost criminal not to do so. 🙂 Maybe that's overstating it a bit – but there are some really simple ways to optimize your website, and you too can make your site hum and perform like you've got a cluster of servers behind it.

PHP Acceleration – Pick Your Poison

As I deployed more applications and web sites on my server, I started running into resource issues. Since most of the applications I write are in Java, I run Tomcat on my Linux server. But I also run Apache as a front end for Tomcat, as well as for several PHP applications like WordPress, Vanilla and a few others that I've written. I am not an expert PHP developer by any stretch of the imagination, but I tinker with enough PHP that I decided to take a look at PHP acceleration software.
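The Apache-in-front-of-Tomcat piece can be as simple as a couple of mod_proxy directives – a minimal sketch with a hypothetical context path (mod_jk is the other common way to wire Apache to Tomcat):

```apache
# Requires mod_proxy and mod_proxy_http; forwards one context path to Tomcat.
ProxyPass        /apps http://localhost:8080/apps
ProxyPassReverse /apps http://localhost:8080/apps
```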

For the uninitiated, PHP is a scripting language that is interpreted and compiled on the server side. PHP accelerators cache the PHP scripts in their compiled state and perform some optimization along the way. There are several PHP optimization products out there, and I decided to give eAccelerator, XCache and APC a try on my Linux machine. For the record, the box runs CentOS 4.4, which is essentially a repackaged distribution of Red Hat Enterprise Linux 4.x.

  • eAccelerator – eAccelerator is a free open-source PHP accelerator, optimizer, and dynamic content cache. It increases the performance of PHP scripts by caching them in their compiled state, so that the overhead of compiling is almost completely eliminated. It also optimizes scripts to speed up their execution. eAccelerator typically reduces server load and increases the speed of your PHP code by 1-10 times.
  • XCache – XCache is a fast, stable PHP opcode cacher that has been tested and is now running on production servers under high load.
  • APC – The Alternative PHP Cache (APC) is a free and open opcode cache for PHP. It was conceived to provide a free, open, and robust framework for caching and optimizing PHP intermediate code.

I compiled and installed each of these PHP accelerators and found that APC worked best for me. XCache seemed to work well and provides a nice admin application that lets you peek inside the cache to see what's cached, the hit/miss ratio, etc. eAccelerator also seemed to work well and offered a great performance boost, but it caused segmentation faults that made the Apache web server unusable. It could have been bad PHP code causing the segfaults, but I didn't really spend any time getting to the root cause. APC just worked, pretty much like XCache, but seemed to offer slightly better performance. Now, I didn't perform any empirical testing here – I simply relied on my website monitor, GrabPERF, as I ran each PHP extension for a few days. Your mileage may vary based on your server architecture, application, lunar phase, etc., but APC worked the best for me.
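For reference, turning APC on is just a few php.ini lines once the extension is compiled – a sketch with illustrative values you would tune for your own workload:

```ini
; Enable the APC opcode cache.
extension=apc.so
apc.enabled=1
apc.shm_size=64   ; shared-memory cache size in MB (an integer in older APC releases)
apc.stat=1        ; re-check file mtimes so edited scripts get recompiled
```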

Put Your Linksys Router on Steroids

This is something I have been meaning to do for years, but I finally took advantage of the Christmas break to put my Linksys wireless router (WRT54G) on steroids. Since I was upgrading my Windows machine from XP to Vista and my Linux machine from Dapper to Edgy (Ubuntu), I figured why not break – I mean upgrade – everything.

First a little background: Linksys used Linux as the OS for a number of its network products, including the ubiquitous WRT54G router. When Cisco acquired Linksys in 2003, they were forced to release the Linksys code because of the GPL. This led people to create updated versions of the firmware for these routers, and soon features were being added to the $60 router that were otherwise available only in network devices costing a lot more. Linksys (and Cisco) continued to make these Linux-based routers for a while and then switched to VxWorks, a proprietary real-time OS, which removed the requirement for Cisco to release their software to the open-source community.

I had been thinking about upgrading my existing Linksys router to one with Gigabit ports anyway, so reflashing it and potentially turning it into a brick didn't seem like that big a deal. In fact, a part of me was hoping the upgrade wouldn't work, so that I would have an excuse to replace a perfectly working router with one with additional goodies. There are a lot of different firmware packages out there for the Linksys routers, but I decided on DD-WRT because of its features. I wanted to add WPA/WPA2, QoS and the ability to boost the radio transmission power. The default Xmit power is 28 mW, and I bumped mine up to 70 mW as the DD-WRT documentation suggests, and I noticed a HUGE improvement in my wireless performance. Before the upgrade, the signal was really weak at the other end of our house, but now I get a perfect connection with really awesome throughput. In fact, the signal strength was so high that I had to switch to another channel to let my neighbors' wireless routers and phones work. The enhanced security was also a nice bonus. Other features, like the ability to run a wireless hotspot, don't interest me, but the ability to VPN in really does. I haven't had a chance to use that yet, as I typically use an SSH tunnel to set up a proxy for securely accessing resources when I am on a public network, but it's a nice feature to have if you need security or are just paranoid about open/free/public networks. (As you should be.)

To me, the coolest thing was the ability to SSH into my wireless router and browse the directory structure. The DD-WRT upgrade turned my router into an SSH server, so I can SSH into it to check out the configuration, or even SSH out from the router itself.
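Both of those are one-liners – a sketch, assuming the router's default LAN address and a hypothetical home server for the SOCKS tunnel I mentioned above:

```sh
# DD-WRT runs an SSH daemon (dropbear); log in as root at the default LAN address.
ssh root@192.168.1.1

# The SSH-tunnel proxy trick: a local SOCKS proxy on port 1080, no remote command.
# Point the browser's SOCKS setting at localhost:1080 while on a public network.
ssh -D 1080 -N me@home.example.com
```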

Here are some screenshots taken from the interface. Before you decide to upgrade your router, please remember that there are no warranties – you could end up with a $60 brick.

Red Hat to Oracle (and market) – Oh No You Didn’t!

Red Hat shares jumped about 25% after the company reported a quarterly profit and outlook that topped Wall Street forecasts, and the stock kept climbing in after-hours trading. Kudos to Red Hat, whose stock had been hammered after Oracle announced at Oracle OpenWorld that it would redistribute Red Hat's Linux under the name Oracle Unbreakable Linux and offer complete support for less than Red Hat charges. Here is Red Hat's stock chart over the last 3 months.

Hopefully Red Hat will send Larry Ellison one of those cool Unfakeable Linux T-shirts 🙂

I love the Tablet PC – And Ubuntu runs on it

As I've blogged before, I am searching for a new computer and have decided to get both a laptop and a desktop to meet all of my needs. I will probably end up using a Mirra or something similar (NAS) to keep my machines in sync. On the laptop side, I've been toying with the idea of getting a convertible Tablet PC. To me, a convertible Tablet gives you the best of both worlds – it's a laptop that has all of the functionality of a traditional laptop and yet can convert to a slate Tablet when needed.

I wanted to try out a Tablet PC before buying one, so my brother was kind enough to loan me his Toshiba M4 Tablet – it only took 3 months of begging, nagging, threats and the other usual incentives to finally get it. 🙂 To annoy me, he had installed the first beta of Vista Tablet on the machine, which made it pretty much useless. In Vista's defense, this was the first beta of Vista Tablet and the installer was my brother. I don't think I need to say any more. 😉

So I installed Windows XP Tablet PC Edition to really see what the magic is all about, and I am completely in love. While I used the Tablet in conventional laptop mode most of the time, I loved sitting on my couch reading blogs with the pen. While I haven't tried it yet, I think reading an eBook on the Tablet would work really well; the mobility and folding form factor would make it ideal for reading on the couch or in bed. My only hesitation is trying to figure out whether I should jump now or wait for the dual-core Tablet PCs to ship.

On a whim, and with no football on TV to suck up my time, I decided to install Ubuntu on the Tablet PC. The word Ubuntu comes from an African word meaning 'humanity to others', and it is a freely available Linux-based operating system with both community and professional support. Ubuntu is very easy to install and use, and I am always amazed at how smooth the install process is and just how usable it is as a client machine. On the Toshiba M4, I just booted with the Ubuntu install CD in the drive, answered a few simple questions about disk partitioning, and the installer went away and installed the OS. While there is no support for Tablet-specific functionality in Ubuntu at the moment, Ubuntu worked like a charm with the M4 as a traditional laptop. I shouldn't be, but I continue to be amazed at just how easy it is to use Linux on the desktop.

Screenshot of the desktop

I haven't been a supporter of Linux on the desktop, as all the attempts in the past never passed my 'parents test' – could I install Red Hat or Slackware or Debian or any other Linux distribution on my parents' computer and leave them alone with it? I never thought so. Granted, they can't fix all the Windows issues they run into, but there are a lot more people who can help them with Windows than with Linux. And I consider myself a Linux guy – I've been running Linux in one form or another since 1991, when I built my first Linux server at Marquette University running a v0.9x kernel as part of the SLS distribution. A few years after that, I ran the Marquette University web server on my personal Linux box (a 40 MHz 386) before people 'got it' and officially started supporting my efforts. In fact, I have introduced Linux at EVERY single company I've worked for since those early days – with great success, I might add. 🙂 So it's great to finally see a Linux distribution that's usable, pretty and that rocks.

By the way, I took some pictures of the computer and have them up on Flickr. Check out the Tablet PC set or view them as a slideshow.


Macs and Java

Over the past 2 years, I have noticed an interesting trend in the Java developer community: a move to the Apple PowerBook as the laptop of choice, away from any Wintel or Lintel alternatives. Are Macs better laptops, or is this just a 'follow the crowd' mentality driving people to buy them? Is the move to Macs influenced by the UNIX OS under the covers, or is it pure 'I hate Microsoft' sentiment?

I've been a proud iPod user for about 2 months now, so I am drinking a little of that Apple Kool-Aid, but I'm not sure I'm ready to give up totally on XP. I am a Linux user and run it on my other box, which acts as my WebLogic, JBoss, MySQL and CVS server, but I still run IDEA on XP rather than on Linux. Maybe I just need to try out the new 17-inch PowerBook to see if I give in to the allure of OS X and Apple.

The new PowerBooks are impressive. Why can't Dell, Toshiba and Compaq take a page from Apple's book and create some sleek-looking, ergonomically usable laptops instead of the standard clunky old boxes? Oh well, I guess I'll need to head down to the Apple store to see if I'm just missing something or whether this is the latest fad.