I’ve had an excellent setup using MythTV and a VIA M10000-based system. The M10000 is the frontend; it has a 1 GHz VIA chip and onboard hardware support for MPEG-2 decoding. When watching TV shows that I recorded using an analog tuner card, the system showed about 10% CPU utilization! It was perfect…until digital TV arrived.
Actually, the M10000 is fine for digital television as long as the resolution stays in the standard-definition range. 720p and 1080i content, however, it cannot handle in real time. The hardware MPEG-2 decoder is limited to a maximum resolution that HDTV broadcasts exceed, and the onboard CPU isn’t nearly powerful enough to decode that content in real time on its own.
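A rough sketch of the situation: the exact resolution limit is an assumption on my part (the M10000’s CLE266 chipset MPEG-2 engine is built around DVD-class streams, so I’m using 720×576 as the cutoff), but it shows why SD broadcasts get hardware decoding while HD falls back to the CPU.

```python
# Assumed maximum frame size for the M10000's hardware MPEG-2 decoder.
# The CLE266 is designed around DVD-class streams, so 720x576 is a
# plausible (but assumed) ceiling.
HW_DECODE_LIMIT = (720, 576)

def hw_decodable(width, height, limit=HW_DECODE_LIMIT):
    """Return True if a frame of this size fits the hardware decoder."""
    return width <= limit[0] and height <= limit[1]

broadcasts = {
    "480i SD":  (720, 480),
    "576i SD":  (720, 576),
    "720p HD":  (1280, 720),
    "1080i HD": (1920, 1080),
}

for name, (w, h) in broadcasts.items():
    path = "hardware" if hw_decodable(w, h) else "CPU only"
    print(f"{name}: {path}")
```

Under that assumption, both SD formats land on the hardware path and both HD formats fall through to the CPU, which is exactly the behavior I was seeing.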
I decided to move my desktop system, a 2 GHz Pentium M-based SD11G5 from Shuttle, downstairs to be my new Myth frontend. At the same time I decided to drop the Myth frontend software and use XBMC instead. I ran some tests and it seemed like it would work perfectly.
Once I got everything set up, I found that things did not work perfectly. My testing was with 1080i broadcasts, which I assumed would be more taxing on the system than 720p broadcasts. I was wrong. The system handles 1080i just fine but cannot keep up with 720p. This is most likely due to the frame rate: 720p broadcasts deliver 60 full progressive frames per second, while 1080i delivers only 30.
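The load difference can be put in rough numbers using the standard ATSC rates: 1080i is 30 interlaced frames (60 fields) per second, while 720p is 60 full progressive frames per second. The pixel throughput comes out similar, but 720p means twice as many complete frames to decode every second, and per-frame work is one plausible explanation for why it's the harder format here.

```python
# Back-of-the-envelope decode-load comparison using standard ATSC rates.
formats = {
    "1080i": {"width": 1920, "height": 1080, "frames_per_sec": 30},  # 60 fields/s
    "720p":  {"width": 1280, "height": 720,  "frames_per_sec": 60},
}

for name, f in formats.items():
    pixels_per_sec = f["width"] * f["height"] * f["frames_per_sec"]
    print(f"{name}: {f['frames_per_sec']} full frames/s, "
          f"{pixels_per_sec / 1e6:.1f} Mpixels/s")
```

This prints roughly 62.2 Mpixels/s for 1080i and 55.3 Mpixels/s for 720p: 720p actually pushes slightly fewer pixels, so raw pixel count alone doesn't explain the difference, but it decodes twice as many full frames per second.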
I now have a dual-core Atom system with an NVIDIA video card. Using XBMC with VDPAU support, I should be able to decode anything I throw at the system. I’ve built the system but haven’t set up XBMC on it yet. I’ll report back when I get that fully functioning.