Given that the Cell chip is the god of all chips (being slightly sarcastic, but that's what the media has been saying), why wouldn't Sony be able to create a software H.264 decoder? Who cares if it doesn't have hardware support? Doesn't the new QuickTime support H.264? I thought I had downloaded a demo video recently of an H.264-encoded HD feed. It ran decently on my oldish, middle-of-the-road computer, so I'm sure the PS3 won't have an issue with it.
The real problem is that the studios have not defined what HD format they will release their content in. A number of criteria (resolution: 720p vs. 1080i; entropy coding method: CAVLC vs. CABAC; overall bitrate) determine just how much compute power you will need to decode the HD stream. The issue is that HD streams can require an order of magnitude more storage space than conventional 480p (DVD resolution), but the new media does not offer an order of magnitude more storage than today's dual-layer DVDs. Consequently, the studios really need to crank up the compression to get the storage requirements down.
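To put rough numbers on that squeeze, here's a quick back-of-the-envelope calculation. The bitrates, runtime, and disc capacities below are illustrative assumptions for the sake of the math, not official figures:

```python
# Back-of-the-envelope storage math for the compression squeeze described
# above. All bitrates and disc capacities here are illustrative
# assumptions, not official specs.

def stream_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size in gigabytes of a stream at the given average bitrate."""
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

MOVIE_MINUTES = 120     # a typical feature film
DVD_DL_GB = 8.5         # dual-layer DVD capacity
BLURAY_SL_GB = 25.0     # single-layer Blu-ray capacity (assumed)

# A DVD movie averaging ~5 Mbps of MPEG-2 at 480p fits comfortably:
print(stream_size_gb(5, MOVIE_MINUTES))    # roughly 4.5 GB

# But a 30 Mbps 1080i stream blows past even single-layer Blu-ray:
print(stream_size_gb(30, MOVIE_MINUTES))   # roughly 27 GB
```

So the disc capacity grows by a factor of ~3, while the raw HD data rate grows by ~6x or more, and that gap is what H.264's heavier compression has to close.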
So yes, QT7 supports H.264 and does an admirable job of it (particularly on a Mac, where it is highly optimized), but if you throw a 1080i 30Mbps CABAC stream at a top-of-the-line dual-CPU G5, it will just barely be able to decode it in realtime. The Pentium 4 is even worse: you'd need a dual-core CPU clocked at around 4.5GHz (which doesn't exist) to accomplish the same. So yeah, the stuff that's encoded for downloading and playing on joe-schmoe computers will be very low bitrate, use CAVLC, and perhaps be limited to 720p. That's easy. Now throw a hard stream at it, and it'll bring your PC to its knees. To make matters worse, stuff you're downloading off the net doesn't have any of the AACS encryption or the multi-plane compositing requirements that Blu-ray and HD-DVD movies will have.
The bottom line is that the studios can make it pretty much impossible to decode their content on today's conventional PCs without a dedicated decoder. Will they go that far? They might. They've never liked the idea of PCs playing back DVDs (because they are inherently less secure than a consumer-electronics player), and they are even more freaked out at the thought of their HD content being pirated. To them it makes a lot of sense to make it hard to play back their stuff on PCs. Plus, their bigger incentive to crank up the compression is to fit the same level of content you expect on a DVD onto the new-format discs (i.e., trailers, featurettes, bonus features, etc.).
Now, going back to whether the Cell can do the H.264 decode in software: for less taxing streams, sure. For the kind of stuff I'm talking about above, it's not so clear. One of the big factors is the CABAC entropy decode, which can only be done on the master CPU in the Cell structure. I'd say it might be possible to pull off in software, but it will require some masterful programming and use of the SIMD units, as well as leveraging the GPU.
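To illustrate why CABAC is stuck on the master CPU: arithmetic decoding is a serial chain, where each decoded bin updates the coder's interval and the next bin can't start until that update lands. Here's a toy sketch of that dependency (this is a simplified binary arithmetic decoder with a fixed symbol probability, not real H.264 CABAC, which adds context modeling and renormalization):

```python
# Highly simplified sketch of why CABAC entropy decoding resists SIMD or
# multi-core execution: each decoded bin narrows the arithmetic coder's
# interval, and the next bin depends on the narrowed interval. This is a
# toy decoder with a fixed probability, NOT real H.264 CABAC (no context
# models, no renormalization).

def decode_bins(bitstream_value: int, num_bins: int, p_zero: float = 0.6):
    low, high = 0, 1 << 16
    bins = []
    for _ in range(num_bins):
        split = low + int((high - low) * p_zero)
        if bitstream_value < split:
            bins.append(0)
            high = split   # the next iteration depends on this update...
        else:
            bins.append(1)
            low = split    # ...or this one: an inherently serial chain
    return bins
```

There's no way to hand bin N+1 to a SIMD lane while bin N is still in flight, which is exactly the kind of branchy, serial integer work the Cell's master core has to absorb on its own.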
I think the reason Sony said they were sticking to MPEG-2 for HD video (and I'm pretty sure sticking with MPEG-2 doesn't prevent you from having HD resolutions) is that they believe they can get better video quality from it than from some MPEG-4 variant that hasn't really been fully worked out yet.
Right, MPEG-2 can support 1080i and such as well (in fact, that's what's being used for HDTV broadcasts on satellite/cable). But MPEG-4 has been fully worked out, and you can get better compression with it than with MPEG-2. In fact, H.264 is MPEG-4 Part 10.
If you have huge storage space, as Blu-ray does, you don't have to worry all that much about compressing the data as much as you possibly can.
As I noted above, the problem is that HD video can take up way more space than standard DVDs, but the storage medium hasn't grown as quickly. Hence the need for H.264 -- otherwise they would have just stuck with MPEG-2 and been done with it.
Also, was the PS3 originally not supposed to have a GPU? I'd be surprised if that were the case. I've read that the Cell chip is challenging to program for because of the multiple cores it has. I'm sure that will become less of a problem as developers grow more accustomed to it and better tools come out. Even the 360's CPU has three cores, each capable of handling two threads simultaneously.
Yeah, Sony had originally anticipated that the Cell SIMD units would render all the graphics and that a dedicated GPU wouldn't be required. Guess they were wrong.
The problem with the multi-core thing on the Cell is that it's not the same as the 360's CPU. On the Cell you have one main in-order master CPU and a bunch of slave SIMD CPUs. The code you run on those has to be inherently parallelizable, and there are all kinds of nasty issues with how you DMA the data to and from each of those cores. This is very different from a traditional multi-core CPU that can run multiple generic threads. Bottom line: it's very powerful, but limited to a certain class of algorithms and seriously hard to code for.
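To make the programming-model difference concrete, here's a conceptual sketch in Python, purely as an analogy (real SPE code is C with DMA intrinsics, and the chunk size here is an assumption). The slave cores can't just chase pointers through shared memory the way a generic thread can; the programmer explicitly stages fixed-size chunks into each core's small local store, runs the kernel, and stages results back out:

```python
# Conceptual analogy (in Python, not real Cell code) for the explicit
# staged-DMA style described above. An SPE-like worker sees only a small
# local store, so data must be copied in, processed, and copied out in
# fixed-size chunks -- which is why only inherently parallelizable,
# streaming-friendly algorithms map well onto it.

LOCAL_STORE_BYTES = 256 * 1024   # each slave core's local store (assumed size)

def spe_style_process(data: bytes, chunk: int = 16 * 1024) -> bytes:
    assert chunk <= LOCAL_STORE_BYTES, "chunk must fit in the local store"
    out = bytearray()
    for i in range(0, len(data), chunk):
        local = bytes(data[i:i + chunk])            # "DMA in": explicit copy
        processed = bytes(b ^ 0xFF for b in local)  # kernel runs only on local data
        out += processed                            # "DMA out": explicit copy back
    return bytes(out)
```

The kernel here (a byte-wise invert) is trivially data-parallel, so it maps cleanly onto this model; an algorithm with irregular, global memory access would not, and that's the class of code that stays stuck on the master CPU.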
Plus the initial games aren't even that good; they're a batch of very rushed titles. I'm not a Sony fanboy (I think Sony as a company has been lousy lately), but I think the PS3 is gonna own the 360 when it comes out.
Yeah, but most consoles have this issue when they come out (other than Nintendo's). The initial titles suck, and over time the developers get better at understanding the system's limitations, working around them, and extracting better performance. Most early titles are developed on very early developer boxes that don't even properly represent the final hardware. I think it's a bad idea to judge what the console is capable of based on the current slew of games; give them a few months at least. The same will hold true for the PS3.