Nice information there.
So, basically, the more updated SDK is more flexible: it can output 1280 x 720p, or it can output that same information as 1080i by using a 960 x 1080 framebuffer instead of 1920 x 1080 or 1920 x 540.
Because if games were developed at 1280 x 720p and then output at 1920 x 1080i, it would take more than twice the pixels to do so (2.25x, actually), and that extra fill rate could slow games down. And if they output 1920 x 540p, which has about the same fill rate as 1280 x 720p, that runs into problems with HDTVs, since 540p gets upconverted to 1080i. That means you only get 540 real lines interpolated into 1080 lines. You do get the full 1920 pixels of horizontal resolution, but I guess many 1080i sets don't resolve the full 1920 anyway.
So it's better to use 960 x 1080i instead of 1920 x 540p. Both methods have about the same pixel fill rate as 1280 x 720p, so it shouldn't cost a performance hit the way outputting full 1920 x 1080i or 1920 x 1080p would. I'm talking about pixel fill rate here, not the frame rate of the interlaced or progressive signal.
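If anyone wants to sanity-check the fill-rate argument, here's a quick back-of-the-envelope script. It's just raw pixel counts per rendered buffer (nothing to do with any actual SDK or GPU), but it shows why 960 x 1080 and 1920 x 540 land in the same ballpark as 720p while full 1920 x 1080 doesn't:

```python
# Rough pixel-count comparison for the output modes discussed above.
# "Pixels" here is just width x height of the buffer the game renders,
# used as a crude proxy for fill-rate cost.

modes = {
    "1280 x 720  (progressive frame)":   1280 * 720,
    "1920 x 540  (one 1080i field)":     1920 * 540,
    "960 x 1080  (half-width 1080 buffer)": 960 * 1080,
    "1920 x 1080 (full frame)":          1920 * 1080,
}

baseline = modes["1280 x 720  (progressive frame)"]

for name, pixels in modes.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x of 720p)")
```

Running it gives roughly 1.12x of 720p for both the 1920 x 540 and 960 x 1080 buffers, versus 2.25x for full 1920 x 1080, which is where the "about the same fill rate" claim comes from.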
Good thing is my 1080i HDTV can only resolve about 900 x 1080i, so games outputting 960 x 1080i instead of 1280 x 720p will show about the same amount of detail on my set. Also, my 1080i HDTV can upscale 720p to 1080i internally, so it's all good for me when playing games that can only output 720p at their maximum resolution.