It seems not everyone believes me when I say that, for filmed content (24fps), 30i gives the same end result as 30p. Well, check out the latest review of the Samsung BD-P1000 player for a bit of a hint.
The review very clearly states (and you can confirm for yourself if you search the web) that the Samsung's "True High Definition" 1080p output is produced by, wait for it, decoding at 1080i in one chip and de-interlacing back to 1080p in another chip... which is exactly what your TV will do with the Toshiba output! They even use exactly the same 1080i decoder chip that the Toshiba HD DVD players use; the only difference is that the wire between the two chips is longer :-)
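To see why this "decode at 1080i, re-weave to 1080p" pipeline loses nothing for film content, here's a toy sketch (illustrative Python, not actual decoder code): a progressive frame is split into two fields of alternating scan lines, and a weave de-interlacer simply interleaves them back.

```python
# Toy progressive "frame": 6 scan lines of 4 pixels each.
frame = [[row * 10 + col for col in range(4)] for row in range(6)]

# "Encode" as interlaced: split into two fields of alternating lines.
top_field = frame[0::2]     # even scan lines
bottom_field = frame[1::2]  # odd scan lines

# "Decode": weave the two fields back together.
rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

assert rebuilt == frame  # bit-for-bit identical; nothing was lost
```

Because both fields come from the same instant in time (the same film frame), weaving them back is exact; it's only when the two fields were captured at different moments that de-interlacing has to guess.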
You can also find comments from reviewers that switching the Samsung from 1080p to 1080i produced no difference in visual quality... surprise, surprise (or not).
Don't let the "True High Definition" marketing fool you -- there is no difference for motion picture content. Sony even calls their 1080i equipment "true high definition", so it must be true (check the HDV Recording Format section).
The place where interlaced versus progressive really matters is when you are capturing footage. Footage captured at 30i (60 fields per second) will not look as good as footage captured at 60p, and will look "differently bad" compared to footage captured at 30p (the 30i might flicker and the 30p might be blurry).
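A quick way to see the capture-time trade-off (illustrative numbers only, not a claim about any particular camera): 30i samples motion as often as 60p but at half the vertical resolution per sample, while 30p gets full frames but samples motion half as often.

```python
# Temporal sample times over one second of capture.
fields_30i = [n / 60 for n in range(60)]   # 60 half-height fields
frames_30p = [n / 30 for n in range(30)]   # 30 full-height frames
frames_60p = [n / 60 for n in range(60)]   # 60 full-height frames

# 30i matches 60p's motion sampling rate, but each sample carries
# only half the vertical detail; 30p keeps full detail but samples
# motion half as often -- hence "differently bad".
assert len(fields_30i) == len(frames_60p) == 2 * len(frames_30p)
```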
But for 24p filmed content being played back on a TV from a 1080i or 1080p source, there's no difference.
Oh, and the review of Blu-ray isn't very favourable... :-)
[Jim: Sorry if this seems like too much of a "marketing" post, but I think educating people about the facts around interlaced output is important and can save consumers a LOT of money]