The difference in quality between progressive and interlaced display setups depends just as much (or perhaps more) on how the original scene was captured as on how it is displayed on your TV (i.e. the camera format versus the final display path). Material originally captured in an interlaced format will probably look better played back on an interlaced display system than on a progressive one.
Also, if the original was captured in a progressive format, it may be difficult to tell the difference between progressive and interlaced playback systems (since, ideally, each full frame of the interlaced playback would look exactly like the corresponding progressive frame -- it would, in effect, be a progressive image displayed in a field-based manner), as the sketch below illustrates.
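To make that concrete, here's a toy sketch (in Python with NumPy; the function names are my own, not any particular library's API) showing that two fields taken from the same progressive frame can be "woven" back together losslessly:

```python
import numpy as np

def split_into_fields(frame):
    """Split a progressive frame into its two interlaced fields.

    The top field holds the even scan lines and the bottom field the
    odd ones -- the usual convention for interlaced video.
    """
    return frame[0::2], frame[1::2]

def weave(top_field, bottom_field):
    """Recombine two fields into one full frame (a 'weave' deinterlace)."""
    frame = np.empty((top_field.shape[0] + bottom_field.shape[0],
                      *top_field.shape[1:]), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# A fake 1080-line progressive frame of random pixel values.
progressive = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

top, bottom = split_into_fields(progressive)
rebuilt = weave(top, bottom)

# Because both fields came from the same instant in time, weaving them
# back together reproduces the progressive frame exactly.
assert np.array_equal(progressive, rebuilt)
```

With a true interlaced capture, by contrast, the two fields come from different instants in time, which is exactly where the two display paths start to diverge.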
The comments about frame rates are also a bit misleading, since in the U.S. only 720p broadcasts carry true 60 frames-per-second data (broadcast 1080 HD is usually restricted to 60 fields-per-second interlaced, although 30 frames-per-second progressive is possible).
Then, the situation becomes even more complex when you consider the frame rate of the original progressive capture (was it 24p, 25p, 50p, or 60p?). For example, progressive DVD playback of 24 frames-per-second, film-based material is output at 60 frames-per-second because you can quite easily use a 3:2 cadence to convert the 24 frames-per-second source to a 60 frames-per-second display format. Thus, for a DVD you simply show one progressive frame for three output frames, then the next frame for two output frames, which creates a 60 frames-per-second output from a 24 frames-per-second source.
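If it helps, here's a minimal sketch of that 3:2 cadence (plain Python; the function name is just illustrative):

```python
from itertools import cycle

def three_two_pulldown(frames_24p):
    """Expand a 24 fps frame sequence to 60 fps using a 3:2 cadence.

    Each pair of source frames becomes five output frames: the first
    frame is shown three times, the second twice (3 + 2 = 5, and
    24 fps * 5/2 = 60 fps).
    """
    output = []
    for frame, repeats in zip(frames_24p, cycle((3, 2))):
        output.extend([frame] * repeats)
    return output

source = list(range(24))            # one second of 24p: frames 0..23
display = three_two_pulldown(source)
assert len(display) == 60           # one second of 60 fps output
print(display[:10])                 # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```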
Think of it this way: two frames from the original film-based source represent 2/24ths of a second of the original capture (or 1/12th of a second). If you expand those two frames using the aforementioned 3:2 cadence, you produce five output frames that occupy that same time period at 60 frames-per-second (i.e. 5/60ths, again 1/12th of a second). This type of conversion was pretty much a requirement when TVs used analog CRTs (which had a "natural" 60 Hz time base), but now that we have digital LCD TVs the effective display rates can vary widely, from a "true" 24 frames-per-second for film-based material all the way up to 120 or 240 frames per second, which conveniently are integer multiples of both 24 and 30.
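You can sanity-check that last point with a few lines of arithmetic (again, just illustrative Python):

```python
# Check which common panel refresh rates divide evenly by 24 fps (film)
# and 30 fps (video). An even division means each source frame can be
# held for a whole number of refreshes -- no 3:2 judder needed.
for refresh in (60, 120, 240):
    for source_fps in (24, 30):
        repeats = refresh / source_fps
        even = repeats.is_integer()
        print(f"{source_fps} fps on a {refresh} Hz panel: "
              f"{repeats} refreshes per frame ({'even' if even else 'uneven'})")
```

Only the 60 Hz / 24 fps combination comes out uneven (2.5 refreshes per frame), which is why that pairing needs the 3:2 cadence in the first place.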
So, to answer your question about which will look better (1080i or 1080p): it depends on a number of factors, YMMV, and in some cases (under typical viewing conditions) it doesn't even matter.