An LCD is a fixed-pixel display. In other words, the display has only one output resolution; most are 1280x720 or 1920x1080. The interlaced (i) and progressive (p) designations are holdovers from the CRT scanning method of display. Input sources are still interlaced or progressive, but the LCD TV will convert them to its native, or fixed, resolution. When the specs for an LCD say 1080i, they are referring to an input capability, not a display capability.
Since LCD displays are inherently progressive, any interlaced input will be de-interlaced:
When an LCD with a 1280x720 resolution receives a 1080i input signal, the TV will de-interlace the signal and scale it to 1280x720. You do reduce the resolution when this happens; how much it affects the picture will depend on the quality and complexity of the source, and on the quality of the de-interlacing and scaling in the TV.
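To put a number on that resolution reduction, here is a quick sketch of the pixel arithmetic (the percentages are just the math; how visible the loss is still depends on the source and the TV's processing):

```python
# Rough arithmetic for what a 1280x720 panel does with a 1080i input:
# de-interlace to a 1920x1080 frame, then scale down to the native resolution.

native_w, native_h = 1280, 720      # fixed panel resolution
source_w, source_h = 1920, 1080     # de-interlaced 1080i frame

source_pixels = source_w * source_h          # 2,073,600
native_pixels = native_w * native_h          # 921,600

scale = native_w / source_w                  # 2/3 in each dimension
retained = native_pixels / source_pixels     # fraction of source pixels kept

print(f"Scale factor per axis: {scale:.3f}")         # 0.667
print(f"Pixels displayed vs. source: {retained:.1%}")  # 44.4%
```

So a 1280x720 panel keeps a little under half of the pixels in a 1080i source frame.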
When an LCD with a 1920x1080 display receives a 1080i signal, the TV will de-interlace the signal, but it does not need to scale it, so you will see the full 1920x1080 of information from the source. If the input signal is 1080p, the TV does not need to de-interlace it and can, but does not always, display the signal without any additional processing.
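The simplest form of de-interlacing, "weave", just interleaves the two fields of a 1080i frame back into one progressive frame. Real TVs use more sophisticated motion-adaptive methods, but this sketch shows why no scaling is needed on a 1920x1080 panel; the line counts work out exactly:

```python
# "Weave" de-interlace sketch: a 1080i frame arrives as two 540-line
# fields (odd scan lines in one, even in the other); interleaving them
# rebuilds the full 1080-line progressive frame.

def weave(top_field, bottom_field):
    """Interleave two fields (lists of scan lines) into one frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # lines 0, 2, 4, ...
        frame.append(bottom_line)  # lines 1, 3, 5, ...
    return frame

# Each field carries half the vertical resolution of the frame.
top = [f"line {2 * i}" for i in range(540)]
bottom = [f"line {2 * i + 1}" for i in range(540)]

frame = weave(top, bottom)
print(len(frame))   # 1080 lines: full vertical resolution, no scaling needed
```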
There are several generally available sources of high-definition material:
1) HDTV (cable, satellite, and over the air) - these will be 1080i or 720p.
2) HD-DVD or Blu-ray players - these can be 720p, 1080i, or 1080p.
3) Upscaling DVD players - these can be 720p, 1080i, or 1080p. These players are, of course, just scaling the 480i content from the DVD.
There has been some discussion about the difference between using a 1080i and a 1080p input with a 1920x1080 LCD, or DLP, TV. Some say there will not be any difference in the picture; some say there will. Let's take a Blu-ray player as an example. The Blu-ray disc is (usually) encoded in 1080p at 24 fps. Most Blu-ray players (probably all) will convert the 1080p/24 to 1080i/60 and then either send the 1080i signal to the TV, or de-interlace the 1080i signal and send the resulting 1080p signal to the TV. Since a 1920x1080 LCD will always display in 1080p mode, the TV will de-interlace the 1080i signal and display it, or, if it receives the 1080p signal, display it without de-interlacing. In either case we are displaying a 1080p signal.

So, if there is a difference in the picture we see with the 1080i and 1080p inputs, it would seem to be related to differences in the quality of the de-interlacing done by the Blu-ray player and by the TV. The de-interlacing process is not a minor issue. In the few comparisons I've done, the only time I could tell a difference was when I used an outboard processor, and even then the differences were minor.
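The 1080p/24 to 1080i/60 conversion mentioned above is normally done with 2:3 pulldown: film frames are alternately held for two fields and three fields, so every four frames become ten fields and 24 fps becomes 60 fields per second. A minimal sketch of that cadence:

```python
# Sketch of 2:3 pulldown, the usual way 1080p/24 film frames become a
# 1080i/60 stream: frames are alternately held for 2 fields and 3 fields,
# so every 4 film frames yield 10 video fields (24 fps -> 60 fields/s).

from itertools import cycle

def pulldown_23(num_frames):
    """Return the film-frame index carried by each interlaced field."""
    fields = []
    hold = cycle([2, 3])            # alternate 2-field and 3-field holds
    for frame in range(num_frames):
        fields.extend([frame] * next(hold))
    return fields

fields = pulldown_23(24)            # one second of film
print(len(fields))                  # 60 fields -> 1080i/60
```

A good de-interlacer (in the player or the TV) recognizes this cadence and reassembles the original film frames, which is why the quality of that step, not the 1080i vs. 1080p label on the cable, is what actually matters.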
In summary, if you get a high-quality LCD you will be tickled pink with a 1280x720 display and, likely, tickled red with a 1920x1080 one.