Why not just go ahead and try it and let us know? I actually haven't done any of this high-def stuff (at least not Blu-ray, anyway). If I had to guess, I would think the HDV will be re-encoded.
If I use video s/w such as Adobe Premiere Elements or CyberLink PowerDirector to get the HDV from my miniDV Canon high-def camcorder to my PC via FireWire 400, I should get an M2T file (an MPEG-2 transport stream) w/ no loss at all, i.e. no re-encoding, right? If I then use one of these packages to author it to a Blu-ray disc, which encoding will result in the least loss of video quality? I read that Blu-ray players need to support MPEG-2, MPEG-4 AVC, and VC-1, and that studios used MPEG-2 for the first wave of Blu-ray discs but now use MPEG-4 AVC or VC-1. Since HDV's M2T stream already uses MPEG-2 for video, I assume the video can go to disc w/ no re-encoding, and only the MP2 audio stream needs to be re-encoded to AC-3. Is this true?
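One way to check this for yourself (my suggestion, not something the editing packages require) is with the free ffprobe/ffmpeg tools: inspect the capture to confirm which codecs are actually in it, then remux it with the video stream copied untouched and only the audio transcoded. The file names here are hypothetical.

```shell
# 1. Confirm what is actually in the capture: an HDV transport stream
#    should report mpeg2video for the video and mp2 for the audio.
ffprobe -hide_banner capture.m2t

# 2. Stream-copy the video (no re-encode, so no generation loss) and
#    convert only the audio to AC-3 at 448 kbit/s.
ffmpeg -i capture.m2t -c:v copy -c:a ac3 -b:a 448k output.m2ts
```

Whether a given authoring package does the equivalent of `-c:v copy` (often called "smart rendering") is exactly the question to put to its vendor.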
If I take snippets of 3 HDV (M2T) videos and combine them into a new video on my computer, I assume the video of each just gets appended to the others, without needing re-encoding, correct? I'm also assuming that a computer from Dell w/ a dual-core Intel chip or better will have the necessary bus, video card, memory (8GB), and hard-drive speed to handle the editing? And I assume I don't need a video processing/capture card, given that those are only needed for capturing analog sources?
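Joining clips without re-encoding only works when all the clips share the same codec, resolution, and frame rate, which three captures from the same HDV camera will. As a hypothetical command-line sketch of the same idea (again my suggestion, with made-up file names), ffmpeg's concat demuxer does a cuts-only join with every stream copied:

```shell
# List the clips to join, in order, in a small text file.
cat > clips.txt <<'EOF'
file 'clip1.m2t'
file 'clip2.m2t'
file 'clip3.m2t'
EOF

# -c copy appends the streams without decoding or re-encoding them.
ffmpeg -f concat -i clips.txt -c copy joined.m2t
```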
As an aside, will an AVCHD-to-Blu-ray transfer result in less loss of video quality than HDV-to-Blu-ray? I know this carries an important implicit question: which of the two encodings (AVCHD or HDV) loses less when encoding the original sensor output (all things being equal, e.g. same sensor resolution)?
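For scale, here is a rough bit-budget comparison, using the rates as I understand the specs (HDV 1080i is 25 Mbit/s MPEG-2; the original AVCHD spec tops out at 24 Mbit/s AVC). Note that AVC compresses more efficiently per bit, so a slightly lower bitrate does not imply lower quality:

```shell
#!/bin/sh
# Back-of-envelope storage math for the two formats.
hdv_mbps=25     # HDV 1080i: constant-rate MPEG-2
avchd_mbps=24   # AVCHD (original spec): maximum AVC rate

# GB per hour = Mbit/s * 3600 s / 8 bits-per-byte / 1000 MB-per-GB
hdv_gb_per_hour=$(( hdv_mbps * 3600 / 8 / 1000 ))
avchd_gb_per_hour=$(( avchd_mbps * 3600 / 8 / 1000 ))

echo "HDV:   ~${hdv_gb_per_hour} GB/hour"
echo "AVCHD: ~${avchd_gb_per_hour} GB/hour"
```

So the two formats spend a comparable number of bits per hour; the quality difference comes from how efficiently each codec uses them, not from the raw rate.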
Thanks,
Bill
