Interlaced vs non-Interlaced

Discussion in 'Windows Media Center' started by Steve Punter, Mar 23, 2005.

  1. Steve Punter

    Steve Punter Guest

    I have MCE2005 with two Hauppauge PVR150MCE tuner cards, and everything
    works extremely well, but there's something I don't understand.

    Analog TV signals are delivered with INTERLACING, and according to
    Hauppauge, their cards retain the interlacing when they encode to MPEG2.

    I use the NVIDIA decoder, which provides a number of options INCLUDING the
    ability to select the type of de-interlacing used when watching video.
    According to their documentation, selecting the FILM type leaves the files
    interlaced.

    Now it struck me that when playing interlaced recordings on a standard
    interlaced NTSC television, the BEST way to watch the files would be with
    the interlacing intact. However, when I select the Film type, I get shearing
    and jerky effects that render the video unwatchable. This doesn't make sense
    to me.

    For reference, I have tried changing the Deinterlace Mode dropdown to all 3
    of the options given to me (Best, Display Field Separately, and Combine
    Fields), but none makes any difference.

    Can someone explain to me why I MUST deinterlace if I'm watching interlaced
    video on an interlaced TV? Thanks in advance.
     
    Steve Punter, Mar 23, 2005
    #1

  2. Deck

    Deck Guest

    Steve,

    I'm not really sure, but it may have something to do with 3:2 pulldown for
    films. I believe film is 24fps, where normal video is around 30fps, I
    think. Someone please correct me if I am wrong.
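
    If it helps, here's a rough sketch of how 3:2 pulldown maps 24fps film
    onto 60 fields/s (toy Python, frame labels A-D are just illustrative):

      # 3:2 (really 2:3) pulldown: each film frame becomes alternately
      # 2 and 3 video fields, so 4 film frames -> 10 fields -> 5 NTSC
      # frames, i.e. 24 fps x 2.5 fields/frame = 60 fields/s.
      def pulldown_32(frames):
          fields = []
          for i, frame in enumerate(frames):
              for _ in range(2 if i % 2 == 0 else 3):
                  parity = "top" if len(fields) % 2 == 0 else "bottom"
                  fields.append((frame, parity))
          return fields

      print(pulldown_32(["A", "B", "C", "D"]))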
     
    Deck, Mar 23, 2005
    #2

  3. Steve Punter

    Steve Punter Guest

    > I believe those films are 24fps where normal video is around 30fps [...]
    Oh yes, I forgot about the film frame rate being different from video.
    However, that raises the question: why don't they allow interlaced playback
    of video on an interlaced TV? Or is this just a limitation of the NVIDIA
    decoder? Are there other decoders that DO allow interlaced playback?
     
    Steve Punter, Mar 23, 2005
    #3
  4. JW

    JW Guest

    All S-Video or composite output to a TV is interlaced, since that is the
    NTSC standard. PC monitors, on the other hand, have used progressive scan
    since day one, which is why PC decoders convert to progressive. It is the
    video card, not the decoder, that controls what gets sent to which device.
     
    JW, Mar 24, 2005
    #4
  5. It would, but...
    "Combine Fields" (aka "Weave" or "no deinterlacing") would be the right
    choice - but I'm afraid that won't work because there's no 1:1 pixel
    relation and probably not exact time synchronization either.
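
    For clarity, here's a toy sketch of what the two non-"Best" modes amount
    to (my labels, not NVIDIA's actual implementation):

      # "Combine Fields" (weave): interleave the two fields' lines as-is.
      # Lossless, but only correct if every line reaches the display 1:1.
      def weave(top_field, bottom_field):
          frame = []
          for t, b in zip(top_field, bottom_field):
              frame += [t, b]
          return frame

      # "Display Field Separately" (bob): show one field at a time,
      # line-doubled. No combing, but half the vertical detail.
      def bob(field):
          frame = []
          for line in field:
              frame += [line, line]
          return frame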

    The TV card may digitize 480 lines from the original NTSC signal, but those
    won't necessarily be output as 480 video lines as well. For one, MCE does
    "overscan cropping" unless you configure it to composite or S-Video output,
    which means it'll cut off a few lines at the top and bottom and stretch the
    remaining lines to the full screen height.

    But even if you disabled the overscan cropping by configuring the output in
    MCE to S-Video or composite, the display adapter will not necessarily stick
    to a 1:1 line relation when generating the signal for the TV output.

    So getting an accurate signal path is very difficult, if not impossible.
    And as soon as the lines are not carried over 1:1, any interlaced signal
    will be "broken" and you'll have no choice but to deinterlace it...
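
    To make that concrete, a toy calculation (the line counts are examples
    only, not MCE's actual numbers): once the 480 lines are rescaled to any
    other count, nearly every output line blends source lines from both
    fields, and those fields were captured 1/60 s apart.

      src_lines, dst_lines = 480, 432      # 432 is illustrative
      for out in (0, 100, 200, 300, 400):  # sample a few output lines
          pos = out * (src_lines - 1) / (dst_lines - 1)
          lo, hi = int(pos), min(int(pos) + 1, src_lines - 1)
          # adjacent source lines always belong to opposite fields
          mixed = pos != lo and lo % 2 != hi % 2
          print(out, "<-", lo, "and", hi,
                "(mixes both fields)" if mixed else "(single source line)")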

    Regards,
     
    Robert Schlabbach, Mar 24, 2005
    #5
  6. Steve Punter

    Steve Punter Guest

    > But even if you disabled the overscan cropping by configuring the output
    > in [...]
    I run MCE on a standard NTSC television, and so I use the s-video output of
    my video card (and I configured MCE accordingly). It was that fact that
    initially confused me, as I ASSUMED that there would be a 1:1 correspondence
    between the recorded data and generated TV signal.
    Thank you for the detailed explanation (I obviously cropped out much of your
    original message in the above quote). It makes reasonable sense, but it's a
    shame they can't manage to recreate an accurate interlaced signal. I've
    noticed that one of the things that conspires to make MCE playback look
    less crisp than the original TV signal is the de-interlacing (especially
    if the original analog signal is slightly noisy).
     
    Steve Punter, Mar 24, 2005
    #6
  7. Steve Punter

    Steve Punter Guest

    Sorry for the second message, but something just struck me. I've often
    noticed that MPEGs created directly from DV tapes are in fact interlaced. If
    you freeze frame them on your computer during fast motion you can see the
    shearing. However, when those interlaced MPEGs are recorded to a DVD and
    played back on the DVD player, no such problem exists.

    Is this because either the DVD player or the DVD burning software has
    already applied de-interlacing?
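
    In case it helps to illustrate what I mean by shearing, a quick toy
    sketch (made-up values): the two fields are snapshots 1/60 s apart, so a
    moving object sits at different positions on even vs. odd lines when the
    fields are woven into one still frame.

      WIDTH, SHIFT = 16, 4                 # SHIFT = pixels moved per field
      def field(x):                        # a small block at position x
          return ["." * x + "##" + "." * (WIDTH - x - 2)] * 3
      top, bottom = field(0), field(SHIFT) # captured 1/60 s apart
      frame = [line for pair in zip(top, bottom) for line in pair]
      print("\n".join(frame))              # alternating lines show the "comb"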
     
    Steve Punter, Mar 24, 2005
    #7
  8. Windows does not support passing interlaced source video straight through
    to the 'TV' output.

    So the decoder has to take the 60-fields-per-second data and de-interlace
    it to 60p, and the video card in turn re-interlaces that back to 60 fields.

    This results in a quality loss, since the field order will be mismatched
    50% of the time.
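
    A toy model of that round trip (my labels, not the actual MCE code): the
    TV encoder picks its own field phase, so it's a coin flip at startup
    whether its field parity lines up with the source's.

      src = [("top" if t % 2 == 0 else "bottom", t) for t in range(6)]
      def reencode(phase):          # phase 0 = aligned, phase 1 = mismatched
          for parity, t in src:
              want = "top" if (t + phase) % 2 == 0 else "bottom"
              print(t, parity, "->", "OK" if want == parity else "MISMATCH")
      reencode(phase=1)             # unlucky flip: every field lands wrong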

    The Microsoft team can write a good word processor and spreadsheet
    software, but their attempts at audio/video and multimedia are a complete
    disaster :(

    When I see this:
    http://www.smpte.org/engineering_committees/lipsync.pdf
    why can't Microsoft recognize their own mistakes and take action to fix
    the countless errors they have introduced in their design?

    No respect for video signal levels / color space
    No respect for audio / video sync
    No respect for interlaced signal format integrity
    .....

    The lack of care/interest/action from Microsoft in the past 5 years should
    be more than alarming to the audio/video community.

    Stephan
     
    Stephan Schaem, Mar 24, 2005
    #8
  9. Deck

    Deck Guest

    I suppose the ideal solution would be to output the native resolution of
    the media from the video card and then run it through a scaler whose
    purpose is to match it to your TV's native resolution. DVDo makes a good
    external scaler, but it is of course rather expensive. Your picture
    quality is ultimately at the mercy of the scaler employed by the video
    card vendor.
     
    Deck, Mar 24, 2005
    #9
  10. Have you read up on some of the Longhorn presentations? While they were
    somewhat sketchy about details, the tagline was vast improvements in
    multimedia quality. So Longhorn _may_ just be the answer to your question
    above.

    Regards,
     
    Robert Schlabbach, Mar 24, 2005
    #10
  11. From what I read, MS identified/mentioned only a subset of the existing
    problems.

    Since most of the major design/bug fixes don't need to wait for Longhorn,
    I'm not confident MS even understands the situation.

    MS has also shown no sign of caring about those issues in the past 4
    years, so I'm not sure what to think of the situation, especially when
    they are keeping their spec a secret from industry developers...

    Stephan
     
    Stephan Schaem, Mar 25, 2005
    #11
  12. The ideal solution for "TV out" (S-Video / composite / component) is for
    an API to exist that exposes the native resolution of the configured
    format.

    Currently this is a big hack, only exposed via driver-specific features
    that MS has refused to address in the past 4 years.
    And there is no clue whether this will be addressed at all in Longhorn.

    Stephan
     
    Stephan Schaem, Mar 25, 2005
    #12
