HDTV stuttering

Discussion in 'Windows Media Center' started by Dave, Jun 22, 2005.

  1. Dave

    Dave Guest

    I have the following home-built MCE2005 system:

    Celeron D 2.8GHz
    1GB DDR
    250GB SATA HD
    Gigabyte ATi 9550-based video card (128MB)
    Hauppauge 500MCE dual-tuner card
    Avermedia AVerTVHD MCE A180 HD-OTA tuner card
    NVidia decoder 1.0.67 (or something like that)

    I was getting stuttering when viewing dynamic HD content, so I recently
    purchased an Asus NVidia 6600 (not GT)-based board to take advantage of
    Purevideo. Several questions about this setup:
    1. Should I download the drivers from NVidia or Asus? I ask because I
    couldn't find MCE-specific drivers on the Asus website, but could on the
    NVidia site. What's the difference between the drivers? If NVidia is the way to
    go, is there any particular version of ForceWare I should steer clear of (as
    I found to be the case for some versions of ATi's Catalyst software)?
    2. What are the chances I'm going to fix my stuttering problem? I know
    that requires some speculation...but if anyone feels I'm on the wrong path,
    please tell me...and tell me why you think so.

    For the record, I did defrag my hard drive, and the problem did not go away.
    Also, the entire system works fine for all other applications. Thanks.

    Dave
     
    Dave, Jun 22, 2005
    #1

  2. Curious

    Curious Guest

    You should use the 72.14 MCE drivers from the NVIDIA website (I suggest you
    also download the display user's manual from the same download page),
    especially since these drivers support NVIDIA PureVideo functionality when
    used with a 6600 card and the 1.0.67 DVD Decoder.
    Your stuttering problem should be eliminated since the graphics processing
    power of the 6600 GT far exceeds that of the 9550.
     
    Curious, Jun 22, 2005
    #2

  3. Pete Delgado

    Pete Delgado Guest

    Unfortunately, the 6600 which he states that he bought does not meet the
    minimum memory-bandwidth requirement set by Microsoft. The base
    PCIe 6600 has a memory bandwidth of 8.8GB/s, while HDTV through Media
    Center requires 10GB/s. The AGP version is slightly slower.

    It is possible that this is the cause of the problem. If the card works
    correctly on normal TV and the drivers being used are known to be stable, I
    would suspect the graphics card. If you can playback the recorded HDTV
    through MediaPlayer with no stuttering, then I would be almost certain that
    the problem is memory bandwidth.

    I have been able to use an NVIDIA GeForce FX 5500 for HDTV which has a
    memory bandwidth of 6.6GB/s, but I was getting random crashes, stuttering,
    digital artifacts on the screen and other little problems similar to the OP.
    Upgrading to a 5900 SE with a bandwidth of 22GB/s cured all of the problems.

    I'll be testing an XFX 128MB 6800 card soon that retails for just over $200.
    For the money, it appears to be a good multi-purpose card for MCE and
    gaming.

    -Pete
     
    Pete Delgado, Jun 22, 2005
    #3
  4. Dave

    Dave Guest

    Dave, Jun 22, 2005
    #4
  5. Did you calculate that number as (buswidth/8 * memclk)? 128/8*550 would be
    8.8GB/s, but the base PCIe 6600 memory clock is 500MHz, and the memory is
    *DDR*, so it is (buswidth/8 * memclk*2) = 16GB/s memory bandwidth. The AGP
    version only has a memory clock of 450MHz, thus yielding 14.4GB/s.

    You'll find those numbers on the official NVIDIA website here:

    http://www.nvidia.com/page/geforce_6600.html
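As a sanity check on the formula above, here's a tiny Python sketch of that bandwidth calculation (the 500 MHz and 450 MHz clocks are the figures cited in this post, not verified specs):

```python
def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz, ddr=True):
    """Memory bandwidth in GB/s: (bus width in bytes) * effective transfers/s."""
    effective_clock = mem_clock_mhz * (2 if ddr else 1)  # DDR transfers twice per clock
    return bus_width_bits / 8 * effective_clock * 1e6 / 1e9

# 128-bit bus, 500 MHz DDR (the PCIe clock cited above)
print(mem_bandwidth_gbs(128, 500))  # 16.0
# 128-bit bus, 450 MHz DDR (the AGP clock cited above)
print(mem_bandwidth_gbs(128, 450))  # 14.4
```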
    Isn't the 6800 the chip with the broken video engine, where specifically the
    HDTV acceleration doesn't work? According to NVIDIA, the 6600 has the
    bugfixed video engine, so I'd say the 6600 is the better choice if you want
    both video acceleration and 3D gaming.

    Regards,
     
    Robert Schlabbach, Jun 22, 2005
    #5
  6. Dave

    Dave Guest

    Actually, if your formula is correct, Pete's numbers are right. The clock on
    my card is 275 MHz...only gets to 550 after you double for DDR:

    http://usa.asus.com/prog/spec.asp?m=N6600/TD&langs=09
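For what it's worth, plugging the Asus spec above into the same kind of calculation (275 MHz physical clock, doubled for DDR, over a 128-bit bus; all figures as cited in this thread) does land on Pete's 8.8GB/s:

```python
# Figures as cited in the thread: 128-bit bus, 275 MHz memory clock, DDR.
bus_bytes = 128 // 8                  # 16 bytes per transfer
effective_clock_hz = 275e6 * 2        # DDR: two transfers per clock -> 550 MHz effective
bandwidth_gbs = bus_bytes * effective_clock_hz / 1e9
print(bandwidth_gbs)                  # 8.8
print(bandwidth_gbs >= 10.0)          # False: under the 10GB/s figure quoted for HDTV in MCE
```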

    So, I should expect the HDTV to continue to stutter if I use this card?

    Also, where did the 10GB/s min bandwidth number come from?

    Thanks.
     
    Dave, Jun 22, 2005
    #6
  7. Pete Delgado

    Pete Delgado Guest

    Those numbers are for the 6600 GT model, not the *base* 6600 model. I took
    my numbers from one of the manufacturers' websites, as Tom's Hardware
    Guide will often show the theoretically possible numbers, while each
    manufacturer achieves slightly varying results based upon their
    implementation of the reference design.
    I had understood that to be the case with the earlier 6800 boards. It seems
    that the on-board HDTV accelerator was disabled via software due to some
    problems. I'll contact XFX or NVIDIA just to make sure they are fixed.

    Do you have links to information about the bug in the 6800?

    -Pete
     
    Pete Delgado, Jun 22, 2005
    #7
  8. Pete Delgado

    Pete Delgado Guest

    Take a look at the README file for the HDTV rollup for MCE.

    -Pete
     
    Pete Delgado, Jun 22, 2005
    #8
    What is on-board HDTV acceleration? I had thought that the only part of the
    6800s that didn't work was WMVHD acceleration. Does this same function
    accelerate HDTV as well? If so, this could be part of my problem displaying
    HD.

    -Scott
     
    Scott S Sikora, Jun 22, 2005
    #9
  10. Pete Delgado

    Pete Delgado Guest

    We are referring to the on-board HD MPEG-2 decoder. The 6600 & 6800 have
    hardware acceleration for HDWMV in addition to the HD MPEG-2 decoder.
    Initially, the HDWMV decoder was disabled on the 6800 GT and 6800 Ultra from
    what I can gather.

    http://www.theinquirer.net/?article=19314
    http://www7.graphics.tomshardware.com/graphic/20040414/geforce_6800-21.html
    The base model 6800 appears to work correctly. It also appears that there
    are patches available for the other cards based upon the 6800 GPU.

    http://www.avsforum.com/avs-vb/showthread.php?p=5738767#post5738767


    -Pete
     
    Pete Delgado, Jun 23, 2005
    #10
  11. JW

    JW Guest

    JW, Jun 23, 2005
    #11
  12. Wow, indeed. It appears the 6600 non-GT is NVIDIA's "go as cheap as you
    can" model, allowing card makers to drop the slowest and cheapest memory on
    it... :(
    Admittedly, I don't know the math behind this. A simple 1920x1088x30fps x 4
    Bytes/Pixel only yields 250MB/s. Even with the GPU writing the decoded
    image into the video RAM and two RAMDACs simultaneously outputting it,
    that'd still only be 750MB/s. Of course the GPU would also have to read the
    encoded image, but that wouldn't make these numbers tenfold. There must be
    something memory-intensive about the decoding process which I'm not aware
    of...
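In case anyone wants to redo that back-of-the-envelope math, here it is spelled out (same assumptions as above: 1920x1088 at 30fps, 4 bytes/pixel, one decoder write plus two RAMDAC reads):

```python
width, height, fps = 1920, 1088, 30   # frame size and rate assumed above
bytes_per_pixel = 4

frame_bytes = width * height * bytes_per_pixel
write_rate = frame_bytes * fps                 # GPU writing decoded frames to VRAM
print(write_rate / 1e6)                        # ~250 MB/s (250.6752)

# one decoder write plus two RAMDAC reads, as speculated above
print(3 * write_rate / 1e6)                    # ~750 MB/s (752.0256)
```

Either way, these numbers are more than an order of magnitude below the 10GB/s figure, so they don't explain the requirement on their own.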

    ...so we'll just have to trust this number from Microsoft until someone
    comes up and explains it all to us ;)

    Regards,
     
    Robert Schlabbach, Jun 24, 2005
    #12
  13. Very clever. When the 6800 was launched, it was touted to support decoding
    and encoding of MPEG-1/2/4, WMV9 and DivX. Then NVIDIA had to admit that
    the engine was broken and only 10-20% (instead of 60%) of the WMV9/MPEG-4
    decoding process can be supported in hardware - no more than with the
    previous GeForce FX series, for which such support was never claimed!

    And now in this article they say: "Everything's fine, the 6800 supports
    MPEG-2 decoding. Oh, and the 6600 additionally supports WMV-decoding." No
    word about their initial claim that the 6800 was supposed to support
    WMV-decoding already...
    The question is _which_ model 6800:

    NVIDIA_NV40.DEV_0041.1 = "NVIDIA GeForce 6800"
    NVIDIA_NV41.DEV_00C1.1 = "NVIDIA GeForce 6800 "
    NVIDIA_NV48.DEV_0211.1 = "NVIDIA GeForce 6800 "

    Are all these the same, or are some possibly the revision with the bugfixed
    video engine? But - which...?

    Regards,
     
    Robert Schlabbach, Jun 24, 2005
    #13
  14. No. It seems as if NVIDIA didn't specify any "reference" memory clock for
    the 6600 and sold it as their "throw your cheapest memory on this" model...

    But another interesting thing:

    | The Nvidia NV43 6600 is a native PCI Express video graphics processor
    | compared to the earlier GeForce 6800 which are native AGP and use a
    | bridge chip to PCI Express. For the AGP GeForce 6600 graphics card
    | versions Nvidia use a PCI Express to AGP bridge chip.

    Is that still true, or has NVIDIA "silently" released newer 6800s which are
    native PCIe?

    See, that's why I wouldn't buy a 6800: you just don't know easily what
    you'll get. Will it be the original AGP chip with the broken video engine,
    or possibly an updated chip that's native PCIe and has the video
    engine fixed...?

    That's why I stuck with the 6600GT. You know it's truly PCIe and that the
    video engine is fixed...

    Regards,
     
    Robert Schlabbach, Jun 24, 2005
    #14
  15. Pete Delgado

    Pete Delgado Guest

    Which is what happens in the "lower end" or LE cards. They just didn't give
    it that designation, as it would scare people away.
    The chips were native PCIe, but they released the initial version as AGP so
    as not to repeat the mistakes of the previous line of cards.
    I contacted XFX support and they were *clueless*. I have a contact within
    NVIDIA and I'll see if I can get hold of them for comment. From what I have
    read, the "light" or 128MB models have the fixed video engine. In addition,
    there is a firmware upgrade that "fixes" the issue in older cards.

    Still makes me a bit wary though. $200 is $200!
    The 6600GT is a *damn* good card and you really can't go wrong with it. I
    just like the abilities of the 6800, and I need an AGP card for the current
    system I am building.

    -Pete
     
    Pete Delgado, Jun 24, 2005
    #15
  16. Dave

    Dave Guest

    Thanks, guys. Very helpful. I RMAed the Asus 6600 and picked up an eVGA
    GeForce 6600GT.

    Anyone have any idea what I should expect now on CPU burden? My chip,
    again, is a Celeron D 2.8GHz, but if anyone has numbers for any chip, I'd be
    interested to hear.
     
    Dave, Jun 24, 2005
    #16
  17. Pete Delgado

    Pete Delgado Guest

    I've got one system with a 2.66GHz Celeron D that works great. Since the
    majority of the video processing is passed to the video card, it rarely
    strains the CPU. If you need more info I'll get some numbers for you, but
    the bottom line is that the processor you have chosen is fine.

    Did your "stuttering" problem go away?

    PS: How do you like the Avermedia HD card?

    -Pete
     
    Pete Delgado, Jun 24, 2005
    #17
  18. Dave

    Dave Guest

    Pete-

    Thanks for the info on the processor.

    I get the video card in the mail on Monday...so I'll update you on the
    stuttering after that.

    The Avermedia card is great; however, I have yet to see it in its full
    non-stuttering glory. Hopefully, I'll continue to rave once the 6600GT is
    installed.

    Dave
     
    Dave, Jun 25, 2005
    #18
  19. Dave

    Dave Guest

    Pete-

    The 6600GT did the trick. HDTV is silky smooth now. Top of the case feels
    a bit hot, but that's to be expected, I guess.

    So that's it...my HTPC is finally finished.

    Dave
     
    Dave, Jun 28, 2005
    #19
  20. Pete Delgado

    Pete Delgado Guest

    Congratulations! Now, let's figure out a way we can trade HD content! :)
    ...until you need another HD because you filled up your current ones with
    programming! ;) That's what has happened to me - several times!


    -Pete
     
    Pete Delgado, Jun 28, 2005
    #20
