
Posted: Thu Jul 24, 2003 22:26
by wes_
When playing very fast-motion or strobing clips I get 'video tearing' - the graphics card seems to swap in the next frame before the display has finished scanning out the current one, so the frame gets cut off and you get annoying horizontal 'cuts' flickering up and down the screen.
It's crap. How do we get round it?
I know that with 3D games you can set your OpenGL or Direct3D preferences to sync to the refresh rate, which stops the card running ahead of itself. How do we do this with video? Or is it perhaps a mismatch between the refresh rates of the video card and the projector/mixer?
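(For reference, here's roughly what those 3D preferences do under the hood on Windows - a minimal sketch using the WGL_EXT_swap_control OpenGL extension, assuming a rendering context is already current on the thread; Direct3D apps get the same effect by creating their device with D3DPRESENT_INTERVAL_ONE.)

    /* Minimal sketch: ask the driver to wait for the vertical retrace on
       each buffer swap, via the WGL_EXT_swap_control extension. Assumes
       an OpenGL rendering context is already current on this thread. */
    #include <windows.h>
    #include <GL/gl.h>

    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    void enable_vsync(void)
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
        if (wglSwapIntervalEXT)
            wglSwapIntervalEXT(1); /* 1 = swap at most once per refresh */
    }

Whether a given video player exposes this depends entirely on the app, though - the question is how to get playback software to do it.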

cheers, Wes

Posted: Fri Jul 25, 2003 02:29
by ether_warpTV_
Often this is a case of the hardware not keeping up with the required video playback rate, or of trying to play back hi-res movies with high data rates - try encoding the movies at a lower data rate?

Posted: Sat Jul 26, 2003 00:27
by wes_
Thanks for the reply Ether, but I don't think it's that - it happens even on a single layer of a 320x240 Indeo clip, or even a simple slideshow strobe effect consisting of two 320x240 images at 14 fps. The computer is plenty fast enough (XP2100, 1 GB RAM, etc.).

Posted: Sun Jul 27, 2003 23:27
by ether_warpTV_
What is the graphics card?

Posted: Mon Jul 28, 2003 04:36
by wes_
GeForce2 MX, 32 MB.
A friend with a Radeon 9000 on a similar machine complains of the same thing.
I reckon you're thinking in the right direction, though - maybe I should take this question to one of the uber-geek 3D card forums?