Hey,
I'm using DirectShow in a Windows program and I've noticed that it uses a lot more CPU (sometimes up to 100%) than the sample programs do.
For instance, I may use 30% CPU just doing simple rendering of 2 video files, and when I use the VMR-9 (Video Mixing Renderer 9) I often hit 100% CPU. However, when I run the sample programs with the VMR-9 they only use around 20-30% CPU. (I'm running the samples on the same machine, and the problem occurs whether or not I'm in debug mode.)
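In case it helps, here's a simplified sketch of the kind of graph setup I'm describing (this assumes the VMR-9 is created manually in windowless mode with two streams; error handling is trimmed, and the file names and window handle are just placeholders, not my actual code):

#include <dshow.h>
#include <d3d9.h>
#include <vmr9.h>
#pragma comment(lib, "strmiids.lib")

HRESULT BuildGraph(HWND hwnd)
{
    IGraphBuilder *pGraph = NULL;
    IBaseFilter   *pVmr   = NULL;

    HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                                  IID_IGraphBuilder, (void**)&pGraph);
    if (FAILED(hr)) return hr;

    // Create the VMR-9 and add it to the graph before rendering the files,
    // so the graph builder uses it instead of the default renderer.
    hr = CoCreateInstance(CLSID_VideoMixingRenderer9, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (void**)&pVmr);
    if (SUCCEEDED(hr))
        hr = pGraph->AddFilter(pVmr, L"VMR-9");

    // Put the VMR-9 in windowless mode and ask for two input streams
    // (mixing mode), before any pins are connected.
    IVMRFilterConfig9 *pConfig = NULL;
    if (SUCCEEDED(hr))
        hr = pVmr->QueryInterface(IID_IVMRFilterConfig9, (void**)&pConfig);
    if (SUCCEEDED(hr))
    {
        pConfig->SetRenderingMode(VMR9Mode_Windowless);
        pConfig->SetNumberOfStreams(2);
        pConfig->Release();
    }

    // Hand the renderer the window it should draw into.
    IVMRWindowlessControl9 *pWc = NULL;
    if (SUCCEEDED(hr))
        hr = pVmr->QueryInterface(IID_IVMRWindowlessControl9, (void**)&pWc);
    if (SUCCEEDED(hr))
    {
        pWc->SetVideoClippingWindow(hwnd);
        pWc->Release();
    }

    // Render both clips; each video stream connects to one VMR-9 input pin.
    if (SUCCEEDED(hr))
        hr = pGraph->RenderFile(L"clip1.avi", NULL);   // placeholder path
    if (SUCCEEDED(hr))
        hr = pGraph->RenderFile(L"clip2.avi", NULL);   // placeholder path

    // Run the graph.
    IMediaControl *pControl = NULL;
    if (SUCCEEDED(hr))
        hr = pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);
    if (SUCCEEDED(hr))
    {
        pControl->Run();
        pControl->Release();
    }

    if (pVmr)   pVmr->Release();
    if (pGraph) pGraph->Release();
    return hr;
}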
I don't know what I'm doing wrong. Any ideas and advice would be appreciated.