
IBM 306m PCI video card issues


Airninja6r
Technical User
May 19, 2011
I have a 306m currently running Windows 7 x64. It runs great, but I was not happy with the onboard video. I have a riser with PCI Express x8 and PCI-X slots and a 512MB Radeon video card in the PCI-X slot. Everything seems to be working fine and the card shows up in Device Manager. When I go into the BIOS it only shows the onboard 16MB video as an option, but the PCI section shows that something is installed. When I plug VGA or HDMI into the PCI card, there's nothing. I disabled the onboard adapter in Device Manager; the onboard still displays but the PCI card doesn't.

My first thought was that there might be some sort of conflict, but I now think the server is unaware that the PCI card is a video card and doesn't use it as such.

Does anyone know a way to force the server to recognize it as a video card? I read somewhere that clearing the CMOS could force it to pick up the card, but I've also been told that won't change anything.
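
In the meantime, one sanity check I can run is to ask Windows directly which display adapters it sees and how much VRAM each one reports. A minimal Python sketch, assuming Python is installed (wmic ships with Windows 7, and AdapterRAM comes back in bytes):

import subprocess

# List every display adapter Windows knows about, with reported VRAM.
# wmic is built into Windows 7; AdapterRAM is reported in bytes.
out = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,AdapterRAM,Status"],
    text=True,
)
print(out)

If the Radeon shows up here with its 512MB but still never drives a monitor, that points at the BIOS rather than Windows.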

Maybe there is an older BIOS that lets you choose the default video display the way a normal PC would.
 
Why would you need anything other than the built-in 8MB video on a server? Ideally you would only need video when setting up the OS and applications, and never use it again unless needed. My guess would be that the server, being a 1U and very low power, doesn't have a large enough power supply to support the video card along with everything else in the system.
 
I am building a standalone streaming server. It currently holds a Colossus capture card in the PCIe slot. As for the video card, I need it because the 16MB onboard video does not display 1080p via HDMI. The server also doesn't have onboard sound, so that would create another issue for streaming.

Power is not an issue at the moment; the server is built to hold two hard drives, a CD drive, and two PCI-X cards. As I stated in the first post, I just need the server to recognize the PCI card as video.

Thanks for the reply though.
 
Why? It's not like the server is playing the media. You are streaming it to a device that has to display the format you are streaming it in. The server does nothing but serve the data; what happens at the other end determines what you get on your TV or computer screen. Say you are using a media extender like the Xbox 360: it requests a file from the server, and that file is in WM9. The server displays nothing; you don't even need a monitor hooked up to it. The Xbox does the decoding and playing of the file. If you want true media streaming at 1080 you are going to need a much bigger server than the starter web server you have.
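
To put it concretely: the serving side of streaming is nothing more than shipping bytes, and all the decoding happens on the client. A bare-bones sketch using Python's standard http.server (the media path is hypothetical, and a real setup would use UPnP/DLNA rather than plain HTTP; this is only to show the idea):

import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the media folder over plain HTTP. The extender (Xbox, etc.)
# pulls the file and does all the decoding itself; nothing here ever
# touches the server's video hardware.
handler = functools.partial(SimpleHTTPRequestHandler, directory="C:/media")  # hypothetical path
HTTPServer(("0.0.0.0", 8080), handler).serve_forever()

Notice there is no display code at all; the server could run completely headless.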
 
OK, so here's what's specifically going on. The capture card is not capturing at full resolution because the server's display is maxed out at 1024x768, and it only allows input at that same 1024x768. I need the video card to force the server to display a higher resolution. That was my theory, anyway.
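
One way to actually verify that theory is to list every display mode the active driver exposes; if 1920x1080 never appears, the onboard adapter really is the ceiling. A rough Python sketch against the Win32 EnumDisplaySettings API via ctypes (only the display-related DEVMODE fields are laid out here, which is enough since dmSize tells the API how much to fill):

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated layout covering the display variant of Win32's DEVMODEW.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# Walk every mode the active display driver reports.
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} @ {mode.dmDisplayFrequency}Hz")
    i += 1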

I will test the streaming software on its own without a display; maybe you're right and I won't need the card. I will not stream in anything less than 1080i, and I will keep you posted on whether the server is strong enough to run it. Specification-wise it is comparable to machines I know are capable of the feat.

I may have to upgrade servers, and the worst part is I had a great deal on a dual-processor one for only $150... or a quad-processor one for $200.
 