2012-01-16, 22:20
Hello all,
I hope someone can help me out with this, because I've tried everything and I'm stuck right now.
I have an HTPC with a GeForce 9400 GT graphics card.
The VGA output is connected to my Samsung SyncMaster 2243 LCD monitor, with the resolution set to 1920x1080 at 60 Hz.
The HDMI output is connected to my Samsung 40" HDTV, with the resolution set to 1920x1080.
With this setup, everything works fine.
Last week I bought an amplifier, a Harman Kardon AVR 260. This amplifier has 3x HDMI input and 1x HDMI monitor out.
When I connect the amplifier to my 9400 GT via HDMI, my LCD monitor (the VGA connection) turns black and shows the following error: 'Not optimum mode, recommended mode 1920x1080 60Hz'. My HDTV works fine.
I can't understand why the graphics card changes the VGA resolution when a device is connected to HDMI.
Does someone have a solution for this?
Is it possible to 'lock' the VGA resolution so it doesn't change automatically?
Thanks in advance.
Dennis