HDMI video card doesn't output HDMI signal to monitor?
-
Ok so I just bought a new TV and a new video card. I installed the video card, connected it to the TV over HDMI, and turned on the computer. The computer booted up fine and let me know there was a new video card and that I needed new drivers. I could see everything fine from my computer, just at the wrong resolution for the TV because the drivers weren't installed yet. So I went to ATI's website and downloaded the appropriate drivers for my card. The installer ran all the way through, said everything was installed fine, and asked to reboot for the changes to take effect. I clicked "OK" and it rebooted.

When the computer restarted, I could see the initial startup screen and everything, including the Windows logo right before the logon screen. But just as the logon screen was about to appear, the monitor said "No Signal" and turned blue. From habit I knew how to log on to my account without seeing the screen, so I did that and heard the startup sound (the one that plays when the desktop appears after a restart), but still no image. Any ideas how to fix this? As I said, it worked without the drivers, just not at the right resolution. The card also has a DVI port, and I tried connecting the TV through a DVI-to-HDMI adapter, but that was a no-go as well. I also tried renaming the HDMI 1 input to "PC," which people said to try, and that didn't work either. Any help would be greatly appreciated. And please don't tell me my TV or card is shot, because they both work up to the logon screen, and the TV works fine with my Xbox hooked up to the same HDMI port running 1080p.
Here's the info about the card and TV. Card: ATI VisionTek HD4350, 512 MB DDR2 memory. TV: Philips 32PFL6704D/F7, 32", 120 Hz, Full 1080p. One more thing: when I move the HDMI cable from my computer to another HDMI port on the TV (HDMI 2/3/4), or back to the original, the computer makes that sound like when you plug in a USB device or memory card, and then about 2 seconds later makes the other sound, as if I had disconnected it. Since I can't see my desktop, I don't know what it's saying or doing.
-
Answer:
I've seen this before on several systems that I've built. It is usually the resolution or refresh rate being set too high for the TV to handle. Remember, a TV is not a computer monitor. Computer monitors send a signal to the computer letting it know what their limits are; your TV might or might not play well with a computer in that respect (reporting its resolution and refresh rate limits back to the computer).

Boot into safe mode: http://windows.microsoft.com/en-us/windo... ...then set the resolution and refresh rate below what you know your TV can display. For most TVs that support at least 720p HD, set it to 1280 x 720 with a 60 Hz vertical refresh rate (found in the "Advanced" section of the resolution settings). Once you do this, reboot the computer; it should work just fine after that. Once you are booted into Windows, you can then set the resolution to the maximum your TV can handle. For Full HD 1080p TVs, that is 1920 x 1080 at a 59 or 60 Hz refresh rate.

You should also find out whether the TV manufacturer has put out a Windows 7 driver for that TV. Some of the better TV companies have drivers for their HD models, but all they really do is tell the operating system what resolutions and refresh rates the TV can display.
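The logic Phil describes — fall back to a mode you're sure the TV supports, then step up to the TV's maximum — can be sketched as a small helper. This is purely illustrative: the mode list and TV limits below are hypothetical example values, not anything read from the card or the TV's EDID.

```python
# Illustrative sketch of "pick a mode the TV can handle, else fall back to 720p@60".
# All values here are hypothetical examples; a real system would query the
# driver/EDID for the actual mode list and TV limits.

SAFE_FALLBACK = (1280, 720, 60)  # 720p@60 works on virtually any HDTV

def pick_mode(candidate_modes, max_width, max_height, max_refresh):
    """Return the highest candidate mode within the TV's stated limits,
    or the safe 720p@60 fallback if nothing fits."""
    ok = [m for m in candidate_modes
          if m[0] <= max_width and m[1] <= max_height and m[2] <= max_refresh]
    return max(ok) if ok else SAFE_FALLBACK

# Example: a Full 1080p set like this Philips tops out at 1920x1080@60,
# so a 75 Hz mode gets rejected and 1080p@60 is chosen.
modes = [(1920, 1080, 75), (1920, 1080, 60), (1280, 720, 60), (1024, 768, 60)]
print(pick_mode(modes, 1920, 1080, 60))  # -> (1920, 1080, 60)
```

The point of the fallback is the same as booting into safe mode: get *any* picture first, then raise the resolution from inside Windows where you can see what you're doing.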
Phil at Yahoo! Answers
Other answers
Try booting up in SAFE mode. It's possible you set the resolution too high for the TV/monitor to operate properly. In safe mode, set your resolution lower and save, then reboot and see if the screen comes good. It's also possible in safe mode to roll back to your previous drivers if the resolution change doesn't work.
Why not just use VGA? And the problem might still be that your TV is on an unsupported resolution. I know my TV does that if it's given a resolution it doesn't support.
Wow man, that's frustrating. Maybe try to use the previous driver for your card. This is definitely mind-boggling.
drsmaw
Don't know what BIOS you are using, but I would check all the settings there, as PCIe may have to be enabled in the BIOS. Yes, I see that it does if the default is not enabled. I also see a frequency setting in the manual for that board: http://dlcdnet.asus.com/pub/ASUS/mb/socket775/P5K-VM/e3279_p5k-vm.pdf Just thoughts.
Lil' Ugly