Subscriber JPB wrote several times this week about troubles he was having with his computer monitor:
I have a question (though I think I know the answer). I use two monitors. My secondary one has developed a problem.
It takes 5 minutes for it to fire up in the morning. It flickers on and off and finally stabilizes and is good to go. Once it's on, it works fine.
I suspect the end is near. But since it does work, is there something in the startup process that can be replaced? (The gizmo whose name has eluded me.)
My answer to JPB was short, and not too sweet. It’s new monitor time, I’m afraid. I doubt that there’s any user-replaceable item in the monitor’s electronics.
He quickly responded that this was what he was afraid of.
Three days later, it was another email:
It finally died. Services are pending.
I did an autopsy on it and found several large circuit boards. This begs the question: why do we (especially us gamers) need expensive video cards?
Why can’t they be built into the monitor?
That’s an interesting question. The technical answer is all about computing power and bandwidth. The user answer is about cost and upgrading.
You may or may not have noticed, but the video card always goes in the fastest interface slot in the computer. There have even been special slot types designed to give video cards more bandwidth than the other slots (the AGP video slot, back in the days of plain PCI slots).
The reason for this is that a huge volume of data has to be passed from the CPU and RAM to the video card. The video card then processes that data to create the pictures that will ultimately be displayed on the monitor.
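To get a feel for the scale, here is a rough back-of-the-envelope sketch in Python. The resolutions and byte counts are illustrative assumptions, not measurements of any particular system, and they only count the finished frames; during a game, textures, geometry, and other scene data also stream across the bus on top of this.

```python
# Rough back-of-the-envelope sketch of why the CPU-to-video-card link
# needs so much bandwidth. The resolutions and byte counts below are
# illustrative assumptions, not figures from any particular system.

def frame_data_rate(width, height, bytes_per_pixel=4, frames_per_second=60):
    """Data rate, in gigabytes per second, for the finished frames alone."""
    bytes_per_frame = width * height * bytes_per_pixel
    return bytes_per_frame * frames_per_second / 1e9

# A single 1920x1080 frame at 4 bytes per pixel is about 8 MB; at 60
# frames per second that is roughly 0.5 GB/s, and that's only the final
# pictures, before any textures or geometry cross the bus.
print(f"1080p @ 60 fps: {frame_data_rate(1920, 1080):.2f} GB/s")
print(f"4K    @ 60 fps: {frame_data_rate(3840, 2160):.2f} GB/s")
```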
For gamers, video cards are constantly being updated with faster graphics processing units (GPUs), faster clocks (the graphics card runs at its own speed, independent of the computer's), and more and faster graphics memory.
Most other people don't change their graphics cards except when buying a new computer, or if the card fails. Not so with dedicated gamers: some will upgrade their graphics cards a couple of times a year to get slightly faster graphics response or more realistic displays.
Although you could put the video card in the monitor, you couldn't upgrade it easily, and people keep their monitors a long time. You also couldn't cool the graphics card easily: all graphics cards are heat generators with heat sinks, and often they have fans, too.
Two physical problems also work against putting the graphics card in the monitor. First, the cables that run to a monitor have far fewer wires carrying signal than a PCI-E slot does, so data couldn't be transferred as fast. Second, the cable is much longer than the direct connection through the motherboard's PCI-E slot, which further cuts into how much data can be moved per second.
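To put approximate numbers on that gap, here is a small Python sketch comparing rounded raw link rates, taken from the published specs, for a PCI-E x16 slot and common monitor cables. Real-world throughput is lower once encoding overhead is counted; the point is simply the size of the difference.

```python
# Approximate raw link rates, in gigabits per second. These are rounded
# figures from the published specs; usable throughput is lower and varies
# with encoding overhead and lane configuration.
link_rates_gbps = {
    "PCI Express 3.0 x16 slot": 126,   # roughly 15.75 GB/s per direction
    "PCI Express 4.0 x16 slot": 252,   # roughly 31.5 GB/s per direction
    "HDMI 2.0 cable": 18,
    "DisplayPort 1.4 cable": 32.4,
    "HDMI 2.1 cable": 48,
}

# Print the links from fastest to slowest to show the order-of-magnitude gap
# between the slot on the motherboard and the cable to the monitor.
for link, rate in sorted(link_rates_gbps.items(), key=lambda kv: -kv[1]):
    print(f"{link:<28} ~{rate:g} Gbit/s")
```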
From a user's point of view, putting the graphics card in the monitor would tremendously increase the price of the monitor. It would also do terrible things to the price/volume curves, since demand for any particular model would be much smaller.
The other user issue would be the inability to upgrade the video card. That wouldn't be much of a problem for most users, but it would definitely hamper a gamer who likes to move to the latest and greatest.