Analog TV Signals
A very long time ago, humankind invented the TV: a glass vacuum tube with a layer of phosphor, an electron gun that could light up the phosphor, and a set of current-controlled magnetic deflectors to steer the electron beam across the phosphor layer.
The video images would come in serially in fixed order: from left to right, and from top to bottom.
The horizontal deflector magnet would change in strength to guide the beam from left to right, while the vertical deflector magnet stayed constant. Upon reaching the end of a line, the horizontal deflector would rapidly return from right to left, and the vertical deflector would increase a little to move to the next line.
And once the bottom was reached, the beam needed to move from the bottom back to the top.
Even though the beam could move right to left and bottom to top quickly, it still needed a little bit of time.
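The scan order described above can be sketched in a few lines of code. This is only an illustration of the visiting order, with arbitrary made-up dimensions, not a model of any real video timing:

```python
# Sketch of the raster-scan order described above: pixels are visited
# left to right within a line, lines top to bottom, and then the beam
# retraces back to the top. The resolution here is arbitrary.
def raster_scan(width, height):
    """Yield (x, y) beam positions in raster order."""
    for y in range(height):          # vertical deflection steps down per line
        for x in range(width):       # horizontal deflection sweeps the line
            yield (x, y)
        # horizontal retrace: beam flies back from right to left
    # vertical retrace: beam flies back from bottom to top

positions = list(raster_scan(4, 3))
print(positions[:5])  # -> [(0, 0), (1, 0), (2, 0), (3, 0), (0, 1)]
```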
In addition, since there was just one analog signal coming in (over the air!), the receiver needed a way to determine when that signal was about to restart drawing the top of the screen, or when it was about to start drawing the next horizontal line.
And thus horizontal and vertical sync were born.
A short-duration sync pulse indicated horizontal sync; a much longer one (multiple line durations) indicated vertical sync. Simple!
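A receiver that discriminates between the two sync types by pulse duration could look like the sketch below. The 4.7 µs horizontal pulse and 64 µs line period are PAL-like values, used purely for illustration:

```python
# Classify a sync pulse by its duration, the way a TV receiver might.
# Durations are in microseconds; 64 us is a PAL-like line period and
# 4.7 us a PAL-like horizontal sync pulse (illustrative values only).
LINE_PERIOD_US = 64.0

def classify_sync(pulse_us):
    """A pulse much shorter than one line is hsync; a pulse spanning
    multiple line durations signals vsync."""
    if pulse_us < LINE_PERIOD_US / 2:
        return "hsync"
    return "vsync"

print(classify_sync(4.7))    # -> hsync
print(classify_sync(160.0))  # -> vsync (spans multiple lines)
```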
Initially, TV signals were monochrome only, so the signal only needed to encode gray intensity. The solution was simple: just make the intensity proportional to the voltage of the signal.
In the end, the signal was structured as follows: voltage levels from the lowest value up to some intermediate value were considered sync. Anything above that intermediate value was considered actual visible signal.
The drawing below shows how this actually looked.
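A decoder for a single voltage sample follows directly from that structure. The sketch below assumes a 1 V peak-to-peak signal with the sync/video boundary at 0.3 V, common values for composite video, but the exact levels are an assumption here:

```python
# Decode one voltage sample of a monochrome video signal, assuming a
# 1 V peak-to-peak signal: everything below 0.3 V is sync, and the
# range from 0.3 V (black) to 1.0 V (white) maps linearly to gray.
SYNC_THRESHOLD_V = 0.3
WHITE_V = 1.0

def decode_sample(volts):
    """Return 'sync', or a gray intensity in [0.0, 1.0]."""
    if volts < SYNC_THRESHOLD_V:
        return "sync"
    gray = (volts - SYNC_THRESHOLD_V) / (WHITE_V - SYNC_THRESHOLD_V)
    return min(gray, 1.0)

print(decode_sample(0.1))   # -> sync
print(decode_sample(0.65))  # -> 0.5 (mid-gray)
```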
After a while, color TV was invented. One way or the other, color needed to be added to the existing encoding scheme while remaining backward compatible. The solution was to place the color components into a part of the luminance frequency spectrum, with some very cleverly chosen parameters such that they would not be visible on a black-and-white TV. It would lead too far to go into the details here.
When computers entered the scene, CRTs were still very much king, but since source and sink were now connected by a cable all the way, there was more variation possible in how the signal could be transferred: often more straightforward to implement, or without the quality trade-offs that were present in analog TV signals.
Still, some early home computers, such as the Apple II, used the analog TV signal protocol, which allowed them to connect to existing TVs, but others switched to simpler formats where individual color components and sync signals were carried over separate wires.
A good example is the CGA video adapter with its DE-9 connector: it had 4 binary color components (red, green, blue, and intensity) and separate horizontal and vertical sync pins. EGA was its successor, mostly backward compatible with CGA, with some added color intensities.
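With 4 binary color components, CGA could express 16 colors. A common way monitors expanded those RGBI bits into actual RGB levels is sketched below; the exact channel values are a convention, and real monitors famously special-cased color 6 (dark yellow) into brown:

```python
# One common expansion of CGA's 4 RGBI bits into an RGB triple:
# each of R, G, B contributes 0xAA when set, and the intensity bit
# adds 0x55 to all channels. Color 6 (dark yellow) was rendered as
# brown on many CGA monitors by halving its green channel.
def rgbi_to_rgb(r, g, b, i):
    rgb = [c * 0xAA + i * 0x55 for c in (r, g, b)]
    if (r, g, b, i) == (1, 1, 0, 0):  # dark yellow -> brown
        rgb[1] //= 2
    return tuple(rgb)

print(rgbi_to_rgb(1, 0, 0, 0))  # dark red     -> (170, 0, 0)
print(rgbi_to_rgb(1, 1, 0, 0))  # brown        -> (170, 85, 0)
print(rgbi_to_rgb(1, 1, 1, 1))  # bright white -> (255, 255, 255)
```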
The VGA format and connector were introduced by IBM in 1987. It was a massive improvement over what came before it.
Analog signals carrying RGB, instead of digital color components, allowed for a theoretically unlimited number of color component values (though of course still subject to the limitations of the source.)
It still had separate VSYNC and HSYNC signals.
A DE-15 connector allowed for a number of side-band signals that could be used for monitor capabilities discovery.
Depending on the cable quality, VGA could transport resolutions up to 2048x1536 @85Hz, which must have been unimaginable back in 1987 when 640x350 @70Hz was considered impressive.
In its first revision, those side-band signals were little more than fixed-value straps that could only encode a couple of configurations, but in a 1994 update they were replaced by DDC, the Display Data Channel, which allowed a nearly limitless amount of configuration data to be sent back to the source.
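Over DDC, the monitor returns its capabilities as an EDID data block. As a small taste of that format, the sketch below decodes the manufacturer ID packed into bytes 8 and 9: three 5-bit letters ('A' = 1) in a big-endian 16-bit word. The sample bytes are a hypothetical EDID fragment, not data from a real monitor:

```python
# A minimal sketch of reading the manufacturer ID from an EDID block,
# the capability data a monitor returns over DDC. Bytes 8-9 pack three
# 5-bit letters ('A' = 1) into a big-endian 16-bit word.
def edid_manufacturer(edid: bytes) -> str:
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(((word >> shift) & 0x1F) + ord("A") - 1)
        for shift in (10, 5, 0)
    )

# First 10 bytes of a hypothetical EDID: the fixed 8-byte header,
# then the word 0x10AC, which decodes to the three-letter ID "DEL".
edid = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x10, 0xAC])
print(edid_manufacturer(edid))  # -> DEL
```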
After the fast iterations from CGA in 1981 to EGA in 1984 to VGA in 1987, the industry (and monitor-buying consumers!) was ready for a break: the VGA interface wasn't perfect, but it was a huge step forward compared to its predecessors.
It took 12 years for an alternative to enter the scene!