Computer Knowledge
Parts of Computer
Monitor
A monitor, or display, is an electronic visual display device for computers. A monitor comprises the display device, circuitry and an enclosure. The display device in modern monitors is typically a thin-film-transistor liquid-crystal display (TFT-LCD) thin panel, while older monitors used a cathode ray tube (CRT) about as deep as the screen size.
Types of Monitor
Cathode ray tube
The first computer monitors used cathode ray tubes (CRTs). Prior to the advent of home computers in the late 1970s, CRTs, also known as video display terminals (VDTs), were commonly physically integrated with other parts of the system (e.g. the keyboard and the computer itself). These monitors were monochrome, and the image quality was generally poor compared to current displays. The capabilities of CRT monitors and the computer systems driving them improved progressively over the years, in step with advances across the computer industry. Color displays became more common in the late 1970s, with the industry generally led by Apple and Atari. Lagging several years behind Apple's 1977 introduction of color display capability in the Apple II, IBM introduced the Color Graphics Adapter (CGA) in 1981, which could display four colors at a resolution of 320 by 200 pixels, or two colors at 640 by 200 pixels. In 1984 IBM introduced the Enhanced Graphics Adapter (EGA), which was capable of producing 16 colors at a resolution of 640 by 350.[1] CRT technology remained dominant in the PC monitor market into the new millennium, partly because it was cheaper to produce and offered viewing angles close to 180 degrees.
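The two CGA graphics modes illustrate a fixed memory budget traded between resolution and color depth: four colors need 2 bits per pixel, two colors need 1 bit, so both modes fit in the adapter's 16 KB of video memory. A minimal sketch of that arithmetic in Python (the framebuffer_bytes helper is hypothetical, for illustration only, not part of any historical API):

```python
import math

def framebuffer_bytes(width: int, height: int, colors: int) -> int:
    """Bytes of video memory needed for one frame at the given color count."""
    bits_per_pixel = math.ceil(math.log2(colors))  # 4 colors -> 2 bpp, 2 colors -> 1 bpp
    return width * height * bits_per_pixel // 8

# Both CGA graphics modes consume the same 16,000 bytes of video memory:
print(framebuffer_bytes(320, 200, 4))  # 16000 (2 bits per pixel)
print(framebuffer_bytes(640, 200, 2))  # 16000 (1 bit per pixel)
```

Doubling the horizontal resolution while halving the bits per pixel keeps the frame size constant, which is why the adapter could offer both modes on the same hardware.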
Liquid crystal display
Multiple technologies have been used to implement liquid crystal displays (LCDs). Throughout the 1990s, the primary use of LCD technology in computer monitors was in laptops, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified their higher price relative to CRTs. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active matrix color (TFT). As volume and manufacturing capability improved, the monochrome and passive color technologies were dropped from most product lines. The first standalone LCD monitors appeared in the mid-1990s, selling at high prices. As prices declined over a period of years they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors were the Eizo L66 in the mid-1990s, the Apple Studio Display in 1998, and the Apple Cinema Display in 1999. In 2003, TFT-LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors.[2]

The main advantages of LCDs over CRT displays are that LCDs consume less power, take up much less space, and are considerably lighter. The now common active matrix TFT-LCD technology also flickers less than CRTs, which reduces eye strain. On the other hand, CRT monitors have superior contrast and response time, can use multiple screen resolutions natively, and show no discernible flicker if the refresh rate is set to a sufficiently high value. LCD monitors now have very high temporal accuracy and can be used for vision research.[5]