It’s 2022, and buying a TV or monitor isn’t that easy… There are tons of acronyms and options, making it difficult to understand what you’re buying. Read on for some solid buying advice.
Updated 11/3/2022: Fixed a couple of typos and made some minor wording changes.
Fruit Loop, Letter Soup
The first challenge to buying a new TV or monitor is the incredible number of acronyms you need to learn before you buy one. And every time you buy another one, somehow there are even more acronyms.
K and P – Resolution
“K” means “thousand”, just like in the metric system.
In terms of television or monitor resolution, K is a generic term that means, “about 1,000 pixels across”.
There are two sets of standards, which makes things even more confusing.
- In the consumer space, monitors and televisions follow the ITU standards
- In the cinema / motion picture space, equipment follows the DCI standards
“P” refers to “Progressive” scanning, which is a fancy way of saying non-interlaced.
“Interlaced” means that odd and even scan lines are drawn in alternate frames. So if the first frame has only the odd scan lines drawn, the next frame will have only the even scan lines drawn. This interlaced format goes back to the early days of television, and there were only two interlaced formats used in North America:
- 480i, which is NTSC “Standard Definition” – roughly 640 x 480, interlaced.
- 1080i, which was an early attempt to make 1080p (Full HD, or 1920 x 1080) compatible with the older, slower devices of that time that couldn’t handle the 60 frames per second frame rate. This was also a way to sell lower-end TVs at a higher price point by calling them “True HD” or something similar, with “1080i” somewhere next to it in a much smaller font.
Interlaced formats have long since faded into obscurity, and all modern TVs and monitors are progressive, but we still use the “P” number to refer to the vertical resolution. The interlaced-vs-progressive distinction, of course, only ever applied to TVs – computer monitors have never been interlaced.
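To make the odd/even alternation concrete, here’s a toy sketch (Python; lines are 0-indexed here, so the “odd” field is indices 1, 3, 5…):

```python
def progressive(lines: int):
    """Progressive scan: every line, top to bottom, every frame."""
    return list(range(lines))

def interlaced(lines: int):
    """Interlaced scan: odd lines in one field, even lines in the next."""
    odd_field = list(range(1, lines, 2))
    even_field = list(range(0, lines, 2))
    return odd_field, even_field

print(progressive(6))  # [0, 1, 2, 3, 4, 5]
print(interlaced(6))   # ([1, 3, 5], [0, 2, 4])
```

Each interlaced field carries only half the lines, which is why 1080i needed only half the bandwidth of 1080p at the same field rate.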
In summary, “K” means the horizontal resolution, “P” means the vertical resolution, and both terms are generic.
| Generic Term | ITU Standard | DCI Standard | No Standard | “P” Number |
|---|---|---|---|---|
| 2K | Full HD: 1920 x 1080 | 2K: 2048 x 1080 | QHD: 2560 x 1440 | 1080p, 1440p |
| 4K | 4K UHD, UHD: 3840 x 2160 | 4K: 4096 x 2160 | | 2160p |
| 5K | | | 5120 x 2160, 5120 x 2880 | 2160p, 2880p |
| 8K | 8K UHD, UHD2: 7680 x 4320 | 8K: 8192 x 4320 | | 4320p |
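Since these are generic terms, the naming rule fits in a couple of lines of Python. This is a toy sketch of the convention, not any official standard:

```python
def k_label(width: int) -> str:
    """Generic 'K' name: horizontal pixels to the nearest thousand."""
    return f"{round(width / 1000)}K"

def p_label(height: int) -> str:
    """Generic 'P' name: the vertical resolution, progressive scan."""
    return f"{height}p"

# The marketing names line up with the math:
print(k_label(1920), p_label(1080))  # Full HD: 2K / 1080p
print(k_label(3840), p_label(2160))  # 4K UHD:  4K / 2160p
print(k_label(4096), p_label(2160))  # DCI 4K is also "4K"
print(k_label(7680), p_label(4320))  # 8K UHD:  8K / 4320p
```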
W and UW – Widescreen
“W” means “Widescreen”
“U” and “UW” mean “Ultra Widescreen”
Before 16:9 was the mainstream standard, most TVs and monitors had an aspect ratio of 4:3, meaning, if the vertical resolution is y, the horizontal resolution is 4/3 • y. Since standard definition (4:3) is more square, any 16:9 format was seen as “widescreen”. During the same era, anything in the much wider 21:9 format was therefore called “ultra” widescreen.
As 16:9 became “normal”, calling it “widescreen” no longer made much sense, so companies began dropping the designation sometime in the 2010’s. However, 21:9 is still around: what WAS called “ultra widescreen” has simply become “widescreen”.
| Before mid-2010’s | “Normal” | Widescreen | Ultra Widescreen |
|---|---|---|---|
| Later 4:3 formats | UXGA: 1600 x 1200, QXGA: 2048 x 1536 | | |
| 1080p formats | | Full HD: 1920 x 1080 | UWHD: 2560 x 1080 |
| 1440p formats | | QHD: 2560 x 1440 | UQHD: 3440 x 1440 |
| 2160p formats | | UHD: 3840 x 2160 | 5K Wide: 5120 x 2160 |
| 5K | | 5K: 5120 x 2880 | |
| 4320p | | 8K UHD: 7680 x 4320 | |
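The aspect-ratio arithmetic above is easy to check with a quick Python sketch. One wrinkle worth noting: “21:9” is really marketing shorthand, and UWHD panels are actually 64:27.

```python
from fractions import Fraction

def width_for(height: int, aspect: Fraction) -> int:
    """Horizontal resolution implied by a vertical resolution and an aspect ratio."""
    return int(height * aspect)

print(width_for(1200, Fraction(4, 3)))    # UXGA: 1600 x 1200
print(width_for(1080, Fraction(16, 9)))   # Full HD: 1920 x 1080
print(width_for(1440, Fraction(16, 9)))   # QHD: 2560 x 1440
# "21:9" marketing shorthand; UWHD is exactly 64:27.
print(width_for(1080, Fraction(64, 27)))  # UWHD: 2560 x 1080
```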
D and Q – Double and Quad
“D” means “Double” the WIDTH.
“Q” means “Quad” or quadruple the RESOLUTION.
If “D” or “DW” appears in the letter soup next to a monitor’s specification, generally it means “Double Wide”. So if a normal UHD (4K) monitor is 3840 x 2160, a “DWUHD” or “D4K” would be twice the physical width, and twice the horizontal resolution at 7680 x 2160.
“Q” doesn’t point to any single standard. If Q appears in the letter soup, it just means that the resolution of this particular monitor is 4 times the resolution of some other standard. For example, QHD (2560 x 1440) has exactly four times the pixels of 720p HD (1280 x 720) – hence “Quad HD”.
What’s confusing is that width and resolution are independent. “Double” affects the size of the monitor, and “Quad” affects the picture quality, but neither of these have standardized meanings.
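The distinction is just arithmetic – “double” scales one axis, “quad” scales the pixel count. A quick sketch (“DWUHD” is the hypothetical name used above, not a real standard):

```python
# "Double" doubles the width only; "Quad" quadruples the total pixel count.
uhd = (3840, 2160)

# Hypothetical "double-wide UHD": twice the horizontal resolution, same vertical.
dwuhd = (uhd[0] * 2, uhd[1])
print(dwuhd)  # (7680, 2160)

# QHD really is "Quad HD": four times the pixels of 720p HD.
hd = (1280, 720)
qhd = (2560, 1440)
print(qhd[0] * qhd[1] == 4 * hd[0] * hd[1])  # True
```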
HD and UHD – High Definition
Originally, TV resolution was standardized at roughly 640 x 480 (interlaced), also called 480i.
At that time, “HD” referred to anything with greater resolution than 480i. There were many semi-standard and non-standard video formats, such as 720 x 480 progressive, that were considered “HD” but didn’t really conform to any industry-wide standard.
Eventually “HD” was standardized as 1280 x 720, also known as 720p, and “Full HD” was standardized as 1920 x 1080.
Nowadays, “HD” generally means 1080p, but you have to be careful. I got a good “Prime Day” deal a couple of years ago on an “HD” TV, only to find out that it was only 720p. I was going to return it, but I decided that since it was so inexpensive, I could keep it and use it to watch movies while I work in my office. And yes, it’s still in the box, sitting next to my desk, as of this writing. :-/
So ALWAYS be sure to check the specs. Like I didn’t.
“UHD” is fairly standardized to mean 3840 x 2160, but ALWAYS check the specs.
QHD is a half-step between Full HD and UHD, at 2560 x 1440.
VGA, DVI, and HDMI – Input Ports
VGA is the older computer monitor standard used throughout the 90’s and early 2000’s.
VGA is an analog-only signal standard, and was eventually replaced with DVI, whose standard can include both analog and digital signals.
By the late 2000’s, DVI was replaced with HDMI, which supports digital video as well as audio on a single cable.
Generally speaking, you want more HDMI ports. You need one for each device you want to connect to your TV or monitor, so having just 1 or 2 can end up being quite limiting. For example, my TV has 4, and 3 of them are currently used.
A VGA or DVI port is a nice option to have, but not really necessary, because you can get conversion cables to convert between any of these connectors. For example, you can get a cable that’s VGA on one end and HDMI on the other for about $10.
Pro Tip: If you have a device that you know isn’t really going to benefit from HDMI, using your TV’s VGA or DVI port is a clever way to keep from using up one of your precious HDMI ports. For example, if you have an older computer or game system, hook it up to the DVI port.
Pro Tip: Some monitors have only one HDMI port, but have a VGA or DVI option for older equipment. If you have two computers, you can use both ports, and the monitor should have a button on the front to switch between them. This is a great way to use a monitor as a poor man’s “console switch” for equipment that you use infrequently, but have the occasional need for a monitor.
**HDMI** – Digital only. Carries audio: Yes.

HDMI provides the best option for connecting audio and video, even at the highest resolutions.

Most devices, televisions, and monitors have female HDMI ports, and most cables are male-to-male.

HDMI ports are only slightly wider than USB ports, and the two can often be mistaken for one another.

**DVI** – Digital and analog. Carries audio: No.

DVI was designed to carry the digital signals of newer devices, and also bridge the gap for older analog devices. For example, DVI can natively carry analog VGA signals.

Devices, monitors, and TVs usually have the female connector, and cables are usually male-to-male.

There are three types of DVI cables: DVI-A (analog only), DVI-D (digital only), and DVI-I (integrated – carries both).

Some older devices enforce the use of analog or digital by having missing pins or pinholes. For example, DVI-A uses many fewer pins than DVI-D or DVI-I, and if a device’s DVI port is missing the pinholes for DVI-D, then only a DVI-A cable can be used.

Pro Tip: As with VGA cables (see below), DO NOT over-tighten the thumb screws.

**VGA** – Analog only. Carries audio: No.

Since VGA is analog-only, it can support a max resolution of about Full HD (1920 x 1080), and image quality at the top of that range requires a much higher signal frequency, necessitating a higher-quality, better-shielded, shorter cable.

Most devices that have a VGA port have the female side, but this isn’t necessarily a requirement. Most televisions and monitors with a VGA port also have the female side, but some monitors can have a male connector. Most cables are male-to-male, but male monitors usually require a male-to-female cable. Having the male pins on the cable rather than on the monitor or device eventually became a standard convention, because a bent pin on a cable is much easier to fix than a bent pin on a device – you simply replace the cable.

The VGA connector is a 15-pin D-sub (“D”-shaped, sub-miniature), also known as “HD15”, which is the same size and shape as a DB9 but has three rows of 5 pins, rather than two rows in a 5/4 configuration. Like all D-sub connectors, it uses two thumb screws to hold it in place when connected.

Pro Tip: Don’t over-tighten the thumb screws. You just need to secure the connection from wiggling loose – it’s not load-bearing! Turn the screw until it just starts to tighten, and then STOP. That’s plenty of force. If you over-tighten D-sub connectors and then try to unscrew them later, you can end up destroying the connector or damaging the equipment trying to get the thumb screw loose.
LCD and OLED – Display Technology
Here is a brief history of display technologies.
- Cathode Ray Tube (CRT) was the mainstream technology used from the dawn of television in the 1940’s through the 2000’s. The successor, LCD, first outsold CRT in 2008, and they were completely obsolete by 2010 when the last consumer CRTs were manufactured.
- Early CRTs were black and white – an electron beam sweeps the screen, from top to bottom, one line at a time, from left to right. As it sweeps, the beam’s intensity is controlled by the signal it’s receiving from the tuner or other device. The beam sweeps across a phosphor coating, energizing each point with a brightness that corresponds to the signal’s intensity. The phosphor then glows more or less intensely for a few milliseconds, and by varying the intensity along each of the roughly 480 visible scan lines, we have an image. By repeating the process 30 times per second, we have a moving image. As mentioned, CRT television is interlaced, which means that only about 240 lines are energized on each sweep, alternating odd and even scan lines, but this happens so fast that you can’t really see it.
- Color CRTs work the same way, but there are three electron beams, corresponding to the three primary light colors, red, green, and blue. Instead of one phosphor dot at each location, there are three – one each of red, green, and blue. A grid mask controls which beam hits which dots, and by varying the intensity of each beam, we can now draw color images.
- There were a number of technologies related to CRTs during this time, as well.
- The cathode-ray oscilloscope was invented in the very late 1800’s. This produces a visual representation of an electronic signal by steering an electron beam around a screen coated in phosphor. Wherever the beam hits, the phosphor is energized, leaving a green trace.
- Early radar CRTs were based on the oscilloscope, and similar to a television CRT, but swept in circles instead of line by line.
- Early graphics displays used vectors instead of scan lines, allowing the electron beam to draw lines at any location on the screen. An image was produced by drawing a set of lines, and even text could be produced using a “line-draw” font. If you’ve ever played an old “Star Castle” or “Asteroids” arcade game, those used vector displays instead of raster displays. Because of the crisp graphics and high resolution, these were often used for early CAD solutions.
- In the late 1960’s, Ralph Baer invented the first home video game – a ping-pong-style game played on an ordinary television. His prototype was licensed as the Magnavox Odyssey, which in turn inspired Atari’s Pong – and Pong basically started the arcade video game boom in the 1970’s.
- Plasma displays offer a high contrast ratio and a long useful life, but they are expensive and power-hungry. Initially, monochrome plasma displays were used in portable computer equipment, but by the late 1990’s, reasonably-priced plasma TVs were commercially available, and they continued to be produced through the early 2010’s, at which point plasma was no longer commercially competitive with LCD and OLED. In a plasma display, each pixel is a tiny neon light that gets energized by high voltage. In color plasma displays, each pixel has three sub-cells, one each for red, green, and blue, with the corresponding colored phosphor coating in front of the cell. When the neon is energized, it energizes the phosphor coating. Varying the voltage of each sub-cell varies the intensity of each color, which is how an image is drawn.
- Originally used in calculators and watches because of its low power consumption, Liquid Crystal Display (LCD) technology has been available in commercial devices since the 1970’s. The first LCD television displays were commercially available in the early 1980’s, in specialized products such as Sony’s wrist watch TV. By the late 1980’s, the first portable computers with LCD displays started to appear. By the late 1990’s, the first mainstream LCD televisions and computer monitors started to appear. LCD outsold CRT for the first time in 2008, and continues to be one of the market’s dominant technologies.
- At a high level, applying an external voltage to the liquid crystal causes it to change its orientation, blocking light from passing through. LCDs have a light that shines from behind the display, through a polarizing filter, then through the liquid crystal matrix. The resulting light is either aligned with the front polarizing filter, where it is allowed to pass, or its orientation is changed by the crystals, where it is blocked by the front polarizing filter. Monochrome LCDs have a matrix of “dots” that are either “on” or “off”. If a dot is “on”, it blocks the light at that point, and the dots are used to draw an image.
- Grayscale LCD works the same, but the crystals are stacked in layers, and have differing configurations based on the voltage level of the input signal. Thus, a pixel at a lower voltage is lighter, and a higher voltage is darker.
- Color LCDs work like grayscale LCDs, but each pixel has three sub-pixels – one for each of the three primary colors: red, green, and blue. Generally, all the pixels use a single backlight, but each sub-pixel has an appropriate (red, green, or blue) filter in addition to the front polarizing filter, which only allows a specific light color to pass.
- Modern LCDs are “Active Matrix”, which means that each pixel is individually-addressable via a network of embedded transistors – one for each pixel. This improves the response time and provides a sharper image.
- Over the years, the technology of the backlight has changed. Early on, the LCD was back-lit by fluorescent bulbs, because they are more efficient than incandescent, and cover the full color spectrum. These were prone to having a relatively short lifespan, and were still quite power-hungry. By the 2010’s the fluorescent backlight was replaced by LED, which is far superior in both regards.
- Light Emitting Diode (LED) displays are really LED-LCD, because they use LCD technology to draw the pixels. The first LED-lit LCDs were edge-lit, like conventional LCDs. The first back-lit LED displays were criticized for not having a true black. The displays that followed use a matrix of white LEDs as the backlight, and the image processor constantly tunes the intensity of each LED based on the image luminescence of that particular region, known as “local dimming”. All modern LED displays use some form of local dimming.
- Organic Light Emitting Diode (OLED) displays first hit the market in the mid-2000’s, but the cost was very high, and the expected lifespan was relatively low. However, in the early 2010’s the smart phone market drove OLED innovation to the point where the cost of OLED displays is competitive, and the lifespans are reasonable. Along with LCD, OLED is one of the two current, mainstream display technologies.
- OLED operates by sending voltage through cells containing an electroluminescent organic compound. Cells are arranged in a pixel grid on a substrate (backing material) that’s usually glass or plastic. Each pixel has three cells which use red, green, and blue organic dyes to produce the corresponding color when energized.
- OLED offers the best picture quality and the lowest power consumption, but the picture quality degrades as the organic compounds break down over time.
- Because the substrate material can be plastic, this allows for curved, flexible, and even foldable displays.
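As an aside, the “local dimming” idea from the LED bullet above can be sketched as a toy model. Real displays dim hundreds of zones with per-frame smoothing; this sketch just picks, for each backlight zone, the brightest pixel that zone has to show:

```python
# Toy sketch of "local dimming": one backlight LED per image region,
# dimmed down to the brightest pixel that region needs to display.
def backlight_levels(image, regions):
    """image: 2D list of luminance values 0.0-1.0;
    regions: list of (row_slice, col_slice) backlight zones."""
    levels = []
    for rows, cols in regions:
        zone = [v for row in image[rows] for v in row[cols]]
        levels.append(max(zone))  # zone LED only as bright as needed
    return levels

frame = [
    [0.0, 0.0, 0.9, 1.0],   # dark area on the left, bright on the right
    [0.1, 0.0, 0.8, 0.9],
]
left = (slice(0, 2), slice(0, 2))
right = (slice(0, 2), slice(2, 4))
print(backlight_levels(frame, [left, right]))  # [0.1, 1.0]
```

The left zone’s LED runs at 10% while the right runs at full power, which is how these displays achieve near-true blacks from an always-on backlight.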
Why did we need to cover all of this? To demonstrate that, despite what display manufacturers want you to believe, there have been very few fundamental advancements in display technology.
The real story is that every couple of years, all of the larger manufacturers come up with either a new manufacturing process, or they make the dots smaller, or they embed more LEDs or whatever into their latest display, making the picture look incrementally better than the previous display. Then, they patent this “revolutionary” new design (which isn’t revolutionary) and slap a brand name on it.
They do this for a few reasons:
- If a competitor has an otherwise identical product without XYZ technology, it makes comparing the two products difficult.
- “NOW, with XYZ technology!” pushes consumers to the latest models, which retail at a higher price point
- They can license XYZ technology to second-tier manufacturers
So brand names like “QDOT” or “Micro Quantum” or “Quantum Quantum” don’t mean anything. They just mean you get a slightly better picture than the same manufacturer’s display without the brand name.
For example, Samsung’s “Quantum Dot” QLED technology sounds impressive! But, it’s just another variation of LED-driven LCD, made possible by a new manufacturing process. But “quantum” sounds high-tech, right? And “QLED” – “Q” comes after “O” in the alphabet, so “QLED” must be some completely new display technology, right? No to both.
…And, Everything Else
Words like “True Motion” and “Super Motion” and “Blast Processing” are “WOW” words that are completely made up by the marketing department.
Each manufacturer has slightly-proprietary image processing software or hardware (or both), designed to make the picture slightly sharper, or motion slightly smoother.
Each new iteration goes through the marketing department, and becomes “Blast Processing”, and gets slapped on the box.
Monitor vs Television
Let’s define the two, and compare them.
- A Monitor is used primarily for video.
- Most monitors don’t support audio, and do very little image processing.
- Monitors are usually used with computers that have a separate audio output, or where audio isn’t required, such as the case with camera systems, security displays, and kiosks.
- Without an external source of input, monitors can’t really display anything.
- Monitors are designed with high refresh rates and low latency, which is conducive to reading text and playing games.
- A Television (TV) supports both audio and video, and has a wide variety of inputs and sources.
- TVs have a tuner, which is capable of receiving Over The Air (OTA) digital broadcasts, as well as integrating with ATSC digital cable or satellite systems without the need for a separate box.
- TVs usually have special hardware and software geared toward image and motion-image enhancement, providing optimized audio and video for specific scenarios, such as movies or sports, but at the cost of latency.
- TVs aren’t usually good for gaming, unless they support a low-latency mode (usually called a “gaming” mode) and a high refresh rate.
- “Smart” TVs have a built-in computer – think of it as a big smart phone – that allows the user to load a variety of applications and games directly on the TV itself. Users can watch Netflix or play movies without the need for a computer or other source.
- TVs usually have low-resolution inputs, such as composite or component, allowing backward-compatibility for older devices such as VCRs, DVD players, and game systems. Most modern TVs perform “upconverting”, which means that the TV performs image processing to scale up and enhance the resolution from an older device, making the images appear sharper and smoother using the TV’s native resolution.
- In addition to having inputs, most modern TVs, especially higher-end TVs, support outputs as well, in order to integrate with external sound systems, or to relay video to an external (secondary) monitor.
- Televisions are usually twice the size (four times the viewing area) of monitors, where the largest common monitor size is just under the smallest common TV size. As of this writing, the largest, common monitor size is 32″ (inches), while common TV sizes currently range from 43″ to 65″.
In summary, a monitor is used for viewing video. A TV is like a big monitor, with audio, a tuner, image processing, extra inputs, and outputs. Monitors usually have higher refresh rates and better contrast ratios that are geared toward reading text and gaming.
From a cost perspective, smart shopping can yield a cost:size ratio of $9 per diagonal inch for both TVs and monitors. At first glance, the TV has more features and seems like a better deal, but the monitor is optimized for reading and gaming, while the TV is optimized for smooth video. Cost ratios can go up to nearly double, depending on size and features.
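The cost:size ratio is just the price divided by the diagonal size. The prices below are illustrative numbers chosen to land on the $9/inch sweet spot, not real listings:

```python
def cost_per_inch(price: float, diagonal_inches: float) -> float:
    """Simple cost:size ratio used in the comparison above."""
    return price / diagonal_inches

# Hypothetical examples: a 43" TV at $387 and a 27" monitor at $243
# both hit the $9-per-diagonal-inch sweet spot.
print(round(cost_per_inch(387, 43)))  # 9
print(round(cost_per_inch(243, 27)))  # 9
```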
Here is a side-by-side comparison:
| | Monitor | Television |
|---|---|---|
| Size (Currently) | Usually < 40″ | Usually > 40″ |
| WiFi | No. | Smart TVs only |
| Resolution | HD, QHD, WQHD, UHD (4K), WUHD, 5K. A few 8K UHD monitors are available. (“W” here means Ultra-Wide, or 21:9 aspect ratio.) | UHD (4K), 8K UHD. A few WUHD and “Wide” 8K UHD TVs are available. |
| Aspect Ratio | 16:9, 17:9, 21:9, other “wide” formats | 16:9 and (rarely) 21:9 |
| Refresh Rate | 75Hz is typical. Higher-end monitors have a faster refresh of 120Hz or 144Hz, and a few go all the way up to 240Hz. | 60Hz is typical. Higher-end TVs go up to 120Hz, but anything beyond 120Hz is rare. In North America, TV refresh rates are usually multiples of 60 because of the 60Hz power frequency. |
| Inputs | Usually 2 or 3 – a combination of HDMI plus either DVI or VGA (D-sub) | Usually 4 or more, including HDMI, DVI, and either component or composite (or both) |
| Tuner | None. | ATSC digital tuner for OTA digital broadcasts; integrates with some cable / satellite providers. |
| USB | No, although some monitors offer USB pass-through, acting as a USB hub when connected to a host such as a computer or laptop. | Smart TVs only |
| Apps / Games | No. | Smart TVs only |
| Cost | $9 per diagonal inch, up to $15 per inch, depending on size, features, and resolution. | $9 per diagonal inch, up to $18 per diagonal inch, depending on size, features, and resolution. |
| Image Processing / Enhancement | Very little, if any. Monitors are optimized for low latency and high contrast. Some high-end monitors perform image optimization aimed at gaming. | Standard options include “upconversion” – increasing the resolution and smoothing images from low-resolution sources, such as VCRs, DVD players, and older video game consoles. Higher-end TVs also perform image and motion processing to make movies look smoother, or to increase the contrast ratio for sports and gaming. |
| Outputs | Usually none. Some monitors provide a pass-through, for example to connect computer speakers. | Most TVs can connect to an external sound device, such as a sound bar or receiver, with connectivity options including fiber, coax, or Bluetooth. Some TVs provide auxiliary video outputs in order to connect a secondary monitor or to daisy-chain a second TV. |
Despite the differences, there are situations where the use-cases overlap.
- In a bedroom or dorm room, a large monitor can do double-duty when combined with an external audio system and a tablet or a laptop. Likewise, a small smart TV can act as a monitor for a laptop or tablet, but also allows you to watch Netflix, even while you play a game on your laptop.
- Computers and laptops can usually connect to a TV as a second monitor. Most TVs have a low-latency mode (with most of the image processing disabled), allowing you to play games using the computer as if it were a game console, or allowing you to play videos from your computer.
- Televisions are a good option for signs and other public displays, because monitors are typically too small.
- Road Warrior pro-tip: Most hotels have an HD TV in every room. If you bring your laptop and an HDMI cable, you can connect the TV as a second monitor. This is a great resource if you’re working, or you can watch movies for free if not.
In summary, know what you’re buying.
- Monitors are smaller, and they just show you stuff, but they are optimized for gaming and reading text.
- TVs are larger, have audio, and are more general-purpose.
TV: To Smart, or NOT too Smart?
Smart TVs are like big smart phones. You can install apps and games, browse the web, or even play movies or music from a memory stick, using the TV’s built-in operating system.
Although most mainstream TVs on the market today are smart, there are still conventional “not smart” TVs available as well.
This used to be a very expensive option, but comparing current offerings from a cost perspective, “smart” functionality now adds only about $50 to $100 to the price.
There are several common “smart platforms” that vendors can choose to integrate, or they might have a proprietary one.
**Google Smart TV**

Android is one of the most popular smartphone / tablet platforms, with a wide variety of features and functionality. Manufacturers can license the platform from Google, or clone it for free.

Pros:
- Licensed platforms have access to the Google “Play” store, with millions of applications and a wide variety of content.
- Licensed platforms also typically include access to “OK Google”, Google’s intelligent assistant, and integration with a suite of Google and third-party smart home devices.
- Long release and support cycles mean a much longer supported life for the device, and usually guarantee that applications will remain supported for a much longer time – perhaps 5 years or longer.

Cons:
- Unlicensed platforms may be restricted to a secondary or proprietary app store, with a very limited ecosystem. For example, Google apps and platforms such as Youtube might be excluded.
- Beware of Google, which uses “voice recognition, machine learning, and Knowledge Graph to determine what apps and content to recommend”. Yikes.
**Amazon Fire OS**

Amazon’s Fire OS is a modified version of Google’s Android, and runs on Amazon’s suite of devices, including tablets and televisions.

Pros:
- Amazon’s is the second-largest Android ecosystem, next to Google’s.
- Amazon devices have access to “Alexa”, Amazon’s virtual assistant, and a suite of Amazon and third-party smart home devices.
- Like Google, Amazon has a much longer support cycle, which means that devices have a much longer useful lifespan. I have one of the original Fire Sticks, and I can still (as of this writing) watch Netflix on it, despite the fact that the device is 8 years old.

Cons:
- Users are locked in to the Amazon ecosystem. Although there is support for a wide variety of 3rd-party IoT devices, there is little to no interoperability with the Google ecosystem.
- No access to the Chrome browser, nor most Google applications. A few Google applications such as Youtube are published on the Amazon marketplace (their version of “Play”).
- Amazon uses advertising to subsidize hardware costs, so if you buy a Fire tablet or Fire TV, you will be subjected to mandatory advertising.
**Roku**

Roku started off life as a “Netflix viewer” so that you could watch Netflix on a TV, back when Netflix first started streaming. I think the first Roku devices were based on Arch Linux, but I’m not 100% sure.

Pros:
- Like other platforms, there is a “store”, and you can install apps called “channels”.
- Roku claims to be the number one streaming OS – but then again, it’s probably the ONLY OS dedicated to streaming, so I’m not sure if this is a Pro or a Con.

Cons:
- The Roku marketplace is small compared to Google’s or Amazon’s.
- The Roku device itself doesn’t run apps natively. Instead, a “channel” is simply a hosted library of curated content, and the channel simply tells the Roku how to access it. In essence, everything you view on the Roku is streamed from somewhere else.
- Because Roku doesn’t really run “apps”, the marketplace is limited to content.
**(LG) WebOS**

WebOS was originally an open platform, but was bought by LG. However, there is a forked version of WebOS called Open WebOS.

Pros:
- WebOS is built in to all LG smart TVs.
- WebOS has a fairly decent update and support cycle.
- Like Android and Fire OS, WebOS can run real apps, including native, compiled apps.

Cons:
- WebOS has a much smaller marketplace than Android or Fire OS.
- WebOS is technically proprietary. It is theoretically cross-vendor because of Open WebOS, but LG disables sideloading at the factory, which means that although developers can write apps for Open WebOS, they would still need to publish them on the LG store to make them available to LG smart devices.
**Proprietary**

There are also a number of proprietary platforms. Although these were common in the early days of Smart TVs, they are less common today. For example, Toshiba, Sony, Samsung, and LG each had their own proprietary platform early on, but most have since moved to a mainstream platform such as Android, Fire OS, or WebOS (LG).

Pros:
- Not really. You can always replace the “smart” functionality by plugging in a Fire Stick or a Chromecast…

Cons:
- Proprietary platforms tend to have short development cycles, which means frequent updates, and extremely short support cycles, which means that apps can quickly become unsupported. The device’s useful life is therefore very short.
- 3rd-party developers are much less likely to spend the cost and effort to re-factor or port their applications to a proprietary platform, and the more proprietary platforms exist, the more effort it takes to port an app and keep it updated across multiple platforms and marketplaces. By comparison, it’s relatively easy to develop a single app for Android, release it on Google Play, and release the same application, with little to no modification, on the Amazon marketplace. The result is that very few developers are attracted to proprietary platforms, which keeps their marketplaces and ecosystems small.
There are two major down sides to buying a Smart TV, and basically they both converge to a single factor: storage.
When each device was designed, it was designed with a specific amount of internal storage (flash memory). When you update the platform or download any applications, these are saved to that internal storage.
Unfortunately, it’s inevitable that over time, the platform itself receives updates that either fix things or add functionality, and the newer versions are always larger, which means that each update uses more storage.
The same is true for each application you download – every time the application is updated, new features are added, but the resulting application is larger and requires more storage.
Downside #1: Eventually, the platform will no longer be updated.
- New security vulnerabilities are continuously being discovered, but if your smart TV isn’t being updated, you won’t get the fixes. So, if there is a platform-wide vulnerability that could potentially open up your network to hackers, like the Log4j vulnerability, there’s no way for you to fix it.
- Bugs won’t be fixed, but more importantly, the platform itself will never be updated for newer standards, such as newer encryption and compression algorithms. As the rest of the internet moves away from these older standards, the platform becomes less functional.
- Eventually, 3rd-party developers get tired of having to maintain a much older codebase that’s compatible with the older platform, which brings us to Downside #2…
Downside #2: Eventually, 3rd-party developers will no longer update their apps.
- As the developers want to add new features, these are limited by the smaller footprint and lack of functionality of older devices.
- Eventually, evolving technologies and standards, especially for large platforms such as Netflix, means that older devices that are no longer being updated, are no longer compatible, and are therefore not supportable.
- Newer applications require more CPU and memory, and can’t be supported on older devices that simply don’t have the horsepower.
Anecdote: I had a Vizio smart TV, purchased in 2011. Back then, every platform and marketplace was proprietary, as was the case with Vizio. Despite this, you could get Netflix, Prime, and YouTube apps, and that worked out quite nicely, even though the built-in browser was crap. By 2016, the Netflix and Prime apps no longer worked. If you ran them, you simply got a message stating, “this platform is no longer supported”. Eventually, the YouTube app just disappeared – probably removed from the Vizio marketplace because YouTube didn’t want to keep supporting a marketplace with a declining market share.
Annoyance: Remote Control as an Input Device
More of an annoyance than a downside, using the remote control to input text really sucks, especially when you’re trying to log in to your Netflix or YouTube account.
Most TVs support Bluetooth or USB “HID” (Human Interface Device), meaning you can connect a Bluetooth or USB keyboard and mouse. One particularly useful device is the Rii I8s, which is a small-form-factor USB chiclet keyboard with a built-in trackpad (mouse). I have several of these (a mix of the I8s and the older X8), they have all sorts of uses, and they even make half-decent game controllers for tablets and smartphones.
Although Rii also makes Bluetooth devices, this particular one uses a wireless USB dongle that plugs into the TV’s USB port.
This is the best option I’ve found, but any wireless or Bluetooth device designed for use with a tablet or laptop should work just fine.
Completely unrelated anecdote:
I always travel with a Rii, and when you have one in your carry bag, you tend to start to notice that all sorts of devices have USB ports. I also always travel with a “USB kit” that has converters, cables, and connectors for everything, so in theory, you could connect it to anything with a USB port.
You would be amazed at the number of kiosks and other devices that are assumed to be secure, because whoever designed them assumed that no one carries around a keyboard.
ALWAYS use your powers for GOOD, boys and girls, and never for evil.
Meanwhile, to the people who design smart devices found in public places: either disable the USB ports, or make sure they are locked down to specific device IDs.
I’ve seen kiosks, vending machines, cash registers, smart TVs used as signs, and even elevators with USB ports that are probably not secured properly…
Smart Sticks / Streaming Sticks
A “smart stick” or “streaming stick” is a device about the size of a USB thumb drive that plugs directly into the TV’s HDMI port and provides smart TV functionality.
This is a great way to “re-smart-ify” an older smart TV, or a great add-on to an older non-smart TV, to extend the life of these older devices.
These usually come with a separate remote, but any Android-based, Bluetooth-capable device should let you pair with a Bluetooth keyboard and mouse. I know this works with the Amazon Fire Stick, but I haven’t tried others. Unfortunately, streaming sticks can’t use a wireless USB input device such as the Rii I8S, because there is no USB port on the device itself, but Rii makes small-form-factor Bluetooth keyboards as well.
Most of these devices are in the $50 range.
| Smart / Streaming Stick | Description | Pros | Cons |
| --- | --- | --- | --- |
| Fire Stick | Amazon was first on the scene with a true smart stick. The Fire Stick runs Fire OS, and is a great option. I have two Fire Sticks, one of which is 8 years old as of this writing. It’s a little SLOW, but it still works just fine. | Basically, this makes any TV into an Amazon Fire OS TV. As of this writing, the Fire Stick is by far the best option. | None that I’ve found. |
| Chromecast | Made by Google, the Chromecast is ONLY a streaming device. The idea is that you “screen cast” from your phone or Chrome browser to your TV via the Chromecast device. | Simple to use, and flexible. Widely supported – many apps support Chromecast, or you can cast just about anything you can view in the Chrome browser. | The gen 1 device initially worked great, but through successive updates, it got to the point where it was completely unreliable. Watch out for those updates from Google. |
| Chromecast with Google TV | Google TV is the newer version of Android TV, which is… Android… for your TV. “Chromecast with Google TV” devices are full-fledged smart sticks, similar to the Amazon Fire Stick. | Full smart TV functionality. | Beware of Google, who uses “voice recognition, machine learning, and Knowledge Graph to determine what apps and content to recommend”. Yikes. |
| Roku Streaming Stick | Roku device which is the size of a streaming stick. | Roku. In a stick. | Roku. In a stick. |
Disabling Smart Features
There are some very good reasons why you might not want smart features on your TV. Most come with “smart assistant” functionality that always listens to you, and some platforms such as Google store every bit of data they can gather from you, including how you use your brand new smart TV.
Some smart TVs have the ability to simply disable smart features, turning your smart TV into a dumb TV.
Another good way to block smart features is to simply refuse to connect it to WiFi. Without the internet, they can’t upload your personal data.
However, some smart TVs might require internet access, even for basic functionality. So if you DO plan to disable smart features, make sure the TV at least functions properly without them. You should still be able to switch inputs, watch OTA (Over The Air) broadcasts, and configure the TV itself – for example, configuring audio and video options. If the TV doesn’t work properly without internet access, don’t buy it. But, you’re always connected, right? The internet is always on, right? Until your internet goes down, and you can’t even watch TV.
The Bottom Line on Smart TVs
- Most TVs available today have smart functionality; finding one without it is difficult, since “dumb” TVs are no longer a mainstream option.
- Smart functionality adds $50 to $100 to the price tag
- Steer toward Amazon, Google, or WebOS (LG) rather than a proprietary platform
- A smart stick, such as Amazon’s Fire Stick, is a great option to add smart functionality to an older TV. As of this writing, you can get a “Fire Stick Lite”, with remote, for $30.
- If you don’t want smart functionality, make sure the TV allows you to disable it, and functions properly without it.
Display Size Is Probably the Biggest Decision
Display size drastically affects cost.
We can obtain a cost-to-size ratio by dividing the price by the diagonal size, and as of this writing, the mainstream cost ratio for both monitors and televisions is about $9 per diagonal inch.
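The cost-to-size ratio is trivial to compute while you shop. Here’s a minimal Python sketch; note that the prices below are hypothetical, made up for illustration (real prices vary by retailer and sale):

```python
# Hypothetical prices (USD) for one TV model line, keyed by diagonal inches.
prices_by_size = {50: 448, 55: 498, 65: 598, 75: 898}

for size, price in sorted(prices_by_size.items()):
    ratio = price / size  # dollars per diagonal inch
    print(f'{size}": ${ratio:.2f} per diagonal inch')
```

Anything much above the mainstream ~$9-per-inch figure (like the 75″ in this made-up example) is a premium you should question.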
By comparing the cost of the same television model of different sizes, we get a clear picture of how display size affects cost. In this case, here are 6 versions of the same Samsung QLED 4K.
The best deals are to be found where the slope of the line is most horizontal. In this case, the best deals are the 50″ and 55″, with the 65″ close behind them. Stated another way, cost and size are more or less proportional between the 50″ and 55″ because their cost-to-size ratios are nearly identical, while the 65″ is a small step up, but still not disproportionately expensive. From this, we can assume that these three sizes are the most popular, and with higher volume, the prices are more consistent. Also, the more popular sizes are more likely to go on sale.
On the low end, you can get a good deal on a 43″, but… it’s only a 43″. So if you need a TV for an office or dorm room, you can get a really good deal, but it may not be large enough for your living room or “man cave”. Also, consider that for only $75 more, you can get a TV that’s 16% larger, so the extra cost might be worth it.
On the other hand, the larger 75″ and 85″ models cost significantly more per diagonal inch. Compared to the 65″, the 75″ is about 30% more, which means that it could go on sale for 20% off, and you’d still be paying more per diagonal inch than the 65″. The 85″ is nearly 100% more per diagonal inch.
For the larger sizes, it’s clear that you’re paying a premium.
Looking at monitors, I found something surprising. There are fewer sizes available, but more variability in terms of form factor and resolution. You can get a TV in 4K and maybe 8K, and you can get one in 6 different sizes. In contrast, there are only 3 mainstream monitor sizes, but you can get them in 21:9 wide-aspect, double-wide, and a plethora of resolutions from Full HD up to 5K.
Looking at three otherwise identical QHD monitors, it’s easy to see the same $9 per diagonal inch cost-to-size ratio.
However, in the case of monitors, wide aspect ratios or options like “Blast Processing” (nonsense options) can cause the cost-to-size ratio to jump dramatically. The most expensive QHD monitors in the same size ranges were nearly $15 per diagonal inch! But hey, they have “Blast Processing”, right? Remember that anything other than “OLED” or “LED-LCD” used to describe the display technology was made up by the manufacturer.
One note is that the smaller 24″ is a slightly better deal, probably because it’s in less demand.
- Divide cost by diagonal size in order to gauge the true cost of the display.
- As of this writing, you shouldn’t be paying much more than around $9 per diagonal inch.
- Stick to mainstream options, although you might be able to find a good deal on the smallest display sizes.
Resolution – Picture Quality
Resolution refers to the number of pixels available to draw an image. More pixels means better image quality.
“Standard Definition” televisions in North America were based on the NTSC standard (PAL in Europe), which specifies 320 pixels horizontally, by 480 vertically, with vertical interlacing (each frame alternates between odd and even scan lines), also known as 480i.
The first personal computers connected to, and used, the television as a display. Companies such as Commodore and IBM introduced PCs with dedicated monitors, allowing for sharper text, but not really designed for graphics. However, IBM PC users could add an IBM CGA graphics card and a CGA monitor, providing glorious 4-color graphics using one of two color palettes at an amazing resolution of 320 x 200. As PC graphics evolved, EGA and then VGA resolutions far surpassed 480i, with VGA being the dominant computer graphics standard from its release in 1987 into the early 1990’s. Eventually VGA’s 640 x 480 resolution became the expected minimum standard for all PC software.
Also by the early 1990’s, a number of third parties offered graphics cards at a variety of resolutions in excess of VGA, most conforming to the standard 4:3 aspect ratio, collectively called “Super VGA” or SVGA. Two of the more popular (and therefore, somewhat standard) resolutions were 800 x 600 and 1024 x 768.
SVGA was considered mainstream into the early 2000’s, but was eventually superseded by “XGA”, which is a loose term for anything 1024 pixels across and beyond. Common resolutions were 1280 x 1024, 1280 x 800, and 1600 x 1200.
With the digital cut-over in 2009, wide-aspect (16:9) televisions became mainstream, and computer monitors followed. In the late 2000’s and early 2010’s, there were a variety of computer monitor shapes and sizes, before becoming more or less standardized on HD resolutions and aspect ratios by the mid-2010’s.
Televisions standardized on HD (1280 x 720, or 720p) and “Full HD” (1920 x 1080, or 1080i / 1080p), and mainstream computer monitors supported these in addition to a plethora of in-between standards, such as WXGA (1366 x 768), which was very popular on laptops at the time.
Beyond Full HD, display resolutions for both TVs and computer monitors are fairly standard, mostly because there is a high degree of convergence between the two. A manufacturer can make the same panel in 9 different sizes, sell the smallest 3 as computer monitors, and the larger 6 as televisions.
TV resolutions are fairly standard: Full HD (2K), UHD (4K), and 8K UHD (sometimes called UHD2).
Computer monitors come in all of these resolutions, plus QHD (2560 x 1440), which is a very popular option right now, and 5K (5120 x 2880 and its variants). In addition to the normal 16:9 aspect ratio, there are a variety of wide formats available as well. For example, WQHD is a wide-screen (ultra wide) 21:9 format that extends QHD horizontally by about 33% (3440 x 1440).
Obviously, resolution also affects cost.
Looking for three otherwise identical TVs from the same manufacturer at three different resolutions, the first thing you notice is that even finding both a 2K and an 8K model from the same manufacturer is difficult. There are plenty of brand new 2K TVs out there, but none from a manufacturer who also sells an 8K, and vice-versa.
Comparing the same 65″ TV in both 4K (mainstream) and 8K, the cost is way more than double.
In terms of picture quality, there is a huge jump from 480i to Full HD (1080p), which makes sense because Full HD is 6 times the horizontal resolution and 2.25 times the vertical (really 4.5 times for 480i, since only half the lines are drawn in each frame, and 2.25 times for 480p) – that’s 13.5 times the number of pixels.
However, UHD (4K) is only twice the vertical and twice the horizontal resolution of 1080p, or only 4 times the number of pixels. So UHD is an incremental jump, with noticeably-better picture quality, but it’s not as BIG of an improvement as the jump from 480i (or p) to 1080p.
The jump from 4K to 8K is the same (4 times), and again, the picture quality is incrementally better. However, there is a limit to the level of detail that most people can even see. If I took two otherwise identical 65″ TVs, one a 4K and one an 8K, and put them side by side, I don’t think most people could tell the difference without getting extremely close to the screen. So the real question is: Are you going to notice the difference from across the room, sitting on your couch?
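The pixel-count multiples quoted above are easy to verify with a few lines of Python, using the resolution figures from this article (including the 320 x 480 SD figure used here):

```python
# Resolutions as quoted in this article: (name, width, height).
formats = [
    ("480p", 320, 480),     # SD figure used in this article
    ("1080p", 1920, 1080),  # Full HD
    ("4K UHD", 3840, 2160),
    ("8K UHD", 7680, 4320),
]

# Compare each format to the one before it.
for (n1, w1, h1), (n2, w2, h2) in zip(formats, formats[1:]):
    factor = (w2 * h2) / (w1 * h1)
    print(f"{n1} -> {n2}: {factor:g}x the pixels")
```

The jump into Full HD is 13.5x; every jump after that is only 4x.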
Size, obviously, plays into picture quality as well. A 43″ TV at 1080p will have a better picture than a 60″ TV at the same resolution, because the pixels on the 60″ TV are 40% larger! However, the same 60″ TV at 4K would have pixels 30% smaller than even the 43″ TV, and therefore the picture quality would be 30% better (all things being equal) than the 43″ TV, and 100% better than the 60″ 1080p.
Note: Pixel sizes are listed in thousandths of an inch, and pixels are vaguely square.
To put this into perspective, a typical business card is about 14 thousandths of an inch (0.014″) THICK, which is about the WIDTH and HEIGHT of a single pixel on a 60″ 4K TV. So, are you going to notice the difference between the thickness of a business card and half that thickness? Especially at a distance? Probably not. Is it worth 2.5 times the cost? Probably not.
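If you want to check pixel sizes yourself, here’s a small Python sketch. It assumes square pixels on a 16:9 panel, and uses basic right-triangle math to get the panel width from the diagonal:

```python
import math

def pixel_width_inches(diagonal, horiz_px, aspect=16 / 9):
    """Approximate width of one pixel, assuming square pixels and a 16:9 panel."""
    height = math.sqrt(diagonal**2 / (aspect**2 + 1))
    width = height * aspect
    return width / horiz_px

# A 60" 4K panel: each pixel is roughly the thickness of a business card.
print(f'{pixel_width_inches(60, 3840):.4f}"')  # ~0.0136"
# The same 60" panel at 1080p has pixels twice as wide.
print(f'{pixel_width_inches(60, 1920):.4f}"')  # ~0.0272"
```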
Aspect Ratio – Field of View
Aspect ratio is the ratio of the width of the screen to its height. Standard definition was 4:3, meaning the height was 3/4 of the width. Most standard displays today are 16:9, which means the height is 9/16 of the width. Although not common in TVs, 21:9 “wide” (formerly ultra-wide) aspect ratios are available as well.
TVs are measured by the diagonal, but aspect ratio affects the field of view and relative picture size.
Let’s say we view a newscast, where the reporter is standing in front of some trees, framed so that they are horizontally-centered, occupying 2/3 of the frame vertically.
If we consider three 50″ TVs in each of three different formats, the difference becomes clear. All three TVs measure 50″ on the diagonal, but the height and width of each is different.
As aspect ratio increases, height decreases, and width increases.
On a 50″ 4:3 screen (which are no longer being produced), the reporter takes up most of the screen area. The screen height is 30″, and in that, the reporter is about 21″ tall. However, we see very little of the background.
On a 50″ 16:9 screen (720p, 1080p, UHD, 8K), the screen is nearly 6″ shorter – about 2 feet tall, in which the reporter appears only about 16″. However, we can see about 30% more of the background, compared to 4:3.
On a 50″ 21:9 screen, the screen is only about 20″ tall, in which the reporter is only about 13″, but we can see almost twice as much of the background compared to 4:3.
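The three screen sizes above come straight from the diagonal and the aspect ratio. Here’s a short Python sketch that reproduces them:

```python
import math

def screen_dims(diagonal, ratio_w, ratio_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    a = ratio_w / ratio_h
    height = math.sqrt(diagonal**2 / (a**2 + 1))
    return height * a, height

# The same 50" diagonal in three aspect ratios.
for rw, rh in [(4, 3), (16, 9), (21, 9)]:
    w, h = screen_dims(50, rw, rh)
    print(f'50" {rw}:{rh} -> {w:.1f}" wide x {h:.1f}" tall')
```

Note how the height drops from 30″ to under 20″ as the aspect ratio widens, even though the diagonal stays the same.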
To Wide Screen or NOT Too Wide, Screen
4:3 isn’t really an option anymore, so you have two choices: 16:9 and 21:9.
Again, let’s consider three TVs, all are 50″, and each is a different aspect ratio, but all three cost the same $500.
The 4:3 TV actually has the most viewing area, at 1,200 square inches, which ends up costing about $0.42 per square inch. However, the 21:9 TV, for the same amount of money, has 25% less viewing area at only 904 square inches, and costs about 33% more per square inch.
Let’s look at a more concrete example by comparing a 32″ QHD 16:9 monitor to a 34″ WQHD 21:9 monitor. At first, it looks like you’re getting more if you buy the 34″ 21:9 monitor – it’s two more inches diagonally, but it only costs $34 more, so it’s a good deal, right?
Wrong. Actually, it’s almost 15% shorter, which means 15% smaller text, it has around 4% less viewing area, and costs nearly 17% more per square inch.
If we instead compare the 34″ QHD (21:9) monitor to one with the same vertical height, but a 16:9 ratio, we find that a 27″ QHD fits perfectly, and some interesting facts emerge.
They share the same height (about 13″), but the WQHD is about 31% wider. And, the cost is about 31% more.
You might THINK you’re buying a “wide, 34 inch” monitor, but you’re really buying a stretched 27″ monitor.
Likewise, let’s say you want the “widescreen version” of the 32″ QHD (16:9) monitor. This would be comparable to a 40″ WQHD (21:9) monitor, which doesn’t exist. Instead, I’ve ball-parked the cost at about $400 based on the cost of other 21:9 monitor sizes.
Here, you get the nice 15″ vertical size (larger text), plus a 31% wider viewing area.
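You can verify the viewing-area comparisons above with a few lines of Python:

```python
import math

def viewing_area(diagonal, ratio_w, ratio_h):
    """Viewing area in square inches for a given diagonal and aspect ratio."""
    a = ratio_w / ratio_h
    height_sq = diagonal**2 / (a**2 + 1)
    return height_sq * a  # width * height = (h * a) * h

qhd_32 = viewing_area(32, 16, 9)   # ~438 sq in
wqhd_34 = viewing_area(34, 21, 9)  # ~419 sq in
pct_less = (1 - wqhd_34 / qhd_32) * 100
print(f'The 34" 21:9 has {pct_less:.0f}% less area than the 32" 16:9')
```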
If you need more viewing area than a single 16:9 monitor can provide, a wide (ultra-wide) screen isn’t the best option. Instead, just buy two monitors.
So, unless you have a special use case that requires a wide (or ultra-wide) screen, stick to 16:9 for both TVs and monitors.
The computer on which I’m writing this has two Samsung 24″ 1080p monitors. For my needs, this is completely adequate, and two years ago, I paid less than $300 each for them.
Curved Displays
…are a gimmick. Stay away from them.
The “curve” only helps you at very specific viewing distances.
- If you are inside the focal point of the curve, you actually have to turn your head to see the entire monitor. 8 or more hours of this will wear you out, and cause all sorts of strain and pain on your eyes, neck, and shoulders.
- If you are outside the focal point of the curve, you end up viewing the edges at an extreme angle, making it harder to see. You end up having to constantly lean to see the whole monitor, which is even worse than constantly having to turn your head.
There is one exception: A TV or monitor with a very slight curve can prevent glare, because you will only see reflected light from a curved monitor at the focal point of the curve. Flat monitors reflect glare evenly, so if you have a light source behind you (such as a window), a curved monitor can help in that regard.
Otherwise, they are an expensive gimmick.
Go Look At It in the Store
Buying online is easy and convenient, but you don’t always know what you’re buying. If you have your mind made up, or even if you don’t, here are some good reasons to go look at one on the showroom floor.
- Make sure the picture looks as good as you expect. Picture quality varies widely between manufacturers. Just because there are two QHD monitors at similar prices, does NOT mean that the picture quality is the same, NOR does it mean that the more expensive monitor has the better picture quality. Go look at it. Play with the display settings. Make sure you get a sense of what you’re buying.
- If there is some kind of quality issue, like a part that is subject to breaking, or a display that tends to separate, it’s going to happen to the demo model, which runs all day, every day, with thousands of people man-handling it.
Model Numbers – One thing to be aware of
All of the major retailers work with the manufacturers to ensure that none of them carry the exact same model number.
This is done on purpose to avoid having to price-match against another large retailer who might put your TV on sale.
Pick a specific brand, model, and size of TV, and the next time you’re in Best Buy, Wal-Mart, or Target, look at the model numbers. They are usually off by one letter. For example, if a hypothetical 60″ TV is a GG60QHN, then Wal-Mart will have a GG60QHN-K, and Target might have a GG60QHN-Y.
- Know your alphabet. You don’t have to memorize every acronym, but understand the standard options and terms for size, resolution, and inputs.
- Watch out for “Blast Processing” (an old Sega marketing gimmick) and other nonsense words that are made up by the manufacturer’s marketing department. There have been no new REVOLUTIONARY developments in display technology for almost 2 decades.
- Know what kinds of inputs you need, and how to use them.
- Understand smart features and platforms.
- Size drives cost. You should be paying (as of this writing) no more than about $9 per diagonal inch, excluding large screen sizes and special features.
- Resolution also drives cost. Stick to UHD (4K, 2160p) for TVs, and QHD / UHD for monitors.
- Watch out for aspect ratio gimmicks, such as wide screen formats and curved displays.
- Go look at it in the store before you buy.
If you’re curious about how some of the numbers were calculated, read on. If math makes you ill, stop here.
Calculate Aspect Ratio
You’ll see aspect ratio notated as two numbers, such as “4:3” or “16:9”. The first number is the aspect width, and the second is the aspect height.
To find the ratio (a), divide the aspect width (aw) by the aspect height (ah):
a = aw / ah
Common aspect ratios:
- 4:3 → a ≈ 1.33
- 16:9 → a ≈ 1.78
- 21:9 → a ≈ 2.33
Given the Diagonal, Calculate Screen Size
First, calculate the aspect ratio (a).
a – Aspect Ratio
d – Diagonal Size, in inches
Compute the height:
h = √( d² / ( a² + 1 ) )
Compute the width:
w = h • a
How does this work?
Start with the Pythagorean theorem (the width, height, and diagonal form a right triangle):
d² = w² + h²
We also know that:
w = h • a
Substituting for w:
d² = h² • a² + h²
d² = h² • ( a² + 1 )
Solve for h:
h² = d² / ( a² + 1 )
h = √( d² / ( a² + 1 ) )
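As a runnable sanity check, here’s the same formula in Python, verifying that the resulting width and height satisfy the Pythagorean relationship:

```python
import math

def screen_size(d, a):
    """Given diagonal d and aspect ratio a = aw / ah, return (width, height)."""
    h = math.sqrt(d**2 / (a**2 + 1))
    return h * a, h

w, h = screen_size(50, 16 / 9)
assert math.isclose(math.sqrt(w**2 + h**2), 50)  # d² = w² + h² holds
print(f'{w:.1f} x {h:.1f}')  # 43.6 x 24.5
```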
Calculate Percent Change
Given values v1 and v2:
p = (v2 – v1) / v1
This tells you how v2 relates to v1, as a percentage of v1.
For example, if v1 = 4 and v2 = 5, then p = (5 – 4) / 4 = 0.25, a 25% increase.
Likewise, if v2 is less than v1, the percentage will be negative: if v1 = 4 and v2 = 3, then p = (3 – 4) / 4 = -0.25, a 25% decrease.
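And the percent-change formula as a short Python function:

```python
def percent_change(v1, v2):
    """Fractional change from v1 to v2; multiply by 100 for a percentage."""
    return (v2 - v1) / v1

print(percent_change(4, 5) * 100)  # 25.0 (a 25% increase)
print(percent_change(4, 3) * 100)  # -25.0 (a 25% decrease)
```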