Light Emitting Diodes (LEDs) are tiny sources of light that we rarely notice day to day. Yet LEDs serve a critical function in the 21st century: from bus stop signs to TVs and now smartphones, almost all major digital displays incorporate them. Like all technology, LEDs have advanced far beyond their original form. There are now multiple variants, from Organic LED (OLED) to Active-Matrix Organic LED (AMOLED) and now MicroLED. So many options can be good for competition, but by and large, most manufacturers in the commercial sector (TVs, smartphones, desktops and laptops) seem to agree that there are only two ways forward: MicroLED or OLED. To understand why these two are the leading contenders, we first need to understand LEDs and what makes them imperfect in their current form.
Traditional LED TVs from the early 2000s aren’t really LED displays. They are simply Liquid Crystal Displays (LCDs) that use LEDs as a backlight; it is this backlight that allows an LCD to produce the image we see on screen. This design causes two major issues. First, the extra layer of LEDs adds to the thickness and weight of the display, which is fine for a 47” TV but not so much for a 6” smartphone or 13” laptop. Second, any damage to the display could mean replacing both the LCD panel and the LED backlight, adding to the expense. Nor were the LEDs always arranged behind the panel: most TVs placed the backlight along the edges, which gave rise to another issue, poor screen uniformity. The LEDs were visible to the naked eye as a faint white shadow, and edge lighting led to brighter edges and a duller centre, producing imbalances in colour and black levels across the display. Not everyone noticed these issues, but people who spend long hours in front of a screen editing certainly did. They hurt not just present performance, but also made the technology unviable for long-term service.
OLEDs came into the picture to combat these issues and reduce the thickness of displays. In an OLED, each pixel lights up when voltage is applied to it; in other words, each pixel is its own light source. This means OLEDs can produce more colour-accurate images and a wider, more dynamic range of colours (hence the term High Dynamic Range, or HDR). Since each OLED pixel lights up individually, no backlight is needed, and as a result blacks are truly black. OLEDs are available as thin, flexible panels, so flexible that they can be folded without damage. They also offer far more control over the screen, turning off unwanted pixels entirely, and they react faster to change, with refresh rates of around 90Hz. That’s where the advantages end, though.
That limited refresh rate may be fine for an HD film, but if you are gaming, editing a 4K film or just watching a fast-paced game of football, 90Hz doesn’t really do the trick. OLED panels are also very expensive, and they tend to degrade over time as the organic material wears out. They are power-hungry too, often consuming more power than LCD displays; it’s a major reason why modern smartphone batteries don’t last as long (nor do the displays). The promise of a cheaper, faster and better future ultimately never materialised with OLEDs.
To overcome the pitfalls of OLED, MicroLED has emerged as a viable alternative. While similar to OLED, MicroLED offers some distinct benefits. These displays use microscopic LEDs to form each individual pixel. Each microLED lights up only when needed and can be driven to produce a specific colour, just like OLED. Because MicroLEDs use inorganic materials that emit their own light, they need no backlight, so colours are more accurate and blacks are blacker, with none of the light bleed seen in LED-backlit displays. MicroLEDs can also achieve much higher brightness and contrast than OLEDs. The inorganic compounds give the display a longer lifespan, allowing each LED to glow brighter for longer. And on top of all this, MicroLEDs consume less power than an equivalent OLED display.
So, what does the future look like?
Right now, OLED displays dominate consumer tech. Since they are mass-produced, they are cheap enough for widespread use, and their flexibility and thinness make them great for everything from a folding phone to a wearable. OLEDs are also sufficient for most of our current needs. Even Apple, known for using only the best technology, has stuck with LCD for the iPhone 11 and for the Pro Display XDR. To the average consumer, there’s hardly any difference between a Full HD display and, say, a 4K one. Unless you are editing photos and videos, say in a major Hollywood studio, you really won’t know or care about how good a display is. The one factor that does matter is screen brightness, as we tend to use a lot of electronics out in the sun, and this is where most companies have hit a roadblock: current displays can only go so bright.
The Consumer Market
Apple and almost all other companies are trying to push the boundaries of current LCD and OLED technology by going to 4K (roughly 4,000 pixels horizontally; 3,840 in consumer UHD) and even 6K (roughly 6,000 pixels horizontally). These resolutions aren’t viable for everything, though. You won’t find a 4K display on an iPhone anytime soon: 4K and 6K displays are mainly meant for TVs and monitors, where there is space to fit so many pixels in. Manufacturers haven’t yet found an economical way to shrink OLED pixels much further, so 4K displays and the like just aren’t practical for portable devices yet.
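To put those resolution labels into perspective, here is a small illustrative calculation. The figures are the published standard resolutions for each format (the 6K figure matches the Pro Display XDR’s published 6016 x 3384 panel), not numbers from this article:

```python
# Illustrative only: total pixel counts for common display resolutions.
# Resolution figures are the published standards for each format.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
    "DCI 4K": (4096, 2160),
    "6K (e.g. Pro Display XDR)": (6016, 3384),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# A 4K UHD panel drives four times as many pixels as Full HD:
print(3840 * 2160 // (1920 * 1080))  # -> 4
```

That four-fold jump in pixel count from Full HD to 4K is why cramming such a panel into a 6” phone is so much harder than into a 65” TV.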
There’s also rising demand for better-quality displays in the commercial sector. Currently, 4K, 5K and 6K displays are limited to industrial use: film studios, animation studios, photo studios and the like. But there’s growing interest from gaming. PC gaming is making a comeback, and esports is fuelling demand for high-quality displays. Gamers require not just brighter displays, but ones with higher refresh rates. Some estimates suggest the human eye can register differences at the equivalent of up to 500Hz, so gamers demand displays that can keep up with their reflexes; in competitive gaming, it could be the difference between life and death. Currently, the best PC displays go up to 240Hz, but this is far from standard, and mainstream OLED and LCD panels struggle to go higher.
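The refresh-rate numbers above translate directly into a time budget per frame, which is why they matter so much to competitive players. A quick illustrative calculation (the rates are common panel speeds chosen for illustration; 240Hz matches the article’s figure for the fastest PC displays):

```python
# Illustrative only: how much time a display has to draw each frame.
# Frame time (ms) = 1000 ms / refresh rate (Hz).
for hz in (60, 90, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```

At 90Hz a frame stays on screen for about 11 ms; at 240Hz, barely 4 ms. For a player reacting in fractions of a second, those milliseconds add up.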
So while the first half of the decade will see OLED displays retain their dominance, economies of scale and rising demand will ultimately see the industry migrate towards MicroLED. Currently, MicroLEDs are far from cheap. Since they were only commercially launched in 2018, few companies have adopted the technology in their commercial products. Once they do, demand will drive prices down, allowing smaller companies to adopt it as well. Adoption will be fuelled by the rise of esports and competitive gaming: according to a report by Newzoo, there are expected to be 250 million esports enthusiasts by 2021. If even half of these are players, that’s over 100 million displays needed, and the number will continue to rise. High-quality displays will also see growing demand from TV consumers as services continue to provide content in 4K and HDR. AppleTV+ now streams in HDR/Dolby Vision as standard (on supported hardware), and services like Netflix are increasing their catalogue of 4K titles.
As consumption moves from the TV to the laptop and mobile, manufacturers will have to keep up with demand, especially if Wi-Fi 6 and 5G streaming become mainstream. Phone manufacturers will also have to migrate towards MicroLED to keep pace with their cameras: as smartphone cameras become increasingly powerful, there will be demand for high-resolution displays that can do the photos justice. You can now shoot in 4K on an iPhone, but what’s the point if you can’t view the footage as you shot it? All these factors will see MicroLED overtake OLED; the only question is, when?
Article by Srivats Lakshman