Years ago, touchscreen technology existed only in fiction, seen in movies and read about in books. Generations of technological advancement later, touchscreens are part of everyday human life. The concept can be traced back to the 1940s, but the evidence suggests that practical touchscreens were not possible until 1965.
In this whitepaper:
Touchscreen technology started back in 1965 but did not reach mainstream popularity until 2007, when Apple released the first iPhone.
1965 - E.A. Johnson: First finger-driven touchscreen
The idea of the touchscreen interface was first recorded in October 1965[1], when Eric Arthur Johnson, an engineer at the Royal Radar Establishment in Malvern, England, became interested in developing a touchscreen to aid air traffic control.
Johnson later published a more detailed and extensive paper on touchscreens, explaining how the technology operated through photographs of prototypes and diagrams. In 1969, he was granted a patent[2] for his invention.
1970 - Dr. G. Samuel Hurst: A new type of sensor called the Elograph
While capacitive touchscreens were invented first, resistive touchscreens surpassed them in the early years of touch. Dr. G. Samuel Hurst developed resistive touch technology almost by accident while conducting research at the University of Kentucky. The university sought a patent on his behalf to protect the unintentional discovery from replication, but its scientific origins made it appear as though its only value was in the lab.
After returning to the Oak Ridge National Laboratory in 1970, Hurst began an after-hours investigation and discovered that a conductive cover sheet was exactly what the screen needed. This breakthrough paved the way for what we now know as resistive touch technology[3], which he and his team commercialized under the name Elographics. The group eventually patented the first curved glass touch interface.
1984 - Bob Boie: First multitouch screen overlay
Nimish Mehta created the first human-controlled multitouch device at the University of Toronto in 1982. Shortly after, Myron Krueger, an American computer artist, pioneered gesture interaction with an optical system that could capture hand gestures. Touchscreens became extensively commercialized in the early 1980s. When Bell Labs' Bob Boie created the first transparent multitouch screen overlay[4], multitouch technology advanced significantly, allowing users to manipulate visuals directly with their fingertips.
2007 - Apple: First iPhone multitouch to the masses
Apple became the first company to successfully launch a touchscreen smartphone in 2007[5]. The iPhone's compact, user-friendly design offered only minimal multitouch functionality: in keyboard mode, users could not hold the shift key with one finger while typing a capital letter with another. It did, however, enable the pinch gesture for zooming in and out of maps and pictures, pioneered by researcher Krueger. The first iPhone helped touchscreen technologies become popular worldwide.
Overview: Touchscreen Technologies
As touchscreen technology evolved, significant developments came with it.
1. Camera-based tracking
In camera-based tracking, a camera sensor observes the display surface, while AI algorithms act as middleware, evaluating the camera data and extracting touch-point coordinates. Vast numbers of touch points can be recognized simultaneously this way. However, camera-based technology has almost entirely disappeared from the market, with modern standards such as PCAP or IR frames taking over.
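The middleware step described above can be sketched in a few lines. This is a simplified illustration, not any vendor's actual algorithm: it assumes a preprocessed grayscale frame in which bright pixels indicate finger contact, groups connected bright regions with a flood fill, and reports each region's centroid as one touch point.

```python
from collections import deque

def touch_points(frame, threshold=200):
    """Extract touch-point centroids from a grayscale camera frame.

    `frame` is a 2D list of brightness values; pixels at or above
    `threshold` are treated as touch contact.  Connected bright regions
    are grouped with a flood fill, and each region's centroid becomes
    one (x, y) touch point.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    points = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected bright region.
                queue, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Centroid of the region = one touch coordinate.
                points.append((sum(p[1] for p in pixels) / len(pixels),
                               sum(p[0] for p in pixels) / len(pixels)))
    return points
```

Because every bright region simply becomes another centroid, nothing in this approach caps the number of simultaneous touches, which is why camera-based systems could offer so many touch points.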
Despite having almost entirely disappeared from the modern touchscreen market, camera-based tracking did offer a few benefits, such as a virtually unlimited number of touch points and fast response rates. It also required no additional layers on top of the screen.
However, there's a reason modern standards such as PCAP or IR frames dominate the market today. The space a camera requires means the technology cannot fit into a flat screen, which is impractical nowadays. While the SUR-40 did manage a flat screen, it had notoriously low accuracy in a technology that already struggled with accuracy under ambient light.
Even with its drawbacks, camera-based tracking was used in a few large-scale consumer products. In 2007, the Microsoft Surface Table[6] combined a rear-projection display with multitouch camera tracking. Shortly after, in 2009, eyefactive released a modular rear-projection display with multitouch camera tracking. Finally, in 2012, the Samsung SUR-40[7] featured an LCD display with visual pixel-based tracking.
2. IR Touch-Frames
The IR Touch-Frame method adds a frame to a display that provides an infrared (IR) light overlay and sensors. When a finger blocks the light, a touch-point coordinate can be traced. Depending on the sensors, both the number of touch points and the precision can be increased.
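The coordinate-tracing idea can be illustrated with a minimal sketch. It is a simplified single-touch model, not a real controller's firmware: it assumes one array of horizontal beams and one of vertical beams, where a finger interrupts one or more adjacent beams on each axis, and takes the centre of each blocked span as the touch coordinate.

```python
def ir_touch_coordinate(row_beams, col_beams):
    """Locate a single touch on an IR frame.

    `row_beams[i]` / `col_beams[j]` are True while the corresponding
    horizontal / vertical IR beam still reaches its receiver.  A finger
    blocks one or more adjacent beams on each axis; the touch point is
    the centre of the blocked span on each axis, returned as (x, y)
    in beam-index units.
    """
    blocked_rows = [i for i, ok in enumerate(row_beams) if not ok]
    blocked_cols = [j for j, ok in enumerate(col_beams) if not ok]
    if not blocked_rows or not blocked_cols:
        return None  # no touch detected
    y = sum(blocked_rows) / len(blocked_rows)
    x = sum(blocked_cols) / len(blocked_cols)
    return (x, y)
```

Denser beam grids shrink the spacing between indices, which is one reason adding sensors improves precision; resolving several simultaneous touches requires disambiguating the row/column intersections and is where real controllers get considerably more involved.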
Initially, IR Touch-Frames struggled with precision, but those issues were worked out, and they now provide high accuracy and around 20 touch points. Today, IR frames are the only practical way to add interactivity to video wall systems, which is also their most common use.
Although precision has improved compared to the IR Touch-Frames of the past, dust remains an issue: when dust settles on the sensors, they stop working properly, so the frames require regular cleaning. This also makes them unappealing for touch tables.
3. Projected Capacitive (PCAP)
As the modern industry standard for touch technology, Projected Capacitive (PCAP) technology can be found in almost all of today's B2C products. But its capabilities are not restricted to B2C products; it is also used in larger touchscreen devices for interactive digital signage. PCAP uses thin wires as an extra layer behind the screen's glass top. These wires detect the capacitance of a finger, which changes the capacitance measured by the sensor and can be monitored very precisely.
PCAP sensors are embedded into the cover glass, which allows for bonded screens that give the device a more visually appealing look. PCAP can accommodate anywhere from two to 80 touch points.
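The sensing principle above can be sketched as follows. This is a toy model, not a real PCAP controller: it assumes the controller has already produced a grid of capacitance deltas (measured value minus an untouched baseline) at each row/column wire intersection, and reports a touch wherever a cell exceeds a threshold and is a local maximum, so one finger yields one point.

```python
def pcap_touches(delta, threshold=50):
    """Find touch points on a projected-capacitive sensor grid.

    `delta` holds the change in measured capacitance at each row/column
    wire intersection relative to an untouched baseline.  A cell is
    reported as a touch when it exceeds `threshold` and is a strict
    local maximum among its four neighbours, so a finger covering
    several cells still yields a single point.
    """
    rows, cols = len(delta), len(delta[0])
    touches = []
    for r in range(rows):
        for c in range(cols):
            v = delta[r][c]
            if v < threshold:
                continue
            neighbours = [delta[nr][nc]
                          for nr, nc in ((r - 1, c), (r + 1, c),
                                         (r, c - 1), (r, c + 1))
                          if 0 <= nr < rows and 0 <= nc < cols]
            if all(v > n for n in neighbours):
                touches.append((c, r))  # (x, y) in grid cells
    return touches
```

Real controllers go further, interpolating between cells for sub-cell accuracy and filtering noise over time, but the grid-of-deltas model is the core of how a finger's capacitance becomes a coordinate.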
With the wide use of PCAP, there is now a broad range of distinct sensors and controllers available, so a good understanding of the market is essential to discern high-quality from low-quality products. Early PCAP sensors had very thick, visible cables and did not provide a great experience due to poor touch sensitivity, speed, and precision.
Nowadays, PCAP is used in the predominant touchscreen devices on the market, such as the iPhone. But its uses are not limited to small B2C products: it is also the main form of touch technology in a wide range of industries, including museums, retail, gastronomy, corporate environments, and control rooms.
4. From iPhone to McDonald's (B2C to B2B)
Touchscreen technology has spread to a broader range of devices and applications since the original iPhone in 2007[5]. Although initially used primarily for personal (B2C) devices, a distinct set of use cases has grown around B2B applications.
The touchscreen panel market evolves every year, producing ever-larger panels at lower costs and making touchscreen technology viable for small businesses and large roll-outs alike.
McDonald's self-order kiosks are the best-known and largest roll-out of their kind, with worldwide installations beginning in 2018. They use the same type of technology as consumer touchscreens, only on a much larger scale. The self-service kiosks have been extremely successful, showing a 30% increase in revenue[8] in Ireland and the UK. This is largely because people customize their orders more at a kiosk, resulting in higher average order values.
5. From Single-Touch > to Multi-Touch > to Multi-User
The first touchscreen technology, single touch, merely emulates the mouse interface with a finger.
Currently, we use multitouch screens. The first iPhone made multitouch technology renowned, even though the gestures enabled by today's smartphones and tablets are usually restricted to simple interactions. Tap, slide, and drag are all possible with one finger; pinch and rotate require two. On larger touchscreens, advanced multitouch can employ two fingers or the entire hand.
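The two-finger pinch mentioned above reduces to simple geometry. As a hedged sketch (the function name and interface are illustrative, not any platform's actual API), the gesture's zoom factor is just the ratio of the distance between the two touch points now to the distance when the gesture began:

```python
import math

def pinch_scale(start, current):
    """Scale factor implied by a two-finger pinch gesture.

    `start` and `current` are each a pair of (x, y) touch points,
    sampled at the beginning of the gesture and at the present moment.
    A result > 1 means the fingers moved apart (zoom in); < 1 means
    they moved together (zoom out).
    """
    def distance(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    return distance(current) / distance(start)
```

A rotate gesture works the same way, except it compares the angle of the line between the two points instead of its length.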
Future developments in touchscreen technology are showing multi-user interaction. This technology uses large-scale touchscreens and enables multiple people to interact with the display at the same time. By enabling multi-person interactions, these new large-scale touchscreens offer opportunities for collaborative tasks. This new approach in software design can be regarded as an evolution of human-computer interaction.
6. Software for XXL Touchscreens
Even though a good touchscreen sensor is required, it is the software that handles everything built on top of it. Apple and Google provide touch-interaction development tools for their mobile operating systems, iOS and Android. The introduction of software platforms such as Google Play and the Apple App Store was also a major factor in the success of consumer touchscreen devices.
Large-scale touchscreens with powerful multi-touch or even multi-user capabilities demand an even more advanced, cutting-edge programming environment, along with some out-of-the-box thinking, since multi-user software is a novel approach to human-computer interaction.
Summary & Take-Aways
It has not been long since Apple's first iPhone brought touchscreen technology to the mass market. The technology has progressed quickly since then, gaining traction in the B2B sector; in 2018, McDonald's became the first company to roll out larger touchscreens in a B2B environment on a global scale.
Going forward, expect touchscreen technology to permeate an even wider variety of businesses. Restaurants, for example, can adopt touchscreen tables that let customers order as soon as they are ready and provide entertainment while they wait. More retail stores will also incorporate touchscreen kiosks, much as McDonald's does. And kiosks are not limited to the gastronomy sector: businesses of all kinds can use them as part of an omnichannel strategy.
As for the technology itself, touchscreens are beginning to expand their capabilities with pressure sensitivity, which can unlock a new method of interaction. Other manufacturers are experimenting with semi-transparent touchscreens that can be used for Augmented Reality (AR) solutions.
More will come, transforming how we engage with computers in public areas, just as smartphone touchscreens did in the B2C sector.