
Monday, August 17, 2015

How and Why Force Touch Can Revolutionize Smartphone Interfaces


Most of us have probably heard about Apple’s “Force Touch” and “Taptic Engine”—both a software gesture and a technology—proudly featured on new MacBooks and Apple Watches and vaunted by Apple as “the most significant new sensing capability since multi-touch.” However, since this is primarily a website for Android (and Windows Phone) enthusiasts, Force Touch remains a curiosity for most of us—certainly not something we use on a day-to-day basis, and most probably something we’ve never experienced at all. Apple has gone all-in on the new input method, though, and there are widespread rumors that Force Touch will be built into, and play a prominent role in, both iOS 9 and the next iPhone (probably the iPhone 6S). And if Force Touch becomes a part of the iPhone brand, it’s probably reasonable to assume that it’s only a matter of time until it—or a similar gimmick—makes its way to Android, either through individual OEM experimentation or through the next Google Nexus device and Android release.

What is it?

“Force Touch” is really a suite of software features and hardware technologies that come together to create a completely new touch experience for computers and devices. On the software side, the idea is hardly revolutionary—change the response of the software depending on how hard you press the trackpad. Apple has implemented this idea in OS X in various ways, including adjusting the speed of fast-forward or rewind in QuickTime, looking up a word when you click on it extra-hard in Safari, or signing your name in Preview as you would with a Wacom digital pen. None of these features is particularly interesting—they’re mostly useful as shortcuts, adding extra convenience. But as is typical with Apple, it’s not necessarily the idea that creates a revolution, but the form and design—the execution of the idea.

It’s in the hardware implementation of Force Touch where the so-called “Apple magic” starts to happen, and once you really feel the new haptic feature, you begin to see why it could be “the most significant new sensing capability since multi-touch.” Most trackpads are simply planar sheets of plastic, capable of sensing touch at multiple points, with physical or capacitive buttons—corresponding to the left and right buttons of a mouse—off to the top or bottom. Apple’s trackpads have typically been a little different. Rather than placing physical buttons on the edges, Apple chose to put the trackpad on a hinge, turning the entire trackpad into a single gigantic button—when you pressed down, the whole trackpad clicked. Apple then emulated right-clicks in software by utilizing the multi-touch capabilities of the trackpad: one-finger clicks were left-clicks, while two-finger clicks became right-clicks. This approach had some drawbacks—namely, because the hinge was at the top, the farther up the trackpad your finger traveled, the more difficult it became to click.

Force Touch trackpad (courtesy Apple)

Apple Force Touch trackpad; a large, planar sheet of capacitive glass

The new Force Touch trackpad is something very different. It is a large sheet of pressure-sensing, multi-touch, capacitive glass—basically a smartphone touchscreen without the display. And like a smartphone screen, there are no buttons—none at all. Instead, Apple has opted to use pressure and touch input to programmatically drive an array of electromagnets acting as actuators to simulate clicks. They’re using force feedback to trick your brain into thinking you’ve clicked a button—but it’s a lot more sophisticated than the force feedback in your video game controllers (well, except maybe the Steam Controller), and it feels a hell of a lot better than the little “tick” vibrations most smartphones today respond with when you press a button on the on-screen keyboard. Why? How? Let me explain.


First, the pressure. There are several ways to sense pressure on a touch surface, and your smartphone probably already implements at least one of them. One way is to actually sense pressure—that is, how much downward force you’re applying to the screen. A simpler, cheaper way is to emulate pressure sensitivity by instead measuring how much surface area is being touched. Consider that when you tap on a screen, a small part of your finger comes into contact with the glass. When you really press down, however, you smush your finger into the glass, and a larger piece of the pad of your finger comes into contact with the capacitive surface. Software then interprets that larger area of contact as “pressure.” Apple has chosen the former route—direct measurement. Underneath each of the four corners of the trackpad is a small strain gauge that directly measures force. With one underneath each corner, Apple can tell two things:

  1. Approximately where the trackpad is being pressed, from the difference in force at each of the corners (largely redundant, since the capacitive surface already senses touch location), and
  2. How much total force is being applied to the trackpad, by summing the readings from the four sensors. So when you push on some area of the trackpad, the location of your touch is known and the pressure you’re applying is known (a rough sketch of this arithmetic follows below).
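
To make that concrete, here is a rough, back-of-the-envelope sketch of the arithmetic in Java. The trackpad geometry, the sensor readings, and the simple force-weighted-average model are all invented for illustration; this is not Apple's firmware.

```java
// Illustrative sketch only: estimates total force and an approximate contact
// point from four corner strain-gauge readings. Sensor values, units, and the
// weighted-average model are assumptions, not Apple's actual implementation.
public class CornerForceEstimator {

    // Corner coordinates of a hypothetical 100 mm x 60 mm trackpad:
    // index 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
    private static final double[][] CORNERS = {
        {0, 60}, {100, 60}, {0, 0}, {100, 0}
    };

    /** Total force is simply the sum of the four corner readings (in newtons). */
    static double totalForce(double[] cornerForces) {
        double sum = 0;
        for (double f : cornerForces) sum += f;
        return sum;
    }

    /** Approximate contact point as a force-weighted average of the corner positions. */
    static double[] contactPoint(double[] cornerForces) {
        double total = totalForce(cornerForces);
        double x = 0, y = 0;
        for (int i = 0; i < 4; i++) {
            x += CORNERS[i][0] * cornerForces[i] / total;
            y += CORNERS[i][1] * cornerForces[i] / total;
        }
        return new double[] {x, y};
    }

    public static void main(String[] args) {
        // A press near the top-right corner: that gauge carries most of the load.
        double[] readings = {0.3, 1.8, 0.2, 0.7};  // newtons, made up
        System.out.printf("total force: %.1f N%n", totalForce(readings));
        double[] p = contactPoint(readings);
        System.out.printf("approx. contact at (%.0f, %.0f) mm%n", p[0], p[1]);
    }
}
```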

With these two knowns, software determines when and where it should simulate a click by switching the “Taptic Engine”—an array of electromagnets mounted beneath the trackpad—on and off in rapid succession. iFixit does a great job demonstrating how toggling an electromagnet can induce vibrations and haptic feedback in the trackpad, while Wired does a good job discussing the importance of some old MIT research on lateral-force haptic feedback from the 1990s. In short, small, high-frequency horizontal movements under a flat surface can trick our brains into perceiving vertical movement and texture, and the Taptic Engine’s electromagnets tug and push on a steel bar mounted to the underside of the trackpad to induce exactly those small, high-frequency horizontal movements.
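
As a toy illustration of what "switching on and off in rapid succession" might look like as a drive signal, the sketch below generates a brief, windowed sine burst. The 250 Hz frequency, 15 ms duration, sample rate, and Hann window are assumptions for the sake of example, not Apple's actual drive parameters.

```java
// Illustrative sketch of a "click" drive pulse: a short burst of a sine wave,
// faded in and out with a Hann window so the pulse starts and stops cleanly.
public class ClickWaveform {

    static double[] clickBurst(double freqHz, double durationMs, int sampleRate) {
        int n = (int) (sampleRate * durationMs / 1000.0);
        double[] samples = new double[n];
        for (int i = 0; i < n; i++) {
            double t = (double) i / sampleRate;
            double window = 0.5 * (1 - Math.cos(2 * Math.PI * i / (n - 1))); // Hann window
            samples[i] = window * Math.sin(2 * Math.PI * freqHz * t);
        }
        return samples;
    }

    public static void main(String[] args) {
        // Assumed parameters: 250 Hz burst, 15 ms long, 48 kHz sample rate.
        double[] burst = clickBurst(250, 15, 48_000);
        System.out.println("samples in one click pulse: " + burst.length);
    }
}
```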

But these details don’t tell the entire story. There’s a reason why the tick-tock vibrations we feel when we press virtual buttons and keys on our smartphones aren’t impressive—they’re not localized. When your phone vibrates, the whole phone vibrates. And worse, you can sometimes even feel that the source of that vibration is coming from the top or bottom bezels of your phone, where the little vibrating motors are physically located. The sensation of pressing a key on your screen would feel a lot different if the vibration was actually centered right beneath your finger—this is exactly what Apple has done, and it’s why the Force Touch trackpad feels so real.

MacBook "Taptic Engine" (courtesy Apple)

The Taptic Engine comprises four independently driven electromagnets simulating linear actuators

In July 2009, Apple filed a patent for a “method and apparatus for localization of haptic feedback.” In all the teardowns and explanations of Apple’s Taptic Engine—at least the ones I’ve seen—seemingly no one has asked why there need to be four independently driven electromagnets. Why not just one big electromagnet? The answer is that a single vibration source can only produce simple mechanical waves that propagate outward from the source; multiple vibration sources, however, can produce complex patterns of wave interference across the trackpad. This is essential for localizing the “click” sensation—you want to feel the “click” right beneath your finger, not off to the side of the trackpad or near a bezel. By using four independently driven electromagnets, Apple has essentially created a phased array of simulated linear actuators. (iMore.com thoroughly explains the difference between the spinning-weight vibrators most devices use and linear actuators, which you can loosely think of as a kind of electric piston. Incidentally, the Apple Watch’s Taptic Engine contains an actual linear actuator rather than a simulated one, likely due to its small size.)

This phased array can produce patterns of interfering mechanical waves on the trackpad such that there is constructive interference beneath your finger and destructive interference elsewhere. What Apple is doing here is essentially a sort of mechanical-wave beamforming. In the world of electromagnetic waves, 802.11ac routers use beamforming to produce constructive interference in a virtual bubble around a connected device, resulting in better signal and faster, more reliable connectivity—it’s why 802.11ac routers have more than one antenna. But Apple isn’t doing this with electromagnetic waves—it’s doing it with mechanical waves propagating through a trackpad. If you haven’t heard of beamforming before, take this opportunity to watch the YouTube video below; it makes the concept easy to understand.


[Embedded video: beamforming demonstration]
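
To make the delay-and-sum idea concrete, here is a toy sketch that computes how much earlier or later each of four actuators should fire so that their waves all arrive in phase at a chosen point. The actuator layout and the wave speed are made-up numbers; a real implementation would likely also have to deal with reflections and resonances in the pad.

```java
// Sketch of delay-and-sum "beamforming" with four actuators: fire each one a
// little early or late so that all four waves arrive at the target point at the
// same instant (constructive interference there, partial cancellation elsewhere).
public class HapticBeamformer {

    // Hypothetical actuator positions along the underside of a 100 mm-wide pad (mm).
    private static final double[][] ACTUATORS = {
        {12.5, 30}, {37.5, 30}, {62.5, 30}, {87.5, 30}
    };
    private static final double WAVE_SPEED_MM_PER_MS = 1500.0;  // assumed value

    /** Delay (ms) to apply to each actuator so all waves reach (x, y) together. */
    static double[] firingDelays(double x, double y) {
        double[] travel = new double[ACTUATORS.length];
        double maxTravel = 0;
        for (int i = 0; i < ACTUATORS.length; i++) {
            double dx = x - ACTUATORS[i][0];
            double dy = y - ACTUATORS[i][1];
            travel[i] = Math.hypot(dx, dy) / WAVE_SPEED_MM_PER_MS;
            maxTravel = Math.max(maxTravel, travel[i]);
        }
        // The farthest actuator fires first (delay 0); nearer ones wait.
        double[] delays = new double[ACTUATORS.length];
        for (int i = 0; i < ACTUATORS.length; i++) {
            delays[i] = maxTravel - travel[i];
        }
        return delays;
    }

    public static void main(String[] args) {
        double[] delays = firingDelays(83, 42);  // aim the "click" under the finger
        for (int i = 0; i < delays.length; i++) {
            System.out.printf("actuator %d: fire after %.4f ms%n", i, delays[i]);
        }
    }
}
```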

With the combination of a multi-touch glass surface, pressure-sensing strain gauges under each corner, a phased array of electromagnets acting as linear actuators, beamforming, and some old MIT research showing that the human brain can be tricked into interpreting horizontal motions as vertical motion and texture, Apple has created a trackpad that can realistically simulate clicks anywhere on its surface. With this new Force Touch technology, Apple has implemented the Force Touch gesture, whereby apps and the operating system can interpret firmer clicks separately from softer ones. And finally, Apple has chosen to give this gesture a second, “deeper” click as haptic feedback for a successful “Force Click.”
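
How might software tell a "Force Click" apart from an ordinary click? Presumably with pressure thresholds of some kind. The sketch below is a minimal, hypothetical state machine with invented threshold values and a bit of hysteresis so the state doesn't flicker when the applied force hovers near a boundary; Apple's actual thresholds and logic aren't public.

```java
// Hypothetical sketch: separate an ordinary click from a "Force Click" using
// two pressure thresholds with hysteresis. The threshold values are invented.
public class ForceClickDetector {

    enum State { IDLE, CLICK, FORCE_CLICK }

    private static final double CLICK_DOWN_N = 0.8;   // enter CLICK above this
    private static final double FORCE_DOWN_N = 2.5;   // enter FORCE_CLICK above this
    private static final double RELEASE_N = 0.5;      // drop back to IDLE below this

    private State state = State.IDLE;

    State update(double pressureNewtons) {
        switch (state) {
            case IDLE:
                if (pressureNewtons >= FORCE_DOWN_N) state = State.FORCE_CLICK;
                else if (pressureNewtons >= CLICK_DOWN_N) state = State.CLICK;
                break;
            case CLICK:
                if (pressureNewtons >= FORCE_DOWN_N) state = State.FORCE_CLICK;
                else if (pressureNewtons < RELEASE_N) state = State.IDLE;
                break;
            case FORCE_CLICK:
                if (pressureNewtons < RELEASE_N) state = State.IDLE;
                break;
        }
        return state;
    }

    public static void main(String[] args) {
        ForceClickDetector d = new ForceClickDetector();
        // A press that ramps up into a force click, then releases.
        double[] trace = {0.2, 1.0, 1.4, 2.8, 3.0, 1.0, 0.3};
        for (double p : trace) {
            System.out.printf("%.1f N -> %s%n", p, d.update(p));
        }
    }
}
```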

Why is it cool?

OK, so the Force Touch technology is quite a complicated endeavor to go through just to add a gimmick to the operating system—pressure-sensitive shortcuts. *What’s the point?* Well, Apple certainly doesn’t think Force Touch is a gimmick, and rumors suggest that it’ll be adding a suite of pressure-sensitive features to iOS alongside the expected iPhone 6S release (Force Touching a point of interest in Maps to skip straight to turn-by-turn navigation, for instance). But while extra haptic feedback doesn’t seem *that* exciting on a laptop, it could be a real game-changer for smartphones. Imagine if pressing buttons on your phone really felt like pressing buttons. Because of the way Force Touch works—localized haptic feedback anywhere on the screen—that could potentially be a reality. But since the whole concept of Force Touch is relatively new, Apple is limited by its adoption rate. The Force Touch trackpad—and touchscreen, if Apple can successfully translate the tech to smartphones—has substantial potential if you put your imagination to it.


Imagine how Force Touch could enhance Material Design with textures.

Imagine that every phone, laptop, and desktop in existence had a Force Touch surface (whether a screen or a trackpad), so that no software would be limited in its user interface design by consideration for legacy devices without Force Touch. What Apple has created here is a surface that software can arbitrarily control to vibrate precisely at high frequencies, to beam that vibration to specific points on the surface, and to simulate touch via combinations of localized vibrations. The future potential of such a technology is incredible—dare I say, revolutionary. To start, one seemingly silly thing you could do with a Force Touch surface is turn the whole thing into a rudimentary speaker. In fact, Valve’s engineers did exactly that with their haptic trackpads during the development of the Steam Controller. I can’t think of any particularly useful reason to use a Force Touch surface as a speaker—both smartphones and laptops typically come equipped with much better speakers—but dual haptic and aural feedback is certainly a potential avenue for future innovation. A more interesting possibility, however, is implementing advanced haptic feedback throughout the entire user interface—not just in the context of clicking. The Wired article linked earlier actually delved into this topic a bit, pointing out something in a recent iMovie changelog:

“When dragging a video clip to its maximum length, you’ll get feedback letting you know you’ve hit the end of the clip. Add a title and you’ll get feedback as the title snaps into position at the beginning or end of a clip. Subtle feedback is also provided with the alignment guides that appear in the Viewer when cropping clips.”

Think about some of the user interface innovations in Material Design: Google carefully crafted the Material Design standard to evoke the feeling of paper in three dimensions. Every drop shadow, color gradient, and outline was intentionally designed to communicate this concept. But imagine that not only can you *see* the paper elements, you can feel their edges. Imagine if, when you scroll to the end of a page of content on your phone, you don’t just get that visual bump indicating you’ve reached the end—you feel it as well. And consider how amazing it would be to feel the momentum and weight of cards as you swipe them out of your Recent Apps list. This may sound like nuclear fusion—always 10 years away—but it’s not. I don’t intend to turn this discussion into a gigantic fluff piece for Valve, but once again, the Steam Controller purports to do exactly this, and initial reviews have been positive:

“Feel the spin of a virtual trackball, the click of a scroll wheel, or the shot of a rifle. Every input, from the triggers to the trackpads, can offer haptic feedback to your fingertips, delivering vital, high-bandwidth, tactile feedback about speed, boundaries, thresholds, textures, or actions.”

The Steam Controller relies on exactly the same fundamental physics as Apple’s Force Touch—pressure-sensitive multi-touch trackpads underlain by an array of linear actuators. If Valve can do it, surely Apple and Google can do it, too.
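
None of this needs exotic hardware to prototype in a crude form. As a very rough stand-in for the scroll-edge idea above, the hypothetical Android view below fires a single haptic tick when scrolling is clamped at a boundary, using the stock whole-phone vibration rather than anything localized:

```java
// A crude approximation on today's Android hardware: buzz once when a scrolling
// view hits its end. Nothing like the localized feedback described above, but
// it shows where such a hook would live. The class itself is hypothetical.
import android.content.Context;
import android.util.AttributeSet;
import android.view.HapticFeedbackConstants;
import android.widget.ScrollView;

public class EdgeFeedbackScrollView extends ScrollView {

    private boolean atEdge = false;

    public EdgeFeedbackScrollView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    protected void onOverScrolled(int scrollX, int scrollY,
                                  boolean clampedX, boolean clampedY) {
        super.onOverScrolled(scrollX, scrollY, clampedX, clampedY);
        // Fire a single tick the moment the scroll is clamped at a boundary.
        if (clampedY && !atEdge) {
            performHapticFeedback(HapticFeedbackConstants.CLOCK_TICK);
        }
        atEdge = clampedY;
    }
}
```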

Who Else is Working On It?

Well, besides Apple, there hasn’t been a peep out of either Google or the major OEMs about any similar technology or functionality. The rumor mill has been silent about anything similar, too. Apple has moved into such unknown territory that it could very well be that everyone is in wait-and-see mode. There may be fundamental issues with implementing this tech in smartphones that those of us who aren’t mobile device engineers aren’t privy to. No one truly knows whether Force Touch can be successfully implemented in smartphones and tablets; all of the rumors about Force Touch on the iPhone 6S are exactly that—rumors (in fact, the existence of the iPhone 6S itself is a rumor). Furthermore, if Apple never develops Force Touch beyond its current state, its future may well be limited to that of a gimmick, activating various shortcuts in iOS and generally being a curiosity rather than a revolution. There is no guarantee of a future where, on a Force Touch surface, we can “feel the spin of a virtual trackball, the click of a scroll wheel, or the shot of a rifle.”

Nevertheless, if the future does look promising for the new tech, there is one OEM that may be in a position to introduce a similar technology—Motorola. I mentioned earlier the Apple patent on a “method and apparatus for localization of haptic feedback,” filed in July 2009. Nearly a year earlier, though, Motorola filed a patent for an “electronic device with localized haptic response.” Sound familiar? And whereas Apple’s patent primarily uses a trackpad as its example, Motorola’s patent specifically uses a touchscreen cellphone with virtual buttons to demonstrate the idea.

And how do the two patents compare? Apple’s deals with localizing haptic feedback by generating constructive interference of waves beneath the user’s finger and destructive interference elsewhere. Motorola’s mentions the use of an array of linear actuators to recreate the feeling of pressing a real button on a touchscreen. Though the patent is mum on exactly how they intend to use the array of actuators to achieve this, one could surmise that it’s probably much the same way Apple is doing it, given that both are working with the same basic hardware.

Conclusions

Force Touch has great potential to be a paradigm shift in how we touch and interact with our mobile devices, but it requires software implementation as innovative and sophisticated as the hardware to really shake up the industry. Advanced haptic feedback, driven by arrays of linear actuators, has the potential to simulate all manner of touch sensations, ranging from the click of a mouse to the edge of a virtual boundary. However, these are just potential capabilities of the technology—whether they’re actually designed and implemented is another matter entirely. If Apple is content with presenting Force Touch only as a gimmick—primarily used for shortcuts and other insignificant features within iOS and OS X—there are ways to simulate that in Android. XDA member tkgktyk built an Xposed module, “Force Touch Detector,” that can detect touch pressure either by actual measurement or via the surface-area method (depending on the measurement capabilities of your particular phone), and it’s even more customizable than Apple’s own implementation of Force Touch gestures in OS X.
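
For reference, Android's standard touch APIs already expose both of the signals such a module can draw on: MotionEvent.getPressure() for the hardware's pressure estimate and MotionEvent.getSize() for the contact-area stand-in. The listener below is a minimal sketch, not the module's actual code, and the threshold is an arbitrary placeholder that would need per-device tuning:

```java
// Minimal sketch of pressure- and size-based "force touch" detection on stock
// Android. Not the Force Touch Detector module's code; threshold is invented.
import android.view.MotionEvent;
import android.view.View;

public class SimpleForceTouchListener implements View.OnTouchListener {

    private static final float FORCE_THRESHOLD = 0.75f;  // placeholder value

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            // Prefer real pressure data; fall back to normalized contact size.
            float pressure = event.getPressure();
            float size = event.getSize();
            float signal = pressure > 0 ? pressure : size;
            if (signal >= FORCE_THRESHOLD) {
                onForceTouch(v);
                return true;
            }
        }
        return false;
    }

    private void onForceTouch(View v) {
        // Hypothetical handler: trigger whatever shortcut the firm press maps to.
    }
}
```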

At the end of the day, though, the lack of the advanced, localized haptic feedback that makes the Force Touch trackpad so remarkable makes the Force Touch gesture unimpressive on all current smartphone hardware. And if Apple never brings anything to the table beyond one-click turn-by-turn navigation or looking up words in a dictionary in Safari with a Force Click, there will be little pressure on Android and Google to innovate and bring their own advanced haptic feedback to the table. Regardless of what happens with Force Touch in the near term, the technology for advanced haptic feedback is both achievable and demonstrable, and it’s just a matter of time until somebody implements it well enough to show the public how amazing it can be. Though at present the idea of feeling UI elements on a touchscreen seems like a future technology, that future may come upon us quicker than we expect.

 

What do you think about this upcoming technology? Will it be implemented as a gimmick, or will it shine? Tell us below!



Source: xda-developers — How and Why Force Touch Can Revolutionize Smartphone Interfaces (http://ift.tt/1E0OED1), via IFTTT
