Huh. Touch events are weird, and some apps have secret purposes for them.
TLDR: Single-finger touch events seem to be read differently from trackpad taps, with hidden touch-scrolling features in certain apps.
I might simply be late to the party on this one, but I noticed something kinda cool.
Bit of background: I got an Asus x202e recently (lovely, lovely piece of kit). It has a touchscreen. The touchscreen is, of course, nigh-useless in desktop Ubuntu, and the x202e's is more useless still, thanks to a quirk where it gets stuck in a left-click event and won't register any more left-clicks until it feels better about it. That said, I've been able to play with some touch functionality in Unity and Gnome, and I admit it has become my preferred method of pausing YouTube videos and scrolling web pages in Chrome (with a touch extension installed).
I knew there was some touch stuff associated with desktop Unity. Four-finger tap and drag gestures pull up the Launcher and Dash; a three-finger tap calls the much-loved Compiz "Love Handles" plugin (which is mercifully easy to change back to a middle-mouse click* now). There's also a very finicky application switcher based on performing a very rapid and elaborate sequence of three-finger taps and swipes.** I also knew that the trackpad and touchscreen read slightly differently: when I handed the three-finger tap over to Synaptics, the love-handles behavior still worked from the touchscreen, and two-finger tapping on the touchscreen didn't right-click. (Duh, right? Synaptics isn't involved.)
The bit I found interesting: I didn't know that some core apps respond differently to touch input. I noticed it first in Nautilus: tapping and dragging the background scrolls the window contents, with a special stretch-and-bounce behavior at the top and bottom of the list. Since Nautilus is a core Gnome app, I tried Web (née Epiphany) and got the same behavior, so I thought perhaps it was a Gnome thing. But then I accidentally discovered that it works in Ubuntu Software Center, too.
Is this something everyone else already knew about? I had assumed that uTouch was mostly just sending along the same sorts of events that multitouch trackpads do. I didn't realize that there was a separate set of events for touchscreens, or that there was development going on within the core apps to support them. I guess it makes sense as part of making these apps touch-friendly when they're running under Ubuntu Touch, without the configuration and chrome that desktop Unity provides, but it seems to be a GTK3 thing rather than an Ubuntu thing. (And GTK3 is supposed to support multi-touch, but I'm not really sure what that means.)
So who's writing this stuff, and what does it do?
* UF compose windows don't support middle-click paste. Bad show, UF.
** Typing up this topic led me to mess with Touchégg and rebuild Unity to let gestures pass through to it, which makes my trackpad better (yay Exposé gesture, I've missed you so!) and the touchscreen even worse - I never really know what tapping with a single finger on the touchscreen is going to do now. But interestingly enough, even when it stops working as a click event, it works as expected for most of the controls inside Epiphany, Gnome, and USC, as well as for the touch scrolling. Tapping and dragging in Gnome Terminal still selects text, too. All of that applies only if the window is focused, though.
EDIT: Should this be posted in Ubuntu, Linux, and OS Chat? Sorry if I misplaced it!
Last edited by CharlesA; June 17th, 2013 at 07:44 PM.
Reason: made small font normal size again - it was kinda hard to read
I know I shouldn't use tildes for decoration, but they always make me feel at home~