This is great. I have an x61t, too.
Originally Posted by pibach
Seems like a good way of accomplishing this sort of thing. I'll mention it in the documentation. As for defining actions consisting of multiple strokes, I might introduce a simple state-based model (since we will need a concept of state for application-dependent gestures anyway), but I somehow resist the idea of introducing full-fledged scripting capabilities.
Edit: I did define some macros successfully using xte command plus str option. In the above example just place in the command field: xte "str with kind regards". Works great!
That's odd, I don't see flickering in Metacity.
Edit2: Regarding flickering: I am not using Compiz but XFCE without compositing. When I switch on the built-in compositing, the flickering is gone! (Unfortunately, the old scrolling performance problems on complex web pages come back then.)
The 'Standard' setting in Preferences/Appearance/Method... might work better for you, but it will prevent screen updates until the gesture is over.
Fixed. These weren't really crashes; I previously used the system() call, which would wait for the application to finish before continuing. Of course, you can easily work around this by appending '&' to your command if you don't want to switch to the development version.
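A quick way to see the difference this makes: the sketch below uses sleep as a stand-in for a launched application (just an illustration, not easystroke's actual code) and compares a blocking invocation against one backgrounded with '&'.

```shell
#!/bin/sh
# Blocking: the shell waits the full two seconds, just like the old
# system() call waited for the launched application to exit.
start=$(date +%s)
sh -c 'sleep 2'
blocked=$(( $(date +%s) - start ))

# Backgrounded: appending '&' means the child shell returns
# immediately while sleep keeps running on its own.
start=$(date +%s)
sh -c 'sleep 2 &'
backgrounded=$(( $(date +%s) - start ))

echo "blocking took ${blocked}s, backgrounding took ${backgrounded}s"
```

So a command field like "xvkbd &" returns control to easystroke right away instead of stalling it until the application exits.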
* Regarding the crash, I could pinpoint it: it occurs when I perform a gesture associated with launching xvkbd. This always crashes. Another gesture launching cellwriter only crashes from time to time. All other gestures I have defined work fine.
This is a great idea and I shouldn't give up on it too early, but it might not be possible: The X server is not aware of what widgets windows consist of, so there's no way to tell where the text input fields are.
* Regarding the activation button: what about this — after placing the cursor into an input field, all strokes around that cursor position activate on left click? This would be similar to MacOS inking.
EDIT: It looks like I might be able to abuse the XDND drag-and-drop protocol, which is based around the idea that the source application must grab the server in order to track the pointer outside its window and to change the pointer, so the target application can't get any mouse events and has to rely on the source to provide it with the necessary information. X is a mess.
As for defining a "shift gesture" that causes the next character to be upper case, I would associate the command "touch /tmp/shift" with the shift gesture and then use the command "/path/to/key ..." to press the keys, where key is the following shell script (untested):
#!/bin/sh
if [ -f /tmp/shift ]; then
    xte "keydown Shift_L" "key $1" "keyup Shift_L"
    rm -f /tmp/shift
else
    xte "key $1"
fi
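A dry run of the same state-file trick, with echo standing in for xte so the branching can be checked without an X display (key_stub and the per-process flag path are just for this demonstration; the post above uses /tmp/shift):

```shell
#!/bin/sh
# key_stub mimics the key script above, but prints the xte arguments
# instead of invoking xte, so we can observe which branch was taken.
FLAG=/tmp/shift.$$   # per-process path so we don't clobber a real flag

key_stub() {
    if [ -f "$FLAG" ]; then
        # Flag set by the "shift gesture": wrap the key in Shift,
        # then consume the flag so it only affects one keypress.
        echo "keydown Shift_L key $1 keyup Shift_L"
        rm -f "$FLAG"
    else
        echo "key $1"
    fi
}

touch "$FLAG"          # what the shift gesture's touch command does
first=$(key_stub a)    # shifted: the flag is present and gets consumed
second=$(key_stub a)   # unshifted: the flag is already gone
echo "$first"
echo "$second"
```

The point is that the flag is one-shot: the first keypress after the shift gesture is wrapped in Shift_L and removes the flag, so the second comes through unmodified.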
Okay, I've taken the plunge and allowed gestures to be initiated as long as we are receiving XInput events. This seems to work better than I expected, but the fundamental problem remains: If the server is grabbed (usually because an app is showing a menu), the app will still receive the click event.
When I type in the URL bar of Firefox, the popup listbox steals focus.
I've attached another snapshot. Again, there have been lots of changes under the hood. If easystroke is acting weird, you should now be able to fix things by pressing Escape (but unfortunately, that means that gestures sending an Escape key will not work for the time being).