Mr. Picklesworth
August 25th, 2008, 07:03 AM
I have built a patch for Metacity that allows one to drag a window from any unused portion of it. It is a very simple change (in fact, it is a removal of the code that stops the behaviour in the first place -- more on that later). It is like how Matchbox (a.k.a. the window manager used in Maemo) behaves with dialogs set to Free position mode instead of Fixed.
Here is a mailing list discussion on it:
https://lists.ubuntu.com/archives/ubuntu-devel-discuss/2008-August/005330.html
Here is my branch on Launchpad:
https://code.launchpad.net/~dylanmccall/metacity/drag-from-anywhere
To test this, download the code from my branch on Launchpad. Build it by running sh autogen.sh and then make. DO NOT INSTALL IT. Instead, test it by running "./src/metacity --replace" after building. (That way, no long-term changes should occur.)
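For reference, the whole sequence looks roughly like this. (I am assuming you have bzr installed, since Launchpad branches are Bazaar branches; the lp: shortcut is just a convenience for the full branch URL above.)

```shell
# Fetch the branch from Launchpad (requires bzr)
bzr branch lp:~dylanmccall/metacity/drag-from-anywhere
cd drag-from-anywhere

# Build in place -- do NOT run make install
sh autogen.sh
make

# Swap in the patched metacity for the current session only.
# Logging out and back in (or running your usual window manager
# with --replace) restores the stock behaviour.
./src/metacity --replace
```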
Often, this is really superbly cool. For example, with the patch applied I can drag just about any simple GTK window (like the panel's Properties dialog) from anywhere that feels 'fixed' rather than detachable in the sense of our visual metaphors -- technically, anywhere that is not expecting mouse input for other reasons. Where it works well, I think this gives the desktop a more cohesive, physical feel. That is exactly what we are heading towards with artistic design and with a lot of composited effects (like the wobbly windows in Compiz). I like to think of windows as playing cards being pushed about, so it fits that little metaphor well.
However, this falls apart with more complex applications and with some other widget toolkits. GTK toolbars, menubars and some containers (like the one in GNOME Appearance Preferences) simply cannot be used to drag the window -- even though intuitively they should work -- because they act like black holes for mouse events. Firefox doesn't work. OpenOffice doesn't work. (Midori, Epiphany, AbiWord and Gnumeric do. *Hint, hint*.)
This seems to be something overlooked in the design of windowing systems and UI toolkits in general. It looks like as soon as a child widget is listening for button events, those events never reach the parent -- even when the child receives an event it has no use for (because it is disabled, say, or the click missed the actual target area, as with toolbars and menubars). In principle, the handlers are supposed to return 0 ("not handled") somewhere along the chain so the event can keep propagating.
I can't help but wonder whether this is necessary...
My thought is that it probably could be fixed, but it could take a huge amount of digging. It would also require a lot of bug reports to be filed along the lines of "this application (or this widget) is sinking mouse events that it does nothing with". For this to work, it would have to kick up a lot of dust; the effect itself is consistency or nothing.
Hence this post. Interface-wise, this is an arguably trivial thing: you would be able to drag a window by its background instead of only by its title bar, and that behaviour would be consistent instead of appearing only in some places.
For example, Mac OS and Windows Vista have this type of behaviour in some default apps. Usually you can drag only by the toolbar and the status bar, even though other parts of the window are just as stationary. It is neither intuitive nor predictable, because it is implemented at the level of each individual application instead of in the toolkit. The window manager itself should handle this; a toolkit doing it is duplicating functionality that could be subject to change.
I think this will be important as we move towards more touch-based interfaces, where clicking on a title bar is neither feasible nor intuitive with how a touch interface is usually designed. Do you think this is worth pursuing? What do you think of the idea so far? Does it break anything?
Most importantly: Would you find it a problem if windows behaved the way I propose, or would you welcome the change?