trackpad gestures for switching workspaces - Desktop Customization & Workflow
i have a thinkpad and use the trackpoint exclusively for mousing, so there's a big part of my wristrest that sits there and does nothing. i used to have a mac and i really liked having the ability to swipe between workspaces using the trackpad. a lot of *nix DEs have the ability to interpret gestures but with WMs, you're pretty much stuck configuring it yourself.
today i decided to get this working and it was a lot easier than i thought it would be. this applies to linux - i haven't tested it on anything else and i don't know if libinput even works on other OSes, so ymmv.

requirements:
- libinput-gestures
- xdotool (for non-EWMH WMs) or wmctrl (for EWMH WMs)

instructions:
1. install libinput-gestures and xdotool, but hold off on starting libinput-gestures.
2. in your wm, configure keyboard shortcuts to switch workspaces/groups*. for example, in cwm, create bind-key lines for group-cycle and group-rcycle (probably works best with sticky groups).
3. open the libinput-gestures config file, find the swipe left/right gestures, and change the xdotool command to match the wm binding.
4. reload your wm config and start libinput-gestures.

*if you can send a command directly to your wm, i think you can just replace the xdotool command with the workspace switching command and skip adding a keybind to your wm config. i haven't tested this though.

this could be pretty useful in conjunction with wmutils!
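to make the cwm example concrete, here's a sketch of what the two config files might look like. the super+Right/super+Left keybinds are just an assumption for illustration - use whatever keys you actually bound in your wm:

```
# ~/.cwmrc - bind keys for group cycling ("4" is the super/mod4 key)
bind-key 4-Right	group-cycle
bind-key 4-Left		group-rcycle

# ~/.config/libinput-gestures.conf - 3-finger swipes send those
# (assumed) keybinds to the wm via xdotool
gesture swipe left 3	xdotool key super+Right
gesture swipe right 3	xdotool key super+Left
```

after editing, reload cwm and restart libinput-gestures for the changes to take effect.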
-------
nvsbl.org
I like the topic of gestures.
I've added a gesture of sorts to my desktop: when I put the mouse in the top-right corner, the time and date pop up.
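For anyone who wants a hot corner like this on X11 without DE support, xdotool can watch screen edges. A sketch (the `show-clock.sh` script is hypothetical - point it at whatever should pop up):

```
# run a command when the pointer sits in the top-right corner for 500 ms
xdotool behave_screen_edge --delay 500 top-right exec ~/bin/show-clock.sh
```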
I've personally always used scrolling on my bar (always at the bottom across all screens) to switch workspaces. It's not a "gesture" as such, but it has the same effect, especially since I can just throw the cursor to the bottom of the screen and always end up over the bar.
The bar I'm currently using (swaybar) used to switch workspaces on every single scroll event, which made it useless with precise trackpads like the one in my laptop, but I got a patch merged to fix that: https://github.com/swaywm/sway/pull/5067
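For reference, a scroll-on-bar setup like this can be sketched in an i3/sway config - both support `bindsym` inside the `bar` block, and buttons 4/5 are wheel up/down (this is an illustration, not mort's exact config):

```
bar {
    position bottom
    # scrolling anywhere over the bar cycles workspaces
    bindsym button4 workspace prev
    bindsym button5 workspace next
}
```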
Very interesting! i also switched to a thinkpad from my mac. it would actually be quite useful to switch between workspaces with the trackpoint and scroll button - will have to try this out.
Let's bring back "gestures". Do you use gestures on your machine? If yes, for what? Any other UX trick that you know is related?
(13-08-2020, 10:18 AM)mort Wrote: I've personally always used scrolling on my bar (always at the bottom across all screens) to switch workspaces. It's not a "gesture" as such, but it has the same effect, especially since I can just throw the cursor to the bottom of the screen and always end up over the bar.

This is something I do too, but it's really just scrolling. The point of mouse gestures is that you don't have to reach a specific screen area to trigger the action. Mouse gestures were the killer feature of Opera before they changed their internal renderer to adopt Blink. It's one of the reasons why I'm still using Vivaldi.

When I'm writing text, I think a good UI should be comfortable to use with the keyboard alone. When I'm rapidly switching tasks and tools, I think a pointer device is more comfortable, so a good UI should be comfortable with both keyboard and pointer in an "exclusive manner" (either pure keyboard, xor pure mouse), even if it's obviously fine to have something usable with both devices at the same time.

I already thought about using mouse gestures for my window manager, but never did anything in that direction. Building an environment for that would probably take a lot of effort, since unlike with the keyboard, applications tend to "steal" all five mouse "buttons" - right click, left click, middle click, wheel up, wheel down - and unlike with the keyboard, that's usually not configurable. It would probably require specifically designed applications, ones *not* using Qt, Gtk, or anything based on those toolkits (because context menus and the like are probably deeply built in, and it would take too many changes to get them right). Nowadays, for such an application to work, it would probably have to support both the X11 and Wayland protocols. For X11, it seems doable: the thing is mature and complete. For Wayland?
I suppose it is very, very far from that point (that's the subjective feeling I get from everything I read about Wayland).
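The recognition side of mouse gestures is actually the easy part. As a minimal sketch (not any particular gesture library's API), a stroke recorded as pointer samples can be reduced to its net displacement and mapped to a direction:

```python
# Minimal sketch: classify a pointer stroke as a gesture direction.
# Input is a list of (x, y) samples from a pointer device.

def classify_stroke(points, min_dist=30):
    """Return 'left', 'right', 'up', 'down', or None for short strokes."""
    if len(points) < 2:
        return None
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_dist:
        return None                      # too short to count as a gesture
    if abs(dx) >= abs(dy):               # dominant axis wins
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"    # screen y grows downward

if __name__ == "__main__":
    stroke = [(100, 100), (140, 105), (200, 110)]  # mostly rightward
    print(classify_stroke(stroke))  # right
```

The hard part, as described above, is getting the events at all: an application or toolkit usually consumes button presses before a WM-level gesture engine ever sees them.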
The only touchpad gestures I use are those to simulate the mouse:
These are the commands I use to set this on OpenBSD:

Code:
synclient TapButton1=1

Unfortunately, I don't think there is a way to set proper gestures on OpenBSD (that is, binding gestures to commands). I thought about setting hot corners (which are not touchpad gestures, but a mouse thing) to open the desktop menu, but then I remembered that I use multiple monitors, so there is no way for the pointer to be stopped by barriers on the corner of a monitor.