
Adding touchpad multitouch support #2379

@simon-frankau

Description

Hi.

I'm interested in adding touchpad multitouch support to winit, with an implementation for macOS. I'm new to winit, so I'd be keen to get feedback on whether this is worth implementing before I start coding. I'll try to pre-empt a few questions here:

  • What is "touchpad multitouch support"? winit already supports multitouch, but (I believe) it assumes that you're touching the screen, so touches are in screen coordinates. This would be supporting touches where the coordinates are touchpad coordinates.
  • Why would this support be useful? I'm aware of Add touchpad magnify and rotate gestures support for macOS #2157 - touchpad gestures. This is another approach: Having the raw touches available allows extra flexibility, so that the library user can interpret a wider range of multitouch movements. For example, the touches can be interpreted to provide pinch-to-zoom with different scales on the different axes, or custom gestures.
  • So, couldn't similar functionality be provided by extending Add touchpad magnify and rotate gestures support for macOS #2157? Yes-ish. Recognition for extra gestures could be coded into winit, but passing touches straight through to the user seems more flexible, and potentially less complex for winit, than supporting such gestures itself.
  • Is this instead of Add touchpad magnify and rotate gestures support for macOS #2157? No. This provides more flexibility for those who want it, enabling access to raw touches as well as gestures. Gestures are still very useful.
  • Should winit support touchpad multitouch? Looking at the list in FEATURES.md, there is already support for a) multitouch and b) touchpad functionality. It feels to me like this sits well within the boundaries defined by existing features, although I do have sympathy for "Yeah, but can we avoid scope creep on potentially marginal features?".
  • What would this look like, API-wise? I think the current assumption is that Touch coordinates are screen coordinates. My suggestion would be to add an enum field to Touch recording the coordinate system used - something like "screen" for a touch screen, and "device" for a trackpad. Maybe these events should be delivered as DeviceEvents rather than WindowEvents?
  • Why do you personally want this? While I hope this feature would be useful to other people, I'd like egui to support zooming by different amounts in the X and Y axes from the same gesture, which is not supported by the macOS pinch gesture, but is supported with multitouch.
  • What platforms would you support? I only have macOS to hand, so my implementation would only cover macOS.
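
To make the suggestion concrete, here is a rough sketch of the API shape I have in mind, plus the per-axis pinch use case it would enable. All names here (`TouchSurface`, the fields on `Touch`, `per_axis_scale`) are hypothetical, not existing winit types:

```rust
// Hypothetical sketch only: `TouchSurface` and this `Touch` struct are
// illustrative names, not actual winit API.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TouchSurface {
    /// Touchscreen: coordinates are screen coordinates.
    Screen,
    /// Trackpad: coordinates are device (touchpad) coordinates.
    Device,
}

#[derive(Debug, Clone, Copy, PartialEq)]
struct Touch {
    id: u64,
    surface: TouchSurface,
    /// Position, interpreted according to `surface`.
    x: f64,
    y: f64,
}

/// Per-axis pinch scale from two touches sampled at two instants:
/// the ratio of the new finger separation to the old one, computed
/// independently for x and y. This is the kind of gesture raw touches
/// enable but a single magnification value cannot express.
fn per_axis_scale(before: (Touch, Touch), after: (Touch, Touch)) -> (f64, f64) {
    let dx0 = (before.0.x - before.1.x).abs();
    let dy0 = (before.0.y - before.1.y).abs();
    let dx1 = (after.0.x - after.1.x).abs();
    let dy1 = (after.0.y - after.1.y).abs();
    (dx1 / dx0, dy1 / dy0)
}
```

For example, two fingers moving apart horizontally while keeping their vertical separation would yield a scale like (2.0, 1.0): zoom in on X only, which the single-value pinch gesture can't represent.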

As I said at the start, I'm new to winit, so I may have made some incorrect assumptions, or be thinking about this incorrectly. Please do correct me.

Thanks,
Simon.
