## Description
EDIT: Implemented in #3876.
## Goals

### Unified Pointer Handling
We currently handle `Touch` completely separately from mouse events (`WindowEvent::Cursor*`); the initial idea, I assume, was that the mouse event type doesn't really fit all the information that touch wants to express.
This would make sense, but it comes with a big downside: if users don't explicitly handle the `Touch` event, their application just won't work on touch devices.
This issue will only grow if we want to introduce pen/stylus support (#99). So to address this, we take inspiration from the Web Pointer Events Specification, which was mainly introduced to address this exact problem. The idea is to have a single event for all pointer input, which contains details about all possible inputs, be it touch, pen/stylus or anything else. That way users don't have to explicitly handle certain input types, but can still choose to do so, without accidentally failing to support one.
Addresses #1555 (doesn't fix it, because there is a performance component to this on Wayland that would not be addressed by this proposal).
Fixes #336.
### Pen/Stylus Support
To support pen/stylus, we run into the same problem as above: we can't introduce a separate event, otherwise applications will suddenly stop working when a pen is used and they don't explicitly handle it.
Additionally, this will require adding much more data to our current `WindowEvent::Cursor*` events. So while we are at it, we should figure out an API that will allow us to add support for more pointer types in the future.
Addresses #99 (implementation is being done in #3810).
### Position Data in Pointer Events
We are currently missing position data in all pointer events except `WindowEvent::CursorMoved`. This isn't just annoying for users, who have to handle two events to get the information they want; it also requires backend implementations to account for it and send "fake" `WindowEvent::CursorMoved` events to report the position to users.
The solution should be quite simple: add `position: PhysicalPosition<f64>` to all pointer events.
## Implementation
- Rename `DeviceEvent::MouseMotion` to `DeviceEvent::PointerMotion` to represent any pointer device.
- Rename `WindowEvent::Cursor*` to `WindowEvent::Pointer*`. "Cursor" is more commonly used when describing the visual representation of a "Pointer", which we would like to avoid when talking about touch or pen/stylus.
- Add `position: PhysicalPosition<f64>` to `WindowEvent::PointerEntered`/`Left`.
- Rename `WindowEvent::MouseInput` to `WindowEvent::PointerButton`.
- Fold `WindowEvent::Touch` into pointer events:
  - `TouchPhase::Started` -> `PointerButton` with `ElementState::Pressed` & `PointerEntered`
  - `TouchPhase::Moved` -> `PointerMoved`
  - `TouchPhase::Ended` -> `PointerButton` with `ElementState::Released` & `PointerLeft`
  - `TouchPhase::Cancelled` -> `PointerLeft`
- Introduce `PointerType`, `PointerSource` and `ButtonSource` enums, which can hold information about any device type. `ButtonSource` should be convertible to `MouseButton`, so users can do general pointer input handling without understanding each device type.
### Proposed API
```rust
pub enum DeviceEvent {
    // renamed from `MouseMotion`
    PointerMotion { ... },
    ...
}

pub enum WindowEvent {
    // renamed from `CursorMoved`
    PointerMoved {
        device_id: DeviceId,
        position: PhysicalPosition<f64>,
        // new field
        source: PointerSource,
    },
    // renamed from `CursorEntered`
    PointerEntered {
        device_id: DeviceId,
        // new field
        position: PhysicalPosition<f64>,
        // new field
        r#type: PointerType,
    },
    // renamed from `CursorLeft`
    PointerLeft {
        device_id: DeviceId,
        // new field
        position: PhysicalPosition<f64>,
        // new field
        r#type: PointerType,
    },
    // renamed from `MouseInput`
    PointerButton {
        device_id: DeviceId,
        state: ElementState,
        // new field
        position: PhysicalPosition<f64>,
        // changed type from `MouseButton`
        button: ButtonSource,
    },
    // removed, folded into `Pointer*` events
    // Touch(Touch),
    ...
}

// removed, folded into `PointerType/PointerSource::Touch`
// pub struct Touch { ... }

// new types from here on out

pub enum PointerType {
    Mouse,
    Touch,
    Unknown,
}

pub enum PointerSource {
    Mouse,
    Touch { finger_id: FingerId, force: Force },
    Unknown,
}

impl From<PointerSource> for PointerType { ... }

pub enum ButtonSource {
    Mouse(MouseButton),
    Touch { finger_id: FingerId, force: Force },
    Unknown(u16),
}

impl ButtonSource {
    fn mouse_button(self) -> MouseButton;
}
```
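For illustration, here is a rough sketch of how an application could consume these events under the proposed API. It assumes the types above; the `println!` bodies are placeholders:

```rust
// Sketch only: assumes the proposed `PointerButton` variant and the
// `ButtonSource::mouse_button()` helper from this proposal.
fn handle(event: WindowEvent) {
    if let WindowEvent::PointerButton { state, position, button, .. } = event {
        // Device-specific details are available when wanted...
        if let ButtonSource::Touch { finger_id, force } = &button {
            println!("touch by finger {finger_id:?} with force {force:?}");
        }
        // ...but generic handling works for every pointer type, because any
        // `ButtonSource` can be viewed as a `MouseButton`.
        if state == ElementState::Pressed && button.mouse_button() == MouseButton::Left {
            println!("primary press at {position:?}");
        }
    }
}
```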
## Alternative Changes

### Keep `WindowEvent::Touch`
We could additionally emit the `Touch` event, and introduce a new event for each device type, for user convenience; having a dedicated place to handle a specific device type is certainly a nice touch.
However, it would lead to a duplication of events, which personally I'm not a big fan of.
### Remove position from `WindowEvent::PointerLeft`
While this is certainly useful, I don't know if it makes sense on all backends; it certainly works on Web.
We could either wrap it in an `Option`, if this information is not available on all backends, or remove it entirely.
### Add `WindowEvent::PointerChange` event
Pointer devices like touch or pen/stylus can update their details without actually moving. In the current proposal this would use `PointerMoved`, but we could introduce a new `PointerChange` event to address this.
However, like other pointer events, we would probably still want to include a position field, so the data would look exactly like `PointerMoved`.
### Add `KeyType`
Maybe "Key" is not the best term used here, but we might want to use a similar device type distinction for keyboard input as well. This would probably include renaming the KeyboardInput event to KeyInput. The question is really what device types would show up here.
My understand is that devices like Joysticks, Steering Wheels and things like that would all fall under gamepads, which we already decided can be fully implemented by external libraries without any help by Winit.
The reason this has come up here is because of Waylands tablet protocol, which exposes a very strange amount of graphic tablet specific input events. However AFAIK this is very Wayland specific and no other backend exposes this amount of specific information. No matter how we want to address it though, it should be at least routed to KeyboardInput for the same reason we want to route all pointer events to Pointer* events.
### Gestures
The idea would be to implement a new `Gesture` event with an enum containing all possible gestures. This is only related to this proposal because we might want to add `PointerType`/`PointerSource` (or a variation of it) to it.
We currently have the following gestures:
- `DoubleTapGesture`
- `PanGesture`
- `PinchGesture`
- `RotationGesture`
And more are coming, like #3413. Additionally, I would argue that we could fit `MouseWheel` and pen/stylus "actions" into this as well (#3768).
I think this would make it much easier to add new gestures and would centralize the place to handle them. Adding `PointerType`/`PointerSource` to it would also allow users to identify what kind of device a gesture is coming from, which brings me to the next topic.
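As a rough illustration (all names here are hypothetical, not part of the proposal above), such an event could look something like this:

```rust
// Hypothetical sketch: one event wrapping all gesture kinds, tagged with the
// pointer source so users can tell e.g. touchscreen from touchpad input.
pub enum GestureKind {
    DoubleTap,
    Pan { delta: PhysicalPosition<f64> },
    Pinch { delta: f64 },
    Rotation { delta: f32 },
}

pub enum WindowEvent {
    Gesture {
        device_id: DeviceId,
        source: PointerSource,
        kind: GestureKind,
    },
    // ...
}
```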
### Touchpad
This is related to gestures and #2379. The idea is that we need more information on touch events when they come from a touchpad and not a touchscreen. This could easily be addressed by the current proposal by adding a `Touchpad` variant to `PointerType`/`PointerSource` (where we could also expose the touchpad coordinates), as sketched below.
I don't know how this would look implementation-wise on other backends, but Web has an open issue for that: #2963.
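A minimal sketch of what that variant could look like, assuming we expose pad-local coordinates (the field names are illustrative):

```rust
// Hypothetical extension of the proposed enum; not part of the proposal above.
pub enum PointerSource {
    Mouse,
    Touch { finger_id: FingerId, force: Force },
    // New variant: touch input coming from a touchpad, with the finger's
    // position in the touchpad's own coordinate space.
    Touchpad { finger_id: FingerId, pad_position: PhysicalPosition<f64> },
    Unknown,
}
```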
## FAQ

### How would touch events work?
- `PointerEntered`
- `PointerButton` with `ElementState::Pressed` \*
- `PointerMoved`
- `PointerButton` with `ElementState::Released` (when not cancelled) \*
- `PointerLeft`

\*: Only emitted when it makes sense, e.g. on a touch screen. A touchpad doesn't emit those unless actually pressed.
This is exactly like using a mouse, fulfilling the purpose of this proposal.
The only difference is that all events come with `PointerSource`/`PointerType::Touch`, exposing the `FingerId` and `Force`.
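For example, an application that wants per-finger data could match on the source, while everything else keeps working through the generic path (a sketch using the proposed types):

```rust
// Sketch only: uses the proposed `PointerMoved` event and `PointerSource`.
fn handle(event: WindowEvent) {
    if let WindowEvent::PointerMoved { position, source, .. } = event {
        match source {
            // Touch-specific path, with access to finger and force data.
            PointerSource::Touch { finger_id, force } => {
                println!("finger {finger_id:?} at {position:?}, force {force:?}");
            }
            // Mouse and unknown pointers arrive through the same event.
            _ => println!("pointer at {position:?}"),
        }
    }
}
```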
### How would pen/stylus input work?
Exactly the same as touch events, but coming with `PointerSource`/`PointerType::Pen`/`Eraser`/`Airbrush`, exposing any relevant information.
### How would pen/stylus hover work without touching the screen?
Exactly like touch events, but not emitting `PointerButton` unless the pen/stylus actually makes contact.
This would also be exactly how touchpads work.
### How would changes to e.g. tilt in pen/stylus be represented?
By using the `PointerMoved` event. See the "Add `WindowEvent::PointerChange` event" alternative.
### How would graphic tablet buttons be represented?
See the "Add KeyType" alternative.
In any case, they have to through regular KeyboardInput unless we want to repeat the same issue that motivates this proposal: if users don't explicitly handle it, the app won't react at all. This would sometimes make sense when certain actions are not translatable to e.g. keyboard input, but often they are.