Most of this week went into improving touch support in swc. It's still not perfect, but I believe I have squashed the last major bugs in my initial implementation, the bulk of which was written in the past couple of days. As a test program, I used my Wayland port of svkbd, which itself still needs a lot of work.
For now, the implementation is pretty simple: when a finger touches the screen, the compositor determines which view, if any, the touch point is in, and sends touch events to the client associated with that view. If the finger leaves the view, the point is considered "dead", and no more events for that point are sent to clients, even if there is a client under the finger. This obviously doesn't work if windows can be rearranged while a finger is down (events may be sent to the wrong view), so for now we just assume that the window arrangement is static while a finger is down. In some cases, it may be desirable to drag and drop things between different windows (especially on a touch-based system), so in the long run it probably doesn't make sense to "kill" touch points after they leave their view. Most importantly, compositor gestures don't fit into this model. The goal for now was just getting a simple, usable touch implementation.
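To make the model concrete, here is a rough sketch of what this kind of per-point tracking could look like. This is not swc's actual code: the view type, the view_at() hit test, and the view_send_touch_*() calls are all hypothetical stand-ins for whatever the compositor actually uses.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stddef.h>

#define MAX_TOUCH_POINTS 16

/* Hypothetical compositor-side helpers, not swc's real API. */
struct view;
struct view *view_at(int32_t x, int32_t y);          /* hit test */
void view_send_touch_down(struct view *v, int32_t id, int32_t x, int32_t y);
void view_send_touch_motion(struct view *v, int32_t id, int32_t x, int32_t y);
void view_send_touch_up(struct view *v, int32_t id);

/* One slot per touch ID. A point goes "dead" when it leaves the view
 * it started in, and stays dead until the finger lifts. */
struct touch_point {
	bool active;
	bool dead;
	struct view *view; /* the view the touch began in */
};

static struct touch_point points[MAX_TOUCH_POINTS];

void handle_touch_down(int32_t id, int32_t x, int32_t y)
{
	if (id < 0 || id >= MAX_TOUCH_POINTS)
		return;
	struct touch_point *p = &points[id];

	p->view = view_at(x, y);
	p->active = true;
	p->dead = (p->view == NULL); /* touches outside any view are ignored */

	if (!p->dead)
		view_send_touch_down(p->view, id, x, y);
}

void handle_touch_motion(int32_t id, int32_t x, int32_t y)
{
	if (id < 0 || id >= MAX_TOUCH_POINTS)
		return;
	struct touch_point *p = &points[id];

	if (!p->active || p->dead)
		return;

	/* The window arrangement is assumed static while the finger is
	 * down, so the hit test tells us whether the finger has left its
	 * original view. A real implementation would likely also send a
	 * final up or cancel here so the client has no stuck point. */
	if (view_at(x, y) != p->view) {
		p->dead = true; /* no more events for this point */
		return;
	}

	view_send_touch_motion(p->view, id, x, y);
}

void handle_touch_up(int32_t id)
{
	if (id < 0 || id >= MAX_TOUCH_POINTS)
		return;
	struct touch_point *p = &points[id];

	if (p->active && !p->dead)
		view_send_touch_up(p->view, id);

	p->active = false;
	p->dead = false;
}
```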
To test this implementation, I added touch support to my svkbd Wayland port. This was pretty straightforward, since the keyboard already handled pointer input; the only genuinely new part was tracking multiple simultaneous touch points.
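On the client side, this boils down to implementing wl_touch_listener from the core Wayland protocol. Below is a minimal sketch of what the svkbd side might look like; the key type and the find_key()/press_key()/release_key() helpers are hypothetical, and the seat is assumed to be bound at a wl_seat version below 6, so the shape and orientation events (added in v6) never arrive.

```c
#include <wayland-client.h>

/* Hypothetical svkbd-side helpers: map a surface coordinate to a key
 * and update its pressed state. */
struct key;
struct key *find_key(double x, double y);
void press_key(struct key *k);
void release_key(struct key *k);

#define MAX_POINTS 8

/* Remember which key each touch ID is holding down, so several
 * fingers can press several keys independently. */
static struct key *pressed[MAX_POINTS];

static void touch_down(void *data, struct wl_touch *touch, uint32_t serial,
                       uint32_t time, struct wl_surface *surface,
                       int32_t id, wl_fixed_t x, wl_fixed_t y)
{
	if (id < 0 || id >= MAX_POINTS)
		return;
	pressed[id] = find_key(wl_fixed_to_double(x), wl_fixed_to_double(y));
	if (pressed[id])
		press_key(pressed[id]);
}

static void touch_up(void *data, struct wl_touch *touch, uint32_t serial,
                     uint32_t time, int32_t id)
{
	if (id < 0 || id >= MAX_POINTS || !pressed[id])
		return;
	release_key(pressed[id]);
	pressed[id] = NULL;
}

static void touch_motion(void *data, struct wl_touch *touch, uint32_t time,
                         int32_t id, wl_fixed_t x, wl_fixed_t y)
{
	/* A keyboard doesn't need drag behavior; ignore motion. */
}

static void touch_frame(void *data, struct wl_touch *touch)
{
}

static void touch_cancel(void *data, struct wl_touch *touch)
{
	/* The compositor took over the touch stream; release everything. */
	for (int i = 0; i < MAX_POINTS; i++) {
		if (pressed[i]) {
			release_key(pressed[i]);
			pressed[i] = NULL;
		}
	}
}

static const struct wl_touch_listener touch_listener = {
	.down = touch_down,
	.up = touch_up,
	.motion = touch_motion,
	.frame = touch_frame,
	.cancel = touch_cancel,
};
```

The listener would then be attached with wl_touch_add_listener() on the wl_touch obtained from wl_seat_get_touch(), alongside the existing pointer listener.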
Here's a video demonstrating the fruits of this week's labor: