The GSoC folks got together with Ryan and me today to talk about the SDL 1.3
touch input API and how it relates to manymouse support and the Nintendo DS
and iPhone ports.
Tablets and similar devices will use the current mouse API, which will be
extended with pressure, tilt, and rotation information. The iPhone will use
a new gesture API, to be designed later.
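To give a feel for what the extended mouse API discussed below might look like, here is a rough sketch of a motion event carrying the extra axes, plus a helper that maps a raw pressure reading into a normalized range. All names and fields are hypothetical illustrations, not the actual SDL 1.3 API:

```c
#include <assert.h>

/* Hypothetical sketch of an extended motion event; the field names
   are illustrative only, not the final SDL 1.3 API. */
typedef struct {
    int x, y;           /* absolute position in screen coordinates */
    int xrel, yrel;     /* relative motion since the last event */
    int pressure;       /* raw pressure reading from the device */
    int tilt_x, tilt_y; /* pen tilt, if the hardware reports it */
    int rotation;       /* pen rotation, if the hardware reports it */
} ExtendedMouseMotionEvent;

/* Map a raw pressure reading into [0,1] given the device's range. */
float normalize_pressure(int raw, int min, int max)
{
    if (max <= min) return 0.0f;
    if (raw < min) raw = min;
    if (raw > max) raw = max;
    return (float)(raw - min) / (float)(max - min);
}
```

Devices without pressure or tilt would simply report constant values for the extra fields, as the transcript discusses.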
The full transcript follows, for your information:
[10:08am] slouken: Okay, we’re all here. I’ll do quick introductions
[10:08am] slouken: slouken = Sam Lantinga, SDL guy
[10:08am] slouken: icculus = Ryan Gordon, SDL guy
[10:08am] slouken: bobbens = Edgar Simo, working on SDL haptic code (force feedback)
[10:09am] slouken: hfutrell = Holmes Futrell, working on SDL iPhone port
[10:09am] slouken: lifning = Darren Alton, working on SDL Nintendo DS port
[10:10am] slouken: wilku = Szymon Wilczek, working on multiple mouse / touch input support
[10:11am] slouken: We’re here to talk about the touch input API, since it looks like we definitely need to formalize one for the NDS and iPhone ports.
[10:12am] slouken: Okay, let’s start off round robin, Szymon, why don’t you talk about what you’ve thought about in terms of the API. Everyone wait and Darren and Holmes can bring the needs of their platforms up after Szymon finishes his thoughts.
[10:13am] wilku_: well, actually I think it’s not anything marvelous. The most important aspect of touch input was adding a third coordinate to the mouse device
[10:14am] wilku_: generally systems recognize tablets also as pointing devices and we are using that fact
[10:14am] wilku_: the second most important thing was the relative mode, which is currently implemented in software, so it should work independently of the system
[10:15am] wilku_: I think that those are the most important elements
[10:16am] wilku_: for the touch input api, because pressure is also just a coordinate
[10:16am] slouken: So you’re thinking about simply adding mouse “Z” and we’re done?
[10:17am] wilku_: well actually I focused here on tablet users, so for me the z coordinate solves a lot of problems actually
[10:18am] slouken: Okay guys, does this suffice for your needs?
[10:21am] lifning: well, the NDS is indeed capable of detecting pressure. there’s a somewhat popular homebrew drawing program called “Colors!” that uses the pressure to weight the darkness of the line, etc.
[10:21am] lifning: problem is, it has to be calibrated
[10:21am] wilku_: what do you mean: calibrated?
[10:22am] [Relic]: number of pressure levels involved?
[10:23am] wilku_: from what I know: depends on the device
[10:23am] lifning: well, there are two sets of (x,y,z) given by a touch point in the DS
[10:24am] lifning: one is just the raw readings from the hardware, which have some variations over different machines
[10:25am] bobbens: don’t some devices also have inclination and blob detection (blobs having size and form)?
[10:25am] * slouken nods
[10:25am] slouken: Some have tilt and rotation
[10:26am] lifning: and the other has X and Y set to the appropriate range pixel-wise to the resolution of the screen. the X and Y in this get the calibration info from the DS firmware’s data, but the Z (pressure) calibration isn’t set by the firmware. that homebrew app I mentioned has you press with different levels of hardness the first time you run it
[10:26am] slouken: lifning, how does the DS represent multi-touch?
[10:27am] lifning: I suppose I could just hard-code values that are about average if it came down to it, but is this something that other touch devices have to deal with too?
[10:27am] wilku_: well my tablet device doesn’t read tilt and rotation, but those are additional coordinates that are passed by the OS: well I don’t know how it looks on Windows, but on Linux those are just additional axes passed by X11
[10:28am] lifning: slouken: it doesn’t; it’s a single-touch device. there are ways of emulating two-touch if you make the user promise not to move his first coordinates
[10:28am] lifning: but they’re… not really worth bothering with
[10:29am] wilku_: well to the calibration option
[10:30am] wilku_: currently to emulate a relative mode for tablets we are reading the screen coordinates and remembering them in the SDL_mouse file. So we could do the same for the z coordinate, and calibrate according to that
[10:31am] [Relic]: why not set emulated defaults for each device, then allow a calibration tool, normally to come up on the first play
[10:31am] lifning: there’s a good idea
[10:32am] wilku_: it’s an option
[10:32am] lifning: you mean like having the defaults set when the subsystem is init’d, and allowing the application to “SetTouchCalibration(…)” at runtime?
[10:32am] * slouken nods
[10:33am] slouken: We could build defaults based on actual hardware, since the DS has a limited set.
[10:33am] [Relic]: precisely
[10:33am] lifning: sounds good
[10:33am] wilku_: okey
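The calibration scheme agreed on above — sane per-hardware defaults, overridable at runtime by the application — could look roughly like this. All names here are hypothetical sketches, not actual SDL API:

```c
#include <assert.h>

/* Hypothetical calibration state: ship defaults for known hardware,
   let the app override them at runtime ("SetTouchCalibration"). */
typedef struct {
    int raw_min;
    int raw_max;
} TouchCalibration;

static TouchCalibration g_calibration = { 0, 255 };  /* default for known hardware */

void SetTouchCalibration(int raw_min, int raw_max)
{
    g_calibration.raw_min = raw_min;
    g_calibration.raw_max = raw_max;
}

/* Convert a raw pressure sample to the 0-255 range the app expects. */
int calibrated_pressure(int raw)
{
    int range = g_calibration.raw_max - g_calibration.raw_min;
    if (range <= 0) return 0;
    if (raw < g_calibration.raw_min) raw = g_calibration.raw_min;
    if (raw > g_calibration.raw_max) raw = g_calibration.raw_max;
    return (raw - g_calibration.raw_min) * 255 / range;
}
```

An app like the "Colors!" example could then run a one-time calibration screen and feed the measured extremes into SetTouchCalibration.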
[10:33am] slouken: lifning: What were you talking about on the mailing list regarding double-tap and quickly changing coordinates?
[10:36am] lifning: well for double-tap… I was talking about ways of emulating a mouse using just a touch screen. the difficulty being that you can’t have coordinates change unless you’re “clicking” the device
[10:37am] lifning: and, let me check my email to see what the heck I was going on about coordinates quickly moving…
[10:37am] lifning: oh right
[10:37am] lifning: I kinda briefly mentioned this before
[10:38am] lifning: if you press the touch screen in two different places, it registers as one “point,” at the midpoint of those two locations
[10:38am] slouken: Gotcha, for that you can defer click until you either get motion outside of a tolerance or a release and then generate the appropriate mouse events
[10:38am] markuskj left the chat room.
[10:38am] slouken: So you get new point without intervening motion?
[10:38am] lifning: and when you let go of one of those touches, the point jumps to the remaining touch.
[10:39am] lifning: but i think that was just someone asking about the behavior when you press the single-touch device more than once
[10:39am] slouken: That sounds fine, just synthesize motion prior to click/motion processing at new point, yes?
[10:39am] lifning: yea
[10:39am] * slouken nods
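The idea just agreed on — synthesize a motion event before the click when a touch lands far from the last known position — can be sketched as a tiny event generator. Event names and the function are hypothetical, for illustration only:

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical event codes for the sketch. */
enum { EV_MOTION, EV_PRESS };

typedef struct { int type; int x, y; } Event;

/* Fill `out` (up to 2 events) for a touch landing at (x, y), given the
   last known position. If the touch is outside the tolerance, emit a
   synthetic motion event first, then the press. Returns event count. */
int touch_down(int x, int y, int last_x, int last_y, int tolerance, Event out[2])
{
    int n = 0;
    if (abs(x - last_x) > tolerance || abs(y - last_y) > tolerance) {
        out[n].type = EV_MOTION; out[n].x = x; out[n].y = y; n++;
    }
    out[n].type = EV_PRESS; out[n].x = x; out[n].y = y; n++;
    return n;
}
```

This way the application always sees motion before a click at a new location, matching what a physical mouse would produce.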
[10:40am] wilku_: okey, so here
[10:40am] wilku_: is my question
[10:40am] slouken: Do you think the deferred processing will work for the tablet “mouse” mode?
[10:40am] wilku_: should the emulation of a double click by changing pressure be implemented in SDL or in a user program?
[10:41am] icculus: do we have a concept of “double clicking” in 1.3?
[10:41am] wilku_: what do you understand by tablet mouse mode?
[10:41am] slouken: Are we even doing that?
[10:42am] slouken: tablet mouse mode = nds specific mouse model
[10:42am] icculus: we don’t even have single-clicking.
[10:42am] wilku_: I think that a doubleclick by change of pressure should be implemented by a user that needs that
[10:42am] wilku_: in his program
[10:42am] slouken: yep
[10:43am] wilku_: oh…
[10:44am] slouken: So, we’re good on concept of adding “Z” to mouse for pressure (pending more ideas), and NDS mouse mode emulation?
[10:45am] lifning: mhm
[10:45am] wilku_: well
[10:45am] wilku_: I don’t want to be misunderstood
[10:45am] slouken: wilku_: the NDS stuff isn’t something you’d work on, just to clarify.
[10:46am] wilku_: I think that the whole touch input API isn’t made up of just the z coordinate
[10:46am] wilku_: but regarding tablet users
[10:46am] wilku_: which I can represent
[10:46am] slouken: wilku_: I know, I’m just simplifying for my next question
[10:46am] wilku_: I can’t think of anything more than the relative mode
[10:47am] wilku_: oh okey
[10:47am] slouken: Here’s a larger question I’d like everyone’s input on, icculus, bobbens, hfutrell… Do we want to unify the input system similar to what was proposed on the mailing list and was done by DirectInput?
[10:47am] slouken: e.g. mice and keyboards are special cases of HID style devices?
[10:48am] icculus: I’m trying to dig up the email, but my instinct is “no”
[10:48am] bobbens: i don’t really seen an issue with the current setup
[10:48am] bobbens: but haven’t read the email
[10:48am] bobbens: looking for it now
[10:49am] wilku_: I think that wouldn’t be reasonable, the devices have their own properties and particular behaviour under particular OS’s
[10:49am] wilku_: I suppose it is better it stays the way it is
[10:49am] hfutrell: I have to agree
[10:50am] slouken: There are two reasons I’m asking. First, there are currently issues with mice that have additional “joystick” capabilities and other devices that we need to simulate mouse mode for. Second, it would map more closely to HID and DirectInput
[10:50am] hfutrell: I’ve never thought about the current system as a problem, and I’d imagine unifying things might actually make the system less simple, rather than simpler
[10:50am] * slouken nods
[10:50am] wilku_: I agree
[10:52am] wilku_: so I have a question: is there anything particular anyone would want in the touch input API? For me the most important were the z coordinate and the relative mode emulation
[10:52am] slouken: So consensus is to keep the existing system… keyboard = keyboard, mouse = x/y axis + N buttons + mousewheel events, joysticks = arbitrary set of axes and other input thingies?
[10:53am] wilku_: mouse=x/y/z
[10:53am] bobbens: yes, and if a mouse thingy has joystick capabilities you should be able to list it under joystick and mouse
[10:53am] wilku_: a tablet is a device reported as a mouse with an additional coordinate
[10:54am] slouken: We already have the concept of relative mode mice, so that’s okay…
[10:54am] bobbens: well if you add z to mice, it would just make “normal” mice have z = constant = max.
[10:55am] slouken: Whatever we do for tablets should be able to handle tilt and rotation. I’m not sure that tablets should shoehorn into the mouse API.
[10:55am] wilku_: Well, I don’t know how it is under other OSs, but under Linux normal mouses report 0 as the z coordinate
[10:55am] icculus: I’d assume tilt and rotation are just another field in the current mouse event.
[10:55am] Speeder: But are tilt and rotation useful to games?
[10:55am] icculus: (when we’re talking about “Z” … that’s just tilt, I assume.)
[10:55am] Speeder: Tablets with tilt and rotation are not even cheap…
[10:55am] wilku_: yup, z is for pressure
[10:56am] bobbens: Speeder: but they will be someday, need to be future-proof
[10:56am] icculus: (er…or pressure, you know. all of those things are just data that defaults to 0 on 2D mice.)
[10:56am] wilku_: and it would be nice if SDL were more often used for other programs than just games
[10:56am] Speeder: I would use tilt and rotation as extra fields (like xyz + w + u) and then do internal code that maps the “real” thing to those extra fields
[10:57am] Speeder: It would be simpler to use, and I think that is not hard to do
[10:57am] icculus: wilku_: once you’re comfortable with the tablet support, we’re probably going to end up with patches to TuxPaint to use it immediately.
[10:57am] slouken: icculus: which is why I see the line between mice and other input devices blurring. Do we make mice have all those things or make those things separate device types, or do we merge them?
[10:58am] wilku_: rotation would be another field, but I currently am not worrying about that, I suppose it would be very easy to add it in the future
[10:58am] icculus: slouken: I’d say those all look like the same abstract input device. But things like Joysticks and Keyboards and other HID things become…a little harder to shoehorn in.
[10:58am] wilku_: I wouldn’t merge them
[10:58am] Speeder: slouken I would make all those “pointer devices” one, including trackballs, tablets, mouses, lightguns…
[10:58am] lifning: well, for making input devices unified,
[10:58am] icculus: slouken: working at the HID level, you find yourself with a lot of code that says “we know this is input, and it’s a desktop-style device, but is it this usage page, or that one?”
[10:59am] icculus: the notion of this pure abstractness sort of goes out the window in practice when working with HID.
[11:00am] wilku_: Speeder: yeah, they would all be reported as mouses, at least under X11
[11:00am] lifning: I can see it being reasonable. I mean a keyboard is a bunch of buttons, no axes. a mouse is a few buttons, one axis. a joystick is several buttons, several axes. a touchscreen is N axes and N “buttons”
[11:00am] icculus: I just got an IM from a lurker saying “please make sure this works with SDL in any case: http://www.thewretched.org/trackir.html …”
[11:00am] icculus: fwiw
[11:00am] slouken: I’m not saying it’s a good idea, I’m just not sure if adding relative mode and pressure/tilt/rotation axes to mice is either. Once you have those you can represent an awful lot of other devices too.
[11:00am] lifning: so if you really really wanted to, you could model them all as arrays of buttons and coordinates
[11:01am] lifning: but I don’t know if that would make things a headache to deal with for the application programmer, or not
[11:01am] slouken: it would
[11:02am] hfutrell: yeah, it would be miserable
[11:02am] wilku_: okey, from my side of the screen, for me it is most reasonable to leave the tablets as mouses
[11:02am] icculus: slouken: I would think we should add some fields to the existing SDL mouse events and not go further for now. That “feels” like the right level of abstraction to me. Trying to get more abstract really gets dangerously close to Second System Syndrome.
[11:03am] slouken: I agree.
[11:03am] wilku_: cause, no matter how you look at it, a tablet being a joystick is against intuition
[11:03am] * slouken nods
[11:04am] wilku_: so I would add the next coordinate, angle, and maybe one more for tilt
[11:04am] wilku_: I’ll have to acquire a better tablet that supports all that goodies
[11:05am] wilku_: but I think that implementation of those additional variables shouldn’t take too long
[11:05am] wilku_: anyone against? ;)
[11:06am] icculus: wilku_: no objection here
[11:06am] Speeder: I think that Joystick is Joystick… Keyboard and other key devices ar key devices (like car gear changer) and pointers are pointers (mouses, tablets, lightguns…)
[11:06am] wilku_: yeah
[11:06am] wilku_: I totally agree
[11:08am] wilku_: should we discuss implementation specifics here?
[11:08am] slouken: Okay, we can add Z, but it shouldn’t be tilt or anything
[11:09am] slouken: I think any attributes we add should be precisely named with defined meanings so application developers can use them effectively
[11:09am] slouken: Instead of something ambiguous (this Z: is it actual up/down motion, or pressure?)
[11:09am] wilku_: so you would like me to change the z coordinate to pressure
[11:09am] wilku_: pressure
[11:10am] icculus: Are there 3D mice that actually lift off the desktop?
[11:10am] wilku_: but a change of pressure is notified as a mouse motion event
[11:10am] wilku_: no idea
[11:10am] lifning: that’s what I did in my example touch API in my branch
[11:10am] lifning: x,y,pressure
[11:10am] icculus: Call it pressure, then.
[11:10am] lifning: rather than z
[11:10am] wilku_: okey, I’ll change z to pressure
[11:10am] wilku_: okey
[11:10am] lifning: future-proofing for a 3D mouse someday
[11:10am] slouken: I have the 3D Connexion sitting here, with x/y, lift up and down (not off the desk) and rotation.
[11:10am] icculus: lifning: yeah
[11:10am] slouken: It also has tilt. grin
[11:11am] bobbens: it’s missing force feedback
[11:11am] slouken: I can send it to anyone who’s working on this stuff.
[11:11am] wilku_: and I have no idea how tilt and rotation are notified so I’ll get a better tablet and implement that in the ending phase
[11:11am] icculus: pressure makes more sense for the immediate need, we’ll add a Z that works independently of pressure…
[11:11am] icculus: …when Sam mails me that mouse.
[11:11am] * slouken grins
[11:12am] zakk: http://www.gyration.com/c-2-mice-keyboards.aspx
[11:12am] slouken: The company has offered to send more if needed, so I’ll be happy to send it to you Ryan
[11:12am] icculus: slouken: sure, free hardware is a great motivator.
[11:12am] slouken: wilku_: I don’t know if it will work for you, since there’s no XInput driver for it, as far as I know.
[11:14am] icculus: what other issues do we need to cover here?
[11:14am] icculus: What do we do for iPhones that can have more than one finger? Did we talk about this yet?
[11:14am] hfutrell: yeah, we still need to talk about this!
[11:17am] wilku_: so what’s with the multitouch thingy?
[11:17am] hfutrell: anyway, for the iPhone what we really need is a unique index associated with every touch event so that touch down, touch moved, and touch up can all be associated coherently with the same gesture
[11:18am] icculus: I guess the first question here is: how many fingers can the iPhone recognize at once?
[11:18am] icculus: If I put all ten on there, can it track them?
[11:18am] hfutrell: in my experience, 5
[11:18am] slouken: Wow.
[11:18am] hfutrell: but this is totally undocumented
[11:18am] hfutrell: I’ve just noticed that it errors above 5
[11:19am] icculus: but it wouldn’t make sense to just report five available mice on the iPhone, right?
[11:19am] hfutrell: this is what I’m doing right now, actually
[11:19am] icculus: ah
[11:20am] wilku_: hardcore
[11:20am] hfutrell: there’s two ways it doesn’t really make sense
[11:20am] slouken: I think long term we don’t want to do that though. It feels like a completely different beast from mouse, tablet, and joystick.
[11:21am] hfutrell: one is the whole not having a definite location when no finger is touching the screen
[11:21am] hfutrell: the other is that it doesn’t make sense to re-use a mouse index for a completely different touch gesture
[11:21am] slouken: (but one we’ll see more of, I’m sure)
[11:21am] Speeder: I’ve already used a touchscreen that supported a “infinite” number of fingers
[11:21am] lifning: so I guess it’s like hot-plugging / removing mice at runtime
[11:21am] wilku_: couldn’t you do something like proximity_out or proximity_in events
[11:21am] lifning: if that made any sense at all
[11:22am] hfutrell: lifning: that’s exactly what it’s like
[11:22am] Speeder: To test it, several people tried to put all their fingers on the screen; we filled about 60% of the surface (it was a somewhat big screen, I think it was 30 inches…) with fingers and it still tracked them all…
[11:22am] wilku_: you think of them as 5 mouses and all are proximity out, when you put down a finger you fill the next axis
[11:23am] wilku_: and give them proximity in
[11:23am] hfutrell: you could do proximity in/out events … does SDL have these at the moment?
[11:23am] Speeder: just set the Z = MIN when no finger is touching the thing?
[11:23am] wilku_: it has now
[11:23am] wilku_: I had to implement it
[11:24am] hfutrell: okay
[11:24am] slouken: I didn’t see that, is that different from the mouse focus events?
[11:24am] wilku_: I think it is
[11:25am] wilku_: because a tablet reports proximity out when the pen leaves the sensing zone
[11:25am] wilku_: of the tablet
[11:25am] wilku_: and proximity in when it shows in the sensing zone
[11:25am] * slouken nods
[11:25am] wilku_: and I added an additional boolean field, proximity, to the mouse device, so that it remembers its state
[11:26am] wilku_: it was essential to relative mode
[11:26am] slouken: Okay
[11:26am] wilku_: if a mouse can’t report proximity events it is set default to proximity in
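The proximity state being described — a per-mouse boolean that defaults to "in" for devices that can't report it — might look like this in outline. The struct and function names are hypothetical, not the actual SDL code:

```c
#include <assert.h>

/* Hypothetical per-mouse proximity state, as described above. */
typedef struct {
    int can_report_proximity;  /* 0 for ordinary mice */
    int proximity_in;          /* 1 while the pen is in the sensing zone */
} MouseState;

void mouse_init(MouseState *m, int can_report_proximity)
{
    m->can_report_proximity = can_report_proximity;
    m->proximity_in = 1;  /* default: assume the pointer is present */
}

/* Returns 1 if the state changed, i.e. a PROXIMITYIN/OUT event
   should be sent; 0 otherwise. */
int mouse_set_proximity(MouseState *m, int in)
{
    if (!m->can_report_proximity) return 0;
    if (m->proximity_in == in) return 0;
    m->proximity_in = in;
    return 1;
}
```

Ordinary mice, initialized with can_report_proximity = 0, simply stay "in" forever, which is what the relative-mode emulation needs.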
[11:26am] slouken: Holmes, what’s the device interface for the touch screen like? do you only have the gesture interface, or something lower? Anything under NDA you can whisper to me, as I’m also in the developer program.
[11:26am] hibread joined the chat room.
[11:27am] slouken: wilku_: that sounds good
[11:27am] wilku_: the code has to be updated, because I have to fill in the type in SDL_mouse, otherwise SDL won’t send the event
[11:27am] wilku_: so I’m filling in the type of the event by myself
[11:28am] wilku_: but I think that’s something to be fixed easily by someone who knows the events implementation better than me
[11:28am] * slouken nods
[11:29am] wilku_: so I hope that helps you with multitouch problem
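Holmes's point earlier — that each touch needs a stable index so down, moved, and up events can be tied to the same finger — amounts to slot allocation. A minimal sketch, assuming the iPhone's apparent limit of 5 simultaneous touches and entirely hypothetical names:

```c
#include <assert.h>

#define MAX_TOUCHES 5

static int g_active[MAX_TOUCHES];  /* 1 = a finger currently owns this slot */

/* A new touch-down claims the first free slot and keeps that index
   until the matching touch-up; returns -1 if all slots are in use
   (the "errors above 5" case mentioned in the discussion). */
int touch_begin(void)
{
    int i;
    for (i = 0; i < MAX_TOUCHES; i++) {
        if (!g_active[i]) { g_active[i] = 1; return i; }
    }
    return -1;
}

/* Release a slot when the finger lifts. */
void touch_end(int slot)
{
    if (slot >= 0 && slot < MAX_TOUCHES) g_active[slot] = 0;
}
```

With stable indices, the application can associate every move event with the gesture that started it, which a plain "5 mice" model cannot guarantee once fingers start lifting and landing.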
[11:32am] wilku_: emmm, some weird silence…
[11:32am] slouken: Holmes was just telling me that the API for the iPhone screen uses an intuitive gesture interface that’s really unlike anything else we support.
[11:33am] slouken: I think we may add a new type of input for it, and Holmes can implement a “mouse emulation” mode similar to what Darren is doing for NDS
[11:33am] hfutrell: that does sound best
[11:33am] hfutrell: it really does need a new API to be handled well
[11:33am] wilku_: I think that’s a good idea, I suppose it will help me implement the Wii pilot API in the future
[11:34am] slouken: Yeah, I know nothing about that. What is it similar to?
[11:34am] wilku_: The Wii pilot API?
[11:34am] slouken: Yes
[11:34am] zakk: I wonder how the sdl on the gp2x f200 works
[11:35am] zakk: since that has a touch screen
[11:35am] zakk: probably just mouse emulation
[11:35am] wilku_: Actually I have no idea, but it’s something I want to try in the future and I guess it’s gonna be similar to the iphone
[11:35am] * slouken nods
[11:36am] slouken: Okay, let me summarize what I think we’ve resolved…
[11:36am] tesmako: zakk: Hmm? But if you have no proximity or multitouch, the mouse approach is all you need, I was thinking?
[11:37am] zakk: tesmako: while it is most likely that simple, I don’t know if that is true, was hoping someone else in here had one since I just have the f100 (non touch)
[11:37am] tesmako: (well, possibly some flag to inform the app that it isn’t a mouse, to not show a pointer)
[11:37am] wilku_: zakk: I suppose it just reports pressure with absolute coordinates
[11:37am] wilku_: zakk: like a tablet in absolute coordinate mode
[11:37am] slouken: The existing mouse events are going to be extended with information needed for tablet support, e.g. proximity, pressure, tilt, maybe rotation. The mouse events continue to contain both absolute and relative motion.
[11:38am] slouken: The Nintendo DS is going to have mouse emulation implemented by Darren
[11:38am] slouken: The iPhone is going to have a gesture API TBD and also a mouse emulation implemented by Holmes
[11:39am] slouken: We should probably add some sort of query function so we can tell the capabilities of a “mouse” device.
[11:39am] slouken: Does that cover everything?
[11:39am] wilku_: about what you said: from what I’ve seen so far, a change of pressure is just reported as a mousemotion event by the OS, and I don’t see a need to change that, so I suppose tilt and rotation would be reported the same way
[11:39am] hfutrell: I’ll also be adding a mouse emulation mode as well – it’s already in place, really
[11:40am] lifning: yeah, I think so. um, should it be assumed that if the touch API isn’t initialized, the mouse emulation should be used by default?
[11:40am] slouken: There is no touch API, it’s just the standard mouse API.
[11:40am] wilku_: Oh and I have a question: not fully regarding the topic
[11:40am] lifning: oh, I see
[11:41am] lifning: okay that makes a lot more sense
[11:41am] slouken: Is there anything that you can’t represent using what we’ve talked about?
[11:41am] wilku_: I noticed that the cursor functions aren’t working: hide/show cursor, create cursor. Is it a bug made by me, or is it a general bug currently in SDL?
[11:41am] slouken: not yet implemented.
[11:41am] slouken: waiting for this stuff and the concept of a “core” cursor.
[11:41am] lifning: well it should be sufficient for NDS stuff
[11:42am] wilku_: and about the querying function. It would be quite problematic. Why?
[11:42am] wilku_: because the tablet is reported in X11 as a button device, Valuator device
[11:42am] wilku_: and I suppose a few other devices
[11:42am] slouken: Well, it might be useful for TuxPaint for example to be able to provide pressure options if the hardware supports it and grey them out if it doesn’t.
[11:43am] slouken: Okay, let’s talk a bit about API specifics. I need to go very soon, but I’ll get it started.
[11:43am] wilku_: grey them out: it’d simply set all coordinates that aren’t used to 0
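The capability query slouken suggests for cases like TuxPaint could be as simple as a bitmask per device. The flag names and helper below are a hypothetical sketch, not actual SDL API:

```c
#include <assert.h>

/* Hypothetical capability flags a "mouse" device could advertise. */
#define MOUSE_CAP_PRESSURE  0x01
#define MOUSE_CAP_TILT      0x02
#define MOUSE_CAP_ROTATION  0x04
#define MOUSE_CAP_PROXIMITY 0x08

/* Returns 1 if every capability in `wanted` is present in `caps`. */
int mouse_has_capability(int caps, int wanted)
{
    return (caps & wanted) == wanted;
}
```

An app could then enable its pressure UI only when mouse_has_capability(caps, MOUSE_CAP_PRESSURE) holds, rather than inferring capabilities from coordinates that are always 0.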
[11:44am] slouken: Do we want separate events (e.g. SDL_MOUSEPROXIMITY, SDL_MOUSEPRESSURE, SDL_MOUSETILT, etc.), or do we just make them all motion events?
[11:44am] wilku_: well
[11:44am] wilku_: I made two new events
[11:44am] wilku_: SDL_PROXIMITYIN and SDL_PROXIMITYOUT
[11:44am] wilku_: i see no need for the other ones
[11:45am] wilku_: let’s report them simply as motion just like the OS does
[11:45am] slouken: They’re all just a different kind of motion, right?
[11:45am] slouken: Have you looked at the API on other platforms?
[11:45am] wilku_: I haven’t looked at it on windows, but I suppose it’s gonna be similar
[11:46am] * icculus is running on that assumption, too
[11:46am] wilku_: but if it occurs differently
[11:46am] wilku_: if other OSs report them differently
[11:46am] wilku_: what’s the problem in SDL reporting MOTION then?
[11:46am] slouken: Can you guys look into it?
[11:47am] icculus: yeah
[11:47am] slouken: I just want to make sure we’re not missing something.
[11:47am] icculus: wilku_: I’ll do that tonight
[11:47am] wilku_: well generally the most important functions for manymouse and touch input are implemented, so my next step will be porting to Windows, so I’ll have to do it soon
[11:48am] slouken: Okay… is there anything else we want to cover on this topic?
[11:49am] wilku_: no problem with me
[11:49am] icculus: sounds good here
[11:49am] lifning: sounds good
[11:49am] slouken: Okay, thanks guys!
[11:49am] hfutrell: sounds good here too