How to get the origin window of a touch event

While working on a game engine framework, and also working through the SDL
headers and updating the wiki, I noticed that SDL_TouchFingerEvent no
longer indicates the windowId, but does give a touchId, which as far as I
can tell doesn’t give me any information on the window it originated from.
I’m pretty sure it’s necessary for any multi-window application to have
this information since the finger events only give a normalized set of
coordinates.

Is there a function for this that I missed, or is this something that
should be addressed?

The touches are normalized to the device. If you assume the device
represents the screen, then you can get the rect that represents the
screen, and do a global to local transform for each of the windows.

Yes, this is kind of a pain. Yes, it’s really how the touch devices work.
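For concreteness, a minimal sketch of that global-to-local transform, assuming
the touch device spans display 0 (both that assumption and the helper name
finger_to_window are just for illustration; SDL_GetDisplayBounds(),
SDL_GetWindowPosition() and SDL_GetWindowSize() are standard SDL2 calls):

    #include <SDL.h>
    #include <stdbool.h>

    /* Sketch: map a normalized finger position onto a window, assuming the
     * touch device covers display 0. Returns true if the point falls inside
     * the window's client area. */
    static bool finger_to_window(const SDL_TouchFingerEvent *ev,
                                 SDL_Window *win, int *out_x, int *out_y)
    {
        SDL_Rect screen;
        if (SDL_GetDisplayBounds(0, &screen) != 0) {
            return false;
        }

        /* normalized (0..1) -> global desktop coordinates */
        int gx = screen.x + (int)(ev->x * screen.w);
        int gy = screen.y + (int)(ev->y * screen.h);

        /* global -> window-local */
        int wx, wy, ww, wh;
        SDL_GetWindowPosition(win, &wx, &wy);
        SDL_GetWindowSize(win, &ww, &wh);

        *out_x = gx - wx;
        *out_y = gy - wy;
        return *out_x >= 0 && *out_x < ww && *out_y >= 0 && *out_y < wh;
    }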

How are you trying to handle touch events in applications with multiple
windows?


The problem is that it’s perfectly possible that the touch device
isn’t the one where the windows are displayed. For all we know, the
touch device may not even be a display (it could be just a touch pad,
for instance).

I guess the issue is that if the main screen(s) (where the windows
appear) are touch screens, you’ll want all events to happen relative to
the windows. This is where things get tricky, since you’d need to
guess where it happened, and window coordinates are relative to the
desktop, while touch coordinates are relative to the device. Oops.

Is there a way to deal with both situations without much trouble?


On Wed, Apr 3, 2013 at 12:58 AM, Sam Lantinga wrote:

> The touches are normalized to the device. If you assume the device
> represents the screen, then you can get the rect that represents the
> screen, and do a global to local transform for each of the windows.
>
> Yes, this is kind of a pain. Yes, it’s really how the touch devices work.

Any way for us to know if the touch device is a screen?

> How are you trying to handle touch events in applications with multiple
> windows?

Right now, I’m not, simply because I wasn’t sure how, and I don’t
actually have a touch device to test on. I was just in the process of
implementing it in hopes that it will be usable in the future. That’s
when I saw that the documentation was unclear.

Remember that knowing whether the touch device is a screen or not isn’t
enough; you also need to know whether that screen is displaying the
window, and at what position relative to it. But yeah, that’d probably
be nice.

Do OSes allow doing something like that?


As an anecdote, I once worked on an X11 system with one touch screen (a
monitor with a touch sensor glued on top) and a regular monitor attached,
configured in a Xinerama combination. The Linux touch driver wasn’t that
advanced, and reported touches scaled to the whole Xinerama combo (so 0.0
meant 0, but 1.0 meant the end of both monitors combined); you had to
work around this by figuring out how the monitors were set up and
rescaling the touch coordinates.
In short, dealing effectively with a multi-monitor + touch situation may
be beyond SDL’s possibilities, especially on Linux, where drivers are
sometimes an afterthought for manufacturers.
There are also other scenarios that come to mind… how do those Wacom
drawing pads behave? They are touch devices in essence. Or, looking
further into the future, how would we handle Android devices when they
inevitably get the option of not cloning the main screen to the HDMI
output, but rather having it as a separate output?

My gut feeling is that there should be a way to determine if the touch
device is “attached” to a display. If it is, you are advised to map the
0…1 coordinates onto that display; if not (as in the case of a Wacom
pad), you are free to do what you want with those coordinates (0…1 could
be mapped to your window size, for example).
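As a rough sketch of that last option only (this is not existing SDL
behavior, and the helper name is made up for illustration), mapping the
normalized coordinates straight onto a window could look like:

    #include <SDL.h>

    /* Hypothetical helper: map a normalized finger position directly onto
     * a window's client area, for a touch device with no known display
     * association (e.g. a pad). */
    static void finger_to_window_area(const SDL_TouchFingerEvent *ev,
                                      SDL_Window *win, int *out_x, int *out_y)
    {
        int w, h;
        SDL_GetWindowSize(win, &w, &h);
        *out_x = (int)(ev->x * w);   /* ev->x runs 0..1 across the device */
        *out_y = (int)(ev->y * h);   /* ev->y runs 0..1 across the device */
    }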

Gabriel.

Wacom tablets are a whole different beast than usual. Yes, they’re
touch devices, and they mirror what’s on the monitor, but they’re also
analog (in the sense that you can detect how much pressure there is,
not just where). Also, you have to factor the pen into the equation,
since that’s how touch works on them; no idea if touch works without
the pen.

The pen also has several buttons on it. I think they behave as extra
mouse buttons, although I don’t remember from which number (8 onwards,
I believe?).


Quoth Sik the hedgehog <sik.the.hedgehog at gmail.com>, on 2013-04-03 11:36:35 -0300:

> Wacom tablets are a whole different beast than usual. Yes, they’re
> touch devices, and they mirror what’s on the monitor, but they’re also
> analog (in the sense that you can detect how much pressure there is,
> not just where). Also, you have to factor the pen into the equation,
> since that’s how touch works on them; no idea if touch works without
> the pen.

It depends an awful lot on the device. My Bamboo tablet advertises
multitouch input plus pen rather heavily in the packaging, though
I’m not sure both are supported simultaneously. And some drawing
tablets of similar types do have displays as well that can mirror the
primary monitor. And there are pen tilt sensors. I don’t recall the
exact set of possibilities just now, but it’s pretty wide.

> The pen also has several buttons on it. I think they behave as extra
> mouse buttons, although I don’t remember from which number (8 onwards,
> I believe?).

It’s reconfigurable depending on the device and on the OS. And pens
can have multiple tips or ends, which have different IDs so they can
be distinguished by graphics programs to activate different tools.
And sometimes they can be put into relative versus absolute mode.
And…

I doubt it makes sense for SDL to try to handle every such case, but
picking a consistently bounded subset is fairly important. Designing
UI for spatially disjoint touch input can be substantially different
from designing it for display-attached touch, among other things; the
set of target configurations with which any particular application
will be practically usable is going to be pretty narrow unless the
application developers put in a lot of work.

That said, I loosely agree with Gabriel as far as being able to query
the association of touch devices and windows with shared 2D geometry
spaces (displays, separate pads, etc.), but I don’t know whether it’s
practical in all the targeted environments.

—> Drake Wilson

So, to summarize and prioritize the possible issues, we have:

  • We need to be able to determine definitively whether a touch device is
    a display or a separate device
  • If it’s a display, do the normalized coordinates match the focused
    window, or the whole display?
  • If it’s a separate device, are coordinates normalized to the focused
    window, a single display, or all the displays in some sort of
    combination?

Is it safe to assume that the touch events were built only with mobile
devices in mind? If so, for the sake of release, should we limit the
touch API visibility/usability to mobile devices?
If not, is there a consistent way, on all supported OSes, to determine if
a touch device is also a display (or even associated with a specific
display)? It appears that MSDN may have something relevant here:
http://msdn.microsoft.com/en-ca/library/windows/desktop/dd317318(v=vs.85).aspx.
Not sure about Xorg:
http://www.x.org/wiki/Development/Documentation/Multitouch#Event_processing,
since all the API links are broken on that page. I have no experience
with Mac, so I’m not sure what sort of documentation to look for.
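For illustration, one platform-specific way to ask part of this question on
Windows is the Win32 GetSystemMetrics(SM_DIGITIZER) query (available on
Windows 7 and later). This is plain Win32, not SDL, and the other platforms
would need their own equivalents. A minimal sketch:

    /* Sketch only: query Windows for touch digitizer capabilities. */
    #define _WIN32_WINNT 0x0601   /* Windows 7, for the NID_* flags */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        int caps = GetSystemMetrics(SM_DIGITIZER);

        if (!(caps & NID_READY)) {
            printf("No touch digitizer is ready.\n");
        } else if (caps & NID_INTEGRATED_TOUCH) {
            printf("Touch digitizer is integrated into a display.\n");
        } else if (caps & NID_EXTERNAL_TOUCH) {
            printf("Touch digitizer is an external (non-display) device.\n");
        }
        return 0;
    }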


How do desktop OSes handle touch events, for starters? Maybe it’d be
easier to adapt the API if we considered how we’re expected to use
touch events there.



I’ve looked at Wacom’s IC line before (they apparently tried to get
into the smartphone supplier market after Apple released the iPhone),
and here’s the basic gist of it:
Wacom pens use an oscillating magnetic field generated by the "touch"
device to obtain power, detect position, and signal various things
(such as the buttons). As such, they’re actually a 3d input device,
though the z axis is presumably constrained.

One of the things they mention about this system of theirs is that it
can be used IN ADDITION to other technologies, such as true touch
interfaces, and screens. Thus, both are probably supported at the same
time.

Has anyone tested with a Wacom pen+touch device? Did you get cursor
movement even when the pen wasn’t touching the device? Did you notice
the value of the pressure variable for that particular pen?


Back around 2002 I saw a transparent touch device attached to a
screen. It was already apparently old at the time. I think that we
should bear in mind that there might be some touch stuff out there
that we can AT MOST recognize via device string, if at all. What I
think is ultimately needed from SDL2 for good touch support is as
follows:

  1. Functions to list touch devices (currently provided by SDL_touch.h
    and the touch event structure)
  2. Functions to list any screens that a touch device is associated
    with (there’s something called a “spark tablet” where the pen
    generates noise to tell the hardware where it is; one of these could
    be used with multiple monitors by just sticking “listeners” on
    multiple monitors, and supporting this capability would be
    straightforward).
  3. A function to find the “touch location range” of a device that
    corresponds with a particular monitor. This would be the hard part,
    since some OSes might not support it, and some touch devices might
    ignore it even if the OS does provide it. Perhaps a simple database
    library would be better for this part?
  4. And a function to get a string describing a touch device (e.g.
    “generic non-display touchpad 1”)

So, part of the needed support is already present; we just need an API
to discover knowable touch device/screen associations and a
“fool-proof” identification API. That will provide as much support for
touch devices as I think SDL2 should itself provide. A database of
“touch device strings” and their capabilities would be nice, but I
think such a thing should probably be its own library.
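Purely as a sketch of that proposal (none of these functions exist in SDL2;
the SDLX_ prefix is made up precisely to make that clear, and only
SDL_TouchID is a real SDL2 type), the declarations could look roughly like:

    #include <SDL.h>

    /* Hypothetical declarations only -- a sketch of the API proposed above,
     * not anything present in SDL2. */

    /* 2. Which displays, if any, is this touch device associated with? */
    extern int SDLX_GetNumTouchDisplays(SDL_TouchID touch);
    extern int SDLX_GetTouchDisplay(SDL_TouchID touch, int index);

    /* 3. The normalized 0..1 sub-range of the device that maps onto a given
     *    display (may be unknowable on some OSes/devices). */
    extern int SDLX_GetTouchDisplayRange(SDL_TouchID touch, int displayIndex,
                                         float *x0, float *y0,
                                         float *x1, float *y1);

    /* 4. A human-readable description, e.g. "generic non-display touchpad 1". */
    extern const char *SDLX_GetTouchName(SDL_TouchID touch);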

We can’t be certain about every touch device (and I assume that SDL2
will sooner or later be maintained for Haiku, where I don’t think
they’re even worrying about touch yet! I’m subscribed to one of their
mailing lists, you see, so I’ve seen some of the relevant
conversations), but we can provide enough of an API for any
degradation of certainty to be graceful.

One of the arguments to SDL_AddTouch in SDL_touch.c looks relevant,
but I’m not certain where it gets called from.