Date: Wed, 3 Apr 2013 10:16:33 -0500
From: Drake Wilson
To: SDL Development List
Subject: Re: [SDL] How to get the origin window of a touch event
Message-ID: <20130403151633.GA2694 at quail.dasyatidae.internal>
Content-Type: text/plain; charset=us-ascii

Quoth Sik the hedgehog <sik.the.hedgehog at gmail.com>, on 2013-04-03
11:36:35 -0300:
> Wacom tablets are a whole different beast than usual. Yes, they're
> touch devices, and they mirror what's on the monitor, but they're also
> analog (in the sense that you can detect how much pressure is there,
> not just where). Also you have to factor the pen into the equation,
> since that's how touch works on them; no idea if touch works without
> the pen.
It depends an awful lot on the device. My Bamboo tablet advertises
multitouch input plus pen rather heavily on the packaging, though
I'm not sure both are supported simultaneously. Some drawing
tablets of similar types also have displays that can mirror the
primary monitor. And there are pen tilt sensors. I don't recall the
exact set of possibilities just now, but it's pretty wide.
I’ve looked at Wacom’s IC line before (they apparently tried to get
into the smartphone supplier market after Apple released the iPhone),
and here’s the basic gist of it:
Wacom pens use an oscillating magnetic field generated by the "touch"
device to obtain power, detect position, and signal various things
(such as the buttons). As such, they're actually a 3D input device,
though the z axis is presumably constrained.
One of the things Wacom mentions about this system is that it can
be used IN ADDITION to other technologies, such as true touch
interfaces and screens. Thus, both are probably supported at the same
time.
Has anyone tested with a Wacom pen+touch device? Did you get cursor
movement even when the pen wasn't touching the device? Did you notice
the value of the pressure variable for that particular pen?
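For anyone with the hardware handy, a minimal test program along these
lines should answer both questions (this assumes the pen comes through
the normal SDL touch path at all, which is part of what's being tested):

/* Print device, position, and pressure for every finger/pen contact. */
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Event e;
    SDL_Window *win;
    int running = 1;

    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;
    win = SDL_CreateWindow("pressure test", SDL_WINDOWPOS_CENTERED,
                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    if (!win) {
        SDL_Quit();
        return 1;
    }

    while (running && SDL_WaitEvent(&e)) {
        switch (e.type) {
        case SDL_FINGERDOWN:
        case SDL_FINGERMOTION:
        case SDL_FINGERUP:
            printf("device %lld finger %lld at (%.3f, %.3f) pressure %.3f\n",
                   (long long)e.tfinger.touchId,
                   (long long)e.tfinger.fingerId,
                   e.tfinger.x, e.tfinger.y, e.tfinger.pressure);
            break;
        case SDL_QUIT:
            running = 0;
            break;
        }
    }
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

If hover generates anything at all, it would presumably show up as
motion events with zero (or near-zero) pressure.
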
Date: Wed, 3 Apr 2013 11:33:02 -0400
From: Alex Barry <alex.barry at gmail.com>
To: SDL Development List , Drake Wilson
Subject: Re: [SDL] How to get the origin window of a touch event
Message-ID: <CAJSO58NfDM8RVcbSqXTPYbDi9AXosDDrbeP1=h2eWKx4YhDU1w at mail.gmail.com>
Content-Type: text/plain; charset="iso-8859-1"
So, to summarize all the possible issues in priority order, we have:
- We need to be able to definitively determine whether a touch device
is a display or a separate device
- If it’s a display, do the normalized coordinates match the focused
window, or the whole display?
- If it's a separate device, are coordinates normalized to the focused
window, a single display, or all the displays in some combined
arrangement? (The sketch after this list shows the mapping most code
assumes today.)
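For reference, the common assumption in application code is
"normalized to the window that has input focus", i.e. something like
the following sketch. SDL_GetWindowSize is real; the assumption about
what tfinger.x/y are relative to is exactly the open question:

#include <SDL.h>

/* Sketch: convert a touch event's normalized 0..1 coordinates to
 * pixels, ASSUMING they are relative to the focused window.  If
 * they are actually display-relative, this is where it breaks. */
static void touch_to_pixels(const SDL_TouchFingerEvent *tf,
                            SDL_Window *focused, int *px, int *py)
{
    int w, h;
    SDL_GetWindowSize(focused, &w, &h);
    *px = (int)(tf->x * w);
    *py = (int)(tf->y * h);
}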
Is it safe to assume that the touch events were built only with mobile
devices in mind? If so, for the sake of release, should we limit the touch
API visibility/usability to mobile devices?
If not, is there a consistent way, on all supported OSes, to determine if a
touch device is also a display (or even associated with a specific
display)? It appears that MSDN may have something relevant here:
http://msdn.microsoft.com/en-ca/library/windows/desktop/dd317318(v=vs.85).aspx
I'm not sure about Xorg:
http://www.x.org/wiki/Development/Documentation/Multitouch#Event_processing
since all the API links are broken on that page. I have no experience with
Mac, so I'm not sure what sort of documentation to look for.
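On the Windows side, at minimum there is GetSystemMetrics(SM_DIGITIZER),
which reports whether integrated (i.e. on-display) or external touch/pen
digitizers are present at all. It's system-wide rather than per-device,
so it can't associate a particular device with a particular display, but
it's a start (sketch below; the NID_* flags need Windows 7 era headers):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    int d = GetSystemMetrics(SM_DIGITIZER);
    printf("integrated touch: %s\n", (d & NID_INTEGRATED_TOUCH) ? "yes" : "no");
    printf("external touch:   %s\n", (d & NID_EXTERNAL_TOUCH)   ? "yes" : "no");
    printf("integrated pen:   %s\n", (d & NID_INTEGRATED_PEN)   ? "yes" : "no");
    printf("external pen:     %s\n", (d & NID_EXTERNAL_PEN)     ? "yes" : "no");
    return 0;
}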
Back around 2002 I saw a transparent touch device attached to a
screen. It was apparently already old at the time. I think we
should bear in mind that there may be some touch hardware out there
that we can AT MOST recognize via its device string, if at all. What I
think is ultimately needed from SDL2 for good touch support is as
follows:
- Functions to list touch devices (currently provided in SDL_touch.h,
together with the touch event structure; see the sketch after this
list)
- Functions to list any screens that a touch device is associated
with (there’s something called a “spark tablet” where the pen
generates noise to tell the hardware where it is; one of these could
be used with multiple monitors by just sticking “listeners” on
multiple monitors, and supporting this capability would be
straightforward).
- A function to find the “touch location range” of a device that
corresponds with a particular monitor. This would be the hard part,
since some OSes might not support it, and some touch devices might
ignore it even if the OS does provide it. Perhaps a simple database
library would be better for this part?
- And a function to get a string describing a touch device (e.g.
“generic non-display touchpad 1”)
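For concreteness, here is roughly what the already-present part looks
like with the current SDL_touch.h API. Note there's no name string and
no screen association yet, which is exactly the gap described above:

#include <SDL.h>
#include <stdio.h>

/* Enumerate the touch devices SDL currently knows about. */
static void list_touch_devices(void)
{
    int i, n = SDL_GetNumTouchDevices();
    for (i = 0; i < n; i++) {
        SDL_TouchID id = SDL_GetTouchDevice(i);
        printf("touch device %d of %d: id %lld\n", i + 1, n, (long long)id);
    }
}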
So, part of the needed support is already present; we just need an API
to discover knowable touch device/screen associations, plus a
"fool-proof" identification API. That will provide as much support for
touch devices as I think SDL2 should itself provide. A database
of "touch device strings" and their capabilities would be nice, but I
think that such a thing should probably live in its own library.
We can't be certain about every touch device (and I assume that SDL2
will sooner or later be maintained for Haiku, where I don't think
they're even worrying about touch yet; I'm subscribed to one of their
mailing lists, so I've seen some of the relevant conversations), but
we can provide enough of an API for any degradation of certainty to
be graceful.
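"Graceful" here could be as simple as an explicit don't-know answer.
A purely hypothetical sketch (nothing like this exists in SDL2 right
now), just to show the shape:

/* HYPOTHETICAL -- not a real SDL2 type or API.  An association query
 * with an explicit "unknown" result lets callers fall back sanely
 * instead of guessing. */
typedef enum {
    TOUCH_ASSOC_ON_DISPLAY,   /* device is (or overlays) this display */
    TOUCH_ASSOC_OFF_DISPLAY,  /* device is known to be separate */
    TOUCH_ASSOC_UNKNOWN       /* backend cannot tell */
} TouchDisplayAssociation;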
One of the arguments to SDL_AddTouch in SDL_touch.c looks relevant,
but I’m not certain where it gets called from.
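For reference, if I'm reading the source right (worth double-checking
against the current tree), the internal prototype is roughly:

/* Internal to src/events/SDL_touch.c; called by the platform backends
 * when they detect a device.  The name string is presumably the
 * relevant argument here. */
int SDL_AddTouch(SDL_TouchID touchID, const char *name);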