Let's talk about device input in SDL 1.3

Last summer I went down to the local GoodWill computer store and
bought a bag full of mice, keyboards, and joy(devices). I did that
because I was planning to work on the multidevice support in X for SDL
1.3. Well, the project has been put off and delayed because the X.org
people have been rethinking the same problems.

Their solutions are included in XInput extension 2.0 and in the MPX
(MultiPointer X) extension. These extensions are in the X.org 1.7 X
server that was just released a couple of days ago. Over the next six
months that server should show up in mainstream Linux distributions.
XInput 2.0 is supposed to finally do what XInput never did. It is
supposed to provide a generic interface to all interactive devices and
to support true hot plugging with events to tell when a device has
been plugged in and when one has been removed. MPX is supposed to
solve several problems with touch screen input and to allow multiple
mice to control multiple separate screen pointers. It allows a
keyboard and a mouse to be bound together so that several people all
using their own mice and keyboards can share a single screen with each
person being able to point and click and type with their own input
devices.

Now that we will soon be able to do all these things, it seems like a
good time to talk about the changes needed in SDL to support the new
input capabilities that are now in the major operating systems.

A few obvious questions:

  1. Can the existing SDL multitouch interface for the iPhone be adapted
    to the other major OSes?
  2. I believe we need device-added and device-removed events. Do we
    really? And if so, how do we squeeze them in? What information must
    they contain?
  3. Joysticks versus everything else. Right now the joystick API is
    different from the mouse/keyboard API. There are very good reasons for
    that. Are those reasons still valid? Can we (should we even try) to
    create a single device API for all input devices?
  4. Weird devices: When I run “xinput --list” I see that the current
    XInput extension considers the power button to be a keyboard. In fact,
    it finds two of them and I can only find one! I suspect that the new
    XInput is going to also see, and expose, lots of other odd devices.
    What, if anything, should SDL do about those? I might really want to
    get an event telling me the power button has been pushed…

What do the other major and minor OSes do about weird devices,
joysticks, and hot plugging? I’ve only been looking at X, so I would
like to get feedback about how this is handled on Windows, Mac OS X,
and other OSes.

Bob Pendleton

+-----------------------------------------------------------

I’d like to see something like this:

typedef enum SDL_ControlType {
    SDL_CONTROLTYPE_NONE,

    /** any button-like control (joystick button, keyboard key) */
    SDL_CONTROLTYPE_BUTTON,

    /** definite minimum and maximum value, reports position (joystick,
        throttle, pressure sensor) */
    SDL_CONTROLTYPE_ABSOLUTEAXIS,

    /** no minimum or maximum value, reports movement (mouse, mouse wheel) */
    SDL_CONTROLTYPE_RELATIVEAXIS,

    /** text input. May be associated with a set of buttons (keyboard) */
    SDL_CONTROLTYPE_TEXT,

    /** a mouse, tablet, light gun or touch screen pointer. May be
        associated with two or more relative axes */
    SDL_CONTROLTYPE_POINTER,

    /** generally a set of four buttons representing up, down, left and
        right. For a four button hat, a maximum of two buttons can be
        pressed at any one time. */
    SDL_CONTROLTYPE_HAT,

    /** a set of two or more absolute axes used to indicate a single
        position or direction */
    SDL_CONTROLTYPE_JOYSTICK,

    /* … */

    SDL_CONTROLTYPE_OTHER
} SDL_ControlType;

typedef struct SDL_ControlHeader {
    SDL_Control *next;
    SDL_ControlType type;

    /** A printable descriptive name for this control */
    char *name;

    /** Indicates whether this control is a virtual control which
        combines input from other device controls.
        - Always SDL_TRUE for text, pointer, hat and joystick controls. */
    SDL_BOOL isVirtual;

    /** Indicates whether a control is static or dynamic.
        - Static controls are never removed until the device itself
          is removed. */
    SDL_BOOL isStatic;
} SDL_ControlHeader;

typedef union SDL_Control {
    SDL_ControlHeader header;
    /* … */
} SDL_Control;

typedef enum SDL_DeviceType {
    SDL_DEVICETYPE_NONE,
    SDL_DEVICETYPE_KEYBOARD,
    SDL_DEVICETYPE_JOYSTICK,
    SDL_DEVICETYPE_MOUSE,
    SDL_DEVICETYPE_MULTIPOINTER,
    /* … */
    SDL_DEVICETYPE_OTHER
} SDL_DeviceType;

typedef struct SDL_DeviceHeader {
    SDL_Device *next;
    SDL_DeviceType type;

    /** A printable descriptive name for this device */
    char *name;

    /** Indicates whether this represents a virtual device.
        - A virtual device may be a simulation, or may combine input
          from other devices.
        - The root keyboard and mouse on Windows would be virtual
          devices. */
    SDL_BOOL isVirtual;

    /** Indicates whether a device is static or dynamic.
        - Static devices are never removed.
        - These are generally system devices which combine keyboard and
          mouse input from several other devices. */
    SDL_BOOL isStatic;

    Uint32 numStaticControls;
    Uint32 maxDynamicControls;
    Uint32 numDynamicControls;

    /** Pointer to the first static control.
        - Static controls should be stored as an array and linked list,
          followed by dynamic controls which continue the linked list
          after the last static control.
        - The first dynamic control is at
          controls[numStaticControls-1].next. */
    SDL_Control *controls;
} SDL_DeviceHeader;

typedef union SDL_Device {
    SDL_DeviceHeader header;
    /* … */
} SDL_Device;

typedef Uint32 SDL_DeviceID;
typedef Uint32 SDL_ControlID;

Then add SDL_HOTPLUG and SDL_INPUT events:

typedef enum SDL_HotplugEventType {
    /** Sent when a device is first added to the system, or as devices
        are enumerated when SDL starts */
    SDL_HOTPLUG_DEVICEADDED,

    /** Sent when a device is removed from the system, after each
        control has been removed */
    SDL_HOTPLUG_DEVICEREMOVED,

    /** Sent when a device is being removed, prior to removing each of
        its controls */
    SDL_HOTPLUG_DEVICELOCKED,

    /** Sent once the device itself and all of its static controls
        have been initialized */
    SDL_HOTPLUG_DEVICEREADY,

    /** Sent when a control is added to the device or when static
        controls are enumerated */
    SDL_HOTPLUG_CONTROLADDED,

    /** Sent when a dynamic control is removed or when the associated
        device is being removed */
    SDL_HOTPLUG_CONTROLREMOVED
} SDL_HotplugEventType;

typedef struct SDL_HotplugEvent {
    Uint32 type;  /**< always SDL_HOTPLUG */
    SDL_DeviceID device;
    SDL_ControlID control;
    SDL_HotplugEventType event;
} SDL_HotplugEvent;

typedef struct SDL_InputEventHeader {
    Uint32 type;  /**< always SDL_INPUT */
    SDL_DeviceID device;
    SDL_ControlID control;
    Uint32 event; /**< interpretation varies by control type */
} SDL_InputEventHeader;

typedef union SDL_InputEvent {
    SDL_InputEventHeader header;
    /* Control type specific structures go here… */
} SDL_InputEvent;

A multitouch screen could then be implemented by adding and removing
pointer controls.

Biggest annoyance I see here is that keyboards would require a large
number of static controls (101+ buttons and a text control). I doubt
that would be much of an issue though.

This could also allow you to create your own virtual devices, which
would be handy if you wanted to, for example, play back recorded input
from a file (using SDL_AddDevice, SDL_AddControl, SDL_SendInput,
SDL_RemoveControl, SDL_RemoveDevice).

  1. Can the existing SDL multitouch interface for the iPhone be adapted
    to the other major OSes?
  2. I believe we need device-added and device-removed events. Do we
    really? And if so, how do we squeeze them in? What information must
    they contain?

I believe most of this work was done for last year’s Summer of Code.

  1. Joysticks versus everything else. Right now the joystick API is
    different from the mouse/keyboard API. There are very good reasons for
    that. Are those reasons still valid? Can we (should we even try) to
    create a single device API for all input devices?

No. It’s a leaky abstraction at best.

  1. Weird devices: When I run “xinput --list” I see that the current
    XInput extension considers the power button to be a keyboard. In fact,
    it finds two of them and I can only find one! I suspect that the new
    XInput is going to also see, and expose, lots of other odd devices.
    What, if anything, should SDL do about those? I might really want to
    get an event telling me the power button has been pushed…

Apparently the x.org developers learned nothing from ALSA. :)

–ryan.

  1. Can the existing SDL multitouch interface for the iPhone be adapted
    to the other major OSes?
  2. I believe we need device-added and device-removed events. Do we
    really? And if so, how do we squeeze them in? What information must
    they contain?

I believe most of this work was done for last year’s Summer of Code.

Where is it? I’ve looked around in gsoc 2008 and 2009 and I don’t see
events for adding or removing devices.

  1. Joysticks versus everything else. Right now the joystick API is
    different from the mouse/keyboard API. There are very good reasons for
    that. Are those reasons still valid? Can we (should we even try) to
    create a single device API for all input devices?

No. It’s a leaky abstraction at best.

Ahh, “no” to which question? I believe the APIs should be different
because people think of them as very different kinds of things. OTOH,
I believe it is pretty simple to describe all input devices as
collections of valuators where there are about three different value
ranges for valuators. But, people don’t think about the devices that
way so I believe we should do what works for people.

  1. Weird devices: When I run “xinput --list” I see that the current
    XInput extension considers the power button to be a keyboard. In fact,
    it finds two of them and I can only find one! I suspect that the new
    XInput is going to also see, and expose, lots of other odd devices.
    What, if anything, should SDL do about those? I might really want to
    get an event telling me the power button has been pushed…

Apparently the x.org developers learned nothing from ALSA. :)

Could you expand on that?

Bob Pendleton

On Sun, Oct 4, 2009 at 1:15 PM, Ryan C. Gordon wrote:

–ryan.


SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org


+-----------------------------------------------------------

2009/10/4 Bob Pendleton:

  1. Can the existing SDL multitouch interface for the iPhone be adapted
    to the other major OSes?
  2. I believe we need device-added and device-removed events. Do we
    really? And if so, how do we squeeze them in? What information must
    they contain?

I believe most of this work was done for last year’s Summer of Code.

Where is it? I’ve looked around in gsoc 2008 and 2009 and I don’t see
events for adding or removing devices.

I started on this, but ran into some problems with the mouse API. It
kinda falls apart when you try to remove a device. It looks like
there was some attempt to add support for adding and removing mice
before then, but since it was never actually used, I’m not sure if it
was killed off by patches or never worked in the first place. Either
way, it doesn’t seem very well thought out.

  1. Joysticks versus everything else. Right now the joystick API is
    different from the mouse/keyboard API. There are very good reasons for
    that. Are those reasons still valid? Can we (should we even try) to
    create a single device API for all input devices?

No. It’s a leaky abstraction at best.

Joysticks are traditionally polled, whereas mice and keyboards are
traditionally event based. It seems joystick support is almost an
afterthought on most systems. Mouse and keyboard is used by almost
all applications, but joystick is typically used only for games, so
mouse and keyboard is implemented first, then joysticks are tacked on
as a separate API… So yeah, there are reasons why it’s separate,
but I don’t see any reason to keep it separate any longer.

Ahh, “no” to which question? I believe the APIs should be different
because people think of them as very different kinds of things. OTOH,
I believe it is pretty simple to describe all input devices as
collections of valuators where there are about three different value
ranges for valuators. But, people don’t think about the devices that
way so I believe we should do what works for people.

I believe changing a binding from a joystick button to keyboard key or
vice versa should involve changing no more than two variables (device
id and button id). I see no reason to force some huge change in event
handling for such a simple change in program behaviour.

On Sun, Oct 4, 2009 at 1:15 PM, Ryan C. Gordon wrote:

2009/10/4 Bob Pendleton:

  1. Can the existing SDL multitouch interface for the iPhone be adapted
    to the other major OSes?
  2. I believe we need device-added and device-removed events. Do we
    really? And if so, how do we squeeze them in? What information must
    they contain?

I believe most of this work was done for last year’s Summer of Code.

Where is it? I’ve looked around in gsoc 2008 and 2009 and I don’t see
events for adding or removing devices.

I started on this, but ran into some problems with the mouse API. It
kinda falls apart when you try to remove a device. It looks like
there was some attempt to add support for adding and removing mice
before then, but since it was never actually used, I’m not sure if it
was killed off by patches or never worked in the first place. Either
way, it doesn’t seem very well thought out.

Yeah, I’ll second that.

Right now if you have two mice plugged in to a Linux system and use
them with an SDL application they will work. But, if you unplug one
while the application is running you will get errors when you close
the application because SDL will try to close the unplugged device. If
you plug in a mouse while SDL is running you will get focus events
from it, but no motion events.

Currently 1.3 has no support for multiple keyboards. If you have
several plugged in, all the events come from one device id.

  1. Joysticks versus everything else. Right now the joystick API is
    different from the mouse/keyboard API. There are very good reasons for
    that. Are those reasons still valid? Can we (should we even try) to
    create a single device API for all input devices?

No. It’s a leaky abstraction at best.

Joysticks are traditionally polled, whereas mice and keyboards are
traditionally event based. It seems joystick support is almost an
afterthought on most systems. Mouse and keyboard is used by almost
all applications, but joystick is typically used only for games, so
mouse and keyboard is implemented first, then joysticks are tacked on
as a separate API… So yeah, there are reasons why it’s separate,
but I don’t see any reason to keep it separate any longer.

The reason is psychology. People see them as being mice, keyboards,
and joysticks. OTOH, I think the APIs should follow the same pattern
as much as possible.

Ahh, “no” to which question? I believe the APIs should be different
because people think of them as very different kinds of things. OTOH,
I believe it is pretty simple to describe all input devices as
collections of valuators where there are about three different value
ranges for valuators. But, people don’t think about the devices that
way so I believe we should do what works for people.

I believe changing a binding from a joystick button to keyboard key or
vice versa should involve changing no more than two variables (device
id and button id). I see no reason to force some huge change in event
handling for such a simple change in program behaviour.

Took me a minute to figure out what you are saying here. If I
understand correctly you believe that since all device events should
look the same I should be able to just change the device id and button
id in the event and then hand it off to my preexisting input
handling code. I see where you are coming from: there is no logical
difference between the buttons on a joystick and the keys on a
keyboard.

The problem you run into with that approach is the difference in
roles. The keys on a joystick don’t play a role in creating text
input, while the keys on a keyboard do.

Bob Pendleton

On Sun, Oct 4, 2009 at 3:56 PM, Kenneth Bull wrote:

On Sun, Oct 4, 2009 at 1:15 PM, Ryan C. Gordon wrote:




+-----------------------------------------------------------

2009/10/5 Bob Pendleton:

I believe changing a binding from a joystick button to keyboard key or
vice versa should involve changing no more than two variables (device
id and button id). I see no reason to force some huge change in event
handling for such a simple change in program behaviour.

Took me a minute to figure out what you are saying here. If I
understand correctly you believe that since all device events should
look the same I should be able to just change the device id and button
id in the event and then hand it off to my preexisting input
handling code. I see where you are coming from: there is no logical
difference between the buttons on a joystick and the keys on a
keyboard.

The problem you run into with that approach is the difference in
roles. The keys on a joystick don’t play a role in creating text
input, while the keys on a keyboard do.

The solution there would be to create a separate control to retrieve
the text input.
A keyboard then would have a set of named button controls (‘A’, ‘B’,
‘1’, ‘ESCAPE’, etc) for keypress/release events and a text stream
control for text input events (or a generic packet data stream if you
want to use the same sort of thing for networking).

For example, a user presses the ‘A’ key on device 1 (root keyboard
perhaps), holds it down, then releases it. The events sent look like
this:

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "a");
event.input.event = SDL_BUTTON_DOWN_EVENT;

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "TEXT");
event.input.event = SDL_STREAM_RECV_EVENT;
event.input.recv.length = 1;
event.input.recv.data = strdup("a");
/* data may be freed on the next call to SDL_PollEvent; user is
expected to make their own copy if they need it longer than that (not
really necessary here, but would be for network or pipe streams). */

/* SDL_STREAM_RECV_EVENT repeats while key is held down */

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "a");
event.input.event = SDL_BUTTON_UP_EVENT;

2009/10/5 Bob Pendleton:

I believe changing a binding from a joystick button to keyboard key or
vice versa should involve changing no more than two variables (device
id and button id). I see no reason to force some huge change in event
handling for such a simple change in program behaviour.

Took me a minute to figure out what you are saying here. If I
understand correctly you believe that since all device events should
look the same I should be able to just change the device id and button
id in the event and then hand it off to my preexisting input
handling code. I see where you are coming from: there is no logical
difference between the buttons on a joystick and the keys on a
keyboard.

The problem you run into with that approach is the difference in
roles. The keys on a joystick don’t play a role in creating text
input, while the keys on a keyboard do.

The solution there would be to create a separate control to retrieve
the text input.
A keyboard then would have a set of named button controls (‘A’, ‘B’,
‘1’, ‘ESCAPE’, etc) for keypress/release events and a text stream
control for text input events (or a generic packet data stream if you
want to use the same sort of thing for networking).

For example, a user presses the ‘A’ key on device 1 (root keyboard
perhaps), holds it down, then releases it. The events sent look like
this:

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "a");
event.input.event = SDL_BUTTON_DOWN_EVENT;

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "TEXT");
event.input.event = SDL_STREAM_RECV_EVENT;
event.input.recv.length = 1;
event.input.recv.data = strdup("a");
/* data may be freed on the next call to SDL_PollEvent; user is
expected to make their own copy if they need it longer than that (not
really necessary here, but would be for network or pipe streams). */

/* SDL_STREAM_RECV_EVENT repeats while key is held down */

event.type = SDL_INPUT_EVENT;
event.input.deviceID = 1;
event.input.controlID = SDL_GetControlByName(1, "a");
event.input.event = SDL_BUTTON_UP_EVENT;

You have clearly spent a lot of time thinking about this. Have you
written it up somewhere? And by that I mean have you written up a
document that explains your approach and contrasts it to existing
approaches? I doubt SDL will be changed to look like this, but you
never know… And, I would really like to understand your way of
looking at this problem.

Bob Pendleton

On Mon, Oct 5, 2009 at 4:17 PM, Kenneth Bull wrote:




+-----------------------------------------------------------

2009/10/5 Bob Pendleton:

You have clearly spent a lot of time thinking about this. Have you
written it up somewhere? And by that I mean have you written up a
document that explains your approach and contrasts it to existing
approaches? I doubt SDL will be changed to look like this, but you
never know… And, I would really like to understand your way of
looking at this problem.

It would be a big change and completely incompatible with existing
code, which does make it unlikely without a compatibility layer…

I haven’t really written anything on this, though occasionally bits
and pieces find their way into SDL wrappers and such. I’ll try
writing a standalone library to demonstrate… it’ll be Windows only
though for now.

Where is it? I’ve looked around in gsoc 2008 and 2009 and I don’t see
events for adding or removing devices.

Hmm, perhaps it wasn’t done. I’ll have to check.

No. It’s a leaky abstraction at best.

Ahh, “no” to which question?

No to having a unified API for all input devices. There’s no good reason
for it, and it doesn’t make sense in practice anyhow. A keyboard is a
keyboard, a mouse is a mouse, and a joystick is a joystick, and the user
doesn’t expect (or even want) them to be interchangeable in most cases.
For places where we do want a small amount of overlap (keyboard bindings
to match joystick controls), the amount of programmer effort is
minimal…and probably less code for both the application and SDL than
trying to tapdance around the abstraction.

Apparently the x.org developers learned nothing from ALSA. :)

Could you expand on that?

My Sound Blaster Live card, under ALSA, looks like 18 different devices,
when all I wanted was to adjust the volume. Some of them, like your
missing power button, don’t actually do anything.

More flexibility and abstraction isn’t always a good thing, is all I’m
saying.

Perhaps there are arguments for having this sort of interface at the
lowest level, since the desktop is going to want to pop up a shutdown
confirmation dialog when you press the power button, but it doesn’t
benefit any application to have to find the “power button” device and
figure out when it’s been pushed, when a single event that has nothing
to do with the specific hardware is more useful.

–ryan.

Where is it? I’ve looked around in gsoc 2008 and 2009 and I don’t see
events for adding or removing devices.

Hmm, perhaps it wasn’t done. I’ll have to check.

No. It’s a leaky abstraction at best.

Ahh, “no” to which question?

No to having a unified API for all input devices. There’s no good reason for
it, and it doesn’t make sense in practice anyhow. A keyboard is a keyboard,
a mouse is a mouse, and a joystick is a joystick, and the user doesn’t
expect (or even want) them to be interchangeable in most cases. For places
where we do want a small amount of overlap (keyboard bindings to match
joystick controls), the amount of programmer effort is minimal…and
probably less code for both the application and SDL than trying to tapdance
around the abstraction.

I actually agree with you. People see them as separate classes of
things so they should be treated as separate classes of things. OTOH,
once a programmer has spent time thinking about a unified model for
joysticks, they start trying to fit everything into that model. Let me
run this past you. Do we need more classes of devices than just mouse,
keyboard, and joystick? Does a pistol fit the model of a joystick?
Does a steering wheel + gas pedal gizmo fit users’ mental model of a
joystick? What about a Wiimote? Do we need to broaden our thinking
about the classes of input devices?

Apparently the x.org developers learned nothing from ALSA. :)

Could you expand on that?

My Sound Blaster Live card, under ALSA, looks like 18 different devices,
when all I wanted was to adjust the volume. Some of them, like your missing
power button, don’t actually do anything.

More flexibility and abstraction isn’t always a good thing, is all I’m
saying.

Perhaps there are arguments for having this sort of interface at the
lowest level, since the desktop is going to want to pop up a shutdown
confirmation dialog when you press the power button, but it doesn’t benefit
any application to have to find the “power button” device and figure out
when it’s been pushed, when a single event that has nothing to do with the
specific hardware is more useful.

Very good point. Looks like we need to filter the devices that are
made visible to the end user. Right now my X server reports a virtual
mouse that is just used to emulate Mac mouse buttons. Only physical
devices should be passed on through SDL. And, only those devices that
make sense for applications.

Bob Pendleton

On Tue, Oct 6, 2009 at 7:56 AM, Ryan C. Gordon wrote:

–ryan.




+-----------------------------------------------------------

than just mouse, keyboard, and joystick? Does a pistol fit the model
of a joystick? Does a steering wheel + gas pedal gizmo fit users’
mental model of a joystick? What about a Wiimote? Do we need to
broaden our thinking about the classes of input devices?

That’s a good question. Let me think about that awhile.

–ryan.

I just got a Taiko drum for my PSX! :) I was about to hook it up to my
laptop via a PSX->USB converter and see what it looked like.

On Wed, Oct 07, 2009 at 09:25:49PM -0400, Ryan C. Gordon wrote:

That’s a good question. Let me think about that awhile.


-bill!
Sent from my computer

And FYI:

Namco Taiko controller via PSX->USB adapter

Joystick values in Linux:

main drum left side = axis 0 @ -32767
main drum right side = button 1 on
drum edge left = button 6 on
drum edge right = button 7 on
start = button 9 on
select = button 8 on

-bill!

On Wed, Oct 07, 2009 at 06:31:24PM -0700, Bill Kendrick wrote:

I just got a Taiko drum for my PSX! :) I was about to hook it up to my
laptop via a PSX->USB converter and see what it looked like.

2009/10/7 Bob Pendleton:

I actually agree with you. People see them as separate classes of
things so they should be treated as separate classes of things. OTOH,
once a programmer has spent time thinking about a unified model for
joysticks, they start trying to fit everything into that model. Let me
run this past you. Do we need more classes of devices than just mouse,
keyboard, and joystick? Does a pistol fit the model of a joystick?
Does a steering wheel + gas pedal gizmo fit users’ mental model of a
joystick? What about a Wiimote? Do we need to broaden our thinking
about the classes of input devices?

You can’t cover every possible device. You can however cover the vast
majority of controls…

A pistol is a pointer (two axes) and a button, a steering wheel is an
axis and a pressure sensor (which you can treat as an axis), a wiimote
is three accelerometers (again, can be treated as axes) and a batch of
buttons, a mouse is a pair of axes and a set of buttons, a keyboard is
a set of buttons (from which you produce a data stream from the ids of
pressed buttons), a multitouch system is a set of pointers and
pressure sensors (axes again), …

There isn’t much that can’t be described as an axis or button, and if
all else fails, you can simply describe it as a data stream.

An accurate drum set is probably one of the more difficult devices
actually, since you want to record how hard/fast you hit it. But
that’s either an accelerometer or an amplitude measurement from a
microphone (axes), and most just end up buttons.

I’d much rather write code for two or three different control types
than a few dozen devices.

Also, it kinda bugs me when I have a game I can control only with a
specific device. I much prefer games where I can map any action to
any control on any device. Not everyone wants WASD or arrow keys.
Making keyboard keys, mouse buttons and joystick buttons the same
makes it much easier to switch between them. Even axes can be
buttonified if you make a virtual button which is pressed when the
axis is positive, and another which is pressed when the axis is
negative.

Very good point. Looks like we need to filter the devices that are
made visible to the end user. Right now my X server reports a virtual
mouse that is just used to emulate Mac mouse buttons. Only physical
devices should be passed on through SDL. And, only those devices that
make sense for applications.

You do want some sort of filter so you don’t end up assigning “move
left” or “fire” to the same button that shuts down the computer, but a
programmer may really want access to that power button… Basically,
filters are good, but only if there’s a way around them.

2009/10/7 Bill Kendrick:

On Wed, Oct 07, 2009 at 06:31:24PM -0700, Bill Kendrick wrote:

Joystick values in Linux:

main drum left side = axis 0 @ -32767
main drum right side = button 1 on
drum edge left = button 6 on
drum edge right = button 7 on
start = button 9 on
select = button 8 on

why would they map it to an axis?

Any idea if PSX is USB with a weird connector, or is the adapter
assigning these values?

The hid spec ( http://www.usb.org/developers/hidpage/ ) probably has a
specific usage for drums too, though I can understand why they
wouldn’t use that here. No one really wants to write drivers for
every possible variation of every possible device.

2009/10/7 Bill Kendrick:

Joystick values in Linux:

main drum left side = axis 0 @ -32767
main drum right side = button 1 on
drum edge left = button 6 on
drum edge right = button 7 on
start = button 9 on
select = button 8 on

why would they map it to an axis?

Any idea if PSX is USB with a weird connector, or is the adapter
assigning these values?

No, PS1/PS2 controllers are not USB; the USB adapters assign arbitrary
actions that vary from one adapter brand to another (they’re all
unlicensed peripherals anyway, so there is no official standard). In
this case the “main drum left side” is most likely assigned to axis 0
because the drum has bound it to pushing “left” on the D-pad, and the
adapter binds the D-pad to the X/Y axis.

The hid spec ( http://www.usb.org/developers/hidpage/ ) probably has a
specific usage for drums too, though I can understand why they
wouldn’t use that here. No one really wants to write drivers for
every possible variation of every possible device.

The USB adapter has no way to know what kind of special controller it
is. It only sees normal controllers, and binds actions accordingly.
This is how all PS1/PS2 special controllers work (Taiko drums, GH
guitars, the Beatmania pad, etc.), they just bind their functions to
existing controller actions, and the appropriate game knows what
controller actions to look for. Ideally they use action assignments
that also make it possible to use a normal controller to play the game
without being too weird.

On Thu, Oct 8, 2009 at 01:05, Kenneth Bull wrote:

On Wed, Oct 07, 2009 at 06:31:24PM -0700, Bill Kendrick wrote:


why would they map it to an axis?

No idea. I haven’t messed with the device in any other games
(i.e., something that uses a regular Dual-shock or whatever)

Any idea if PSX is USB with a weird connector, or is the adapter
assigning these values?

Reading a follow-up to this… apparently.

Now, the thing I notice is that the drum is VERY insensitive when I use
it under Linux. I need to bash or mangle it really hard to get it to
register. Much moreso than when using it in Taiko Drum Master on the PS2.

Of course, the game is a PS2 game, so the controller is for PS2.
And IIRC, the PS2’s fire buttons added pressure sensitivity (something
the PSX did not have). Considering this PSX->USB adapter is for PSX
controllers, and not necessarily PS2, perhaps that’s the cause of this
sensitivity issue.

Which is sad… I’d love to try to write a little game (if anything,
for my toddler son to bang on). For now we’ll have to just live with the
game that came with the controller. (Err, I guess it was vice-versa, but
anyway ;) )

-bill!

On Thu, Oct 08, 2009 at 01:05:11AM -0400, Kenneth Bull wrote:

Actually, the Taiko Drum gets detected by the game on the PS2.
I can play it two players - one user using a drum, the other using
a dual-shock controller, and it knows they are different.
(The UI shows what to do to control menus/etc. On the left, it shows
you’re supposed to bang the center for OK, or the left/right sides to
change menuitems. On the right, it shows a d-pad for menu and an “X” button
for OK.)

So, again, somehow the PS2 can detect drum vs. dpad. I’m betting this
ancient PSX->USB adapter I have cannot. (At least, not using the standard
input drivers that Linux throws at it. It just sees “N-axes joystick”…
probably even if nothing is connected to the PSX end of the cable.)

-bill!

On Thu, Oct 08, 2009 at 10:05:00AM -0400, Simon Roby wrote:

The USB adapter has no way to know what kind of special controller it
is. It only sees normal controllers, and binds actions accordingly.
This is how all PS1/PS2 special controllers work (Taiko drums, GH
guitars, the Beatmania pad, etc.), they just bind their functions to
existing controller actions, and the appropriate game knows what
controller actions to look for. Ideally they use action assignments
that also make it possible to use a normal controller to play the game
without being too weird.

I just got a Taiko drum for my PSX! :) I was about to hook it up to my
laptop via a PSX->USB converter and see what it looked like.

And FYI:

Namco Taiko controller via PSX->USB adapter

Joystick values in Linux:

main drum left side = axis 0 @ -32767
main drum right side = button 1 on
drum edge left = button 6 on
drum edge right = button 7 on
start = button 9 on
select = button 8 on

Which utility did you use to get this info?

Bob Pendleton

On Wed, Oct 7, 2009 at 8:38 PM, Bill Kendrick wrote:

On Wed, Oct 07, 2009 at 06:31:24PM -0700, Bill Kendrick wrote:

-bill!




+-----------------------------------------------------------