Possible problem in the current joystick code

Hi!

I started investigating why some joysticks seem to have a deadzone in DOSBox and others don't. While digging around, I found something that looked wrong to me.

I’ll apologize in advance if this has been covered somewhere.

In the SDL 1.2.14 source code, in the src\joystick\win32\SDL_mmjoystick.c file, there's a section of code that initializes the joystick axis information:

Code:

for ( i = 0; i < MAX_AXES; ++i ) {
	if ( (i<2) || (SYS_Joystick[index].wCaps & caps_flags[i-2]) ) {
		joystick->hwdata->transaxis[i].offset =
			AXIS_MIN - axis_min[i];
		joystick->hwdata->transaxis[i].scale =
			(float)(AXIS_MAX - AXIS_MIN) / (axis_max[i] - axis_min[i]);
	} else {
		joystick->hwdata->transaxis[i].offset = 0;
		joystick->hwdata->transaxis[i].scale = 1.0; /* Just in case */
	}
}

Note that the offset is calculated without reference to the scale. When processing joystick input, the offset is added to the Windows-reported position first, and the scale is then applied:

Code:
value = (int)(((float)pos[i] + transaxis[i].offset) * transaxis[i].scale);

I think this generates really bad values. Consider a joystick that reports its position as +/- 10,000. SDL converts all axes to the range -32,768..32,767. So the code above generates an offset of -22,768 and a scale of about 3.28.

Plugging in a raw position of 0 should generate an output position of 0. But using the second equation, it's (0 + -22,768) * 3.28, or roughly -74,600: far outside the valid range.
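To illustrate, here is a small self-contained sketch of the transform exactly as the current code computes and applies it. AXIS_MIN/AXIS_MAX are SDL's values; the +/- 10,000 device range is the hypothetical one from the example above:

```c
#include <assert.h>

#define AXIS_MIN -32768
#define AXIS_MAX  32767

/* Hypothetical device that reports +/- 10,000, as in the example above. */
#define DEV_MIN -10000
#define DEV_MAX  10000

/* Current behavior: the offset ignores the scale, and is then
 * added to the raw position BEFORE the scale is applied. */
static int transform_current(int pos)
{
    float offset = (float)(AXIS_MIN - DEV_MIN);                        /* -22768 */
    float scale  = (float)(AXIS_MAX - AXIS_MIN) / (DEV_MAX - DEV_MIN); /* ~3.28  */
    return (int)(((float)pos + offset) * scale);
}
```

A centered stick (raw 0) comes out near -74,600, and even the device's own extremes don't land on AXIS_MIN/AXIS_MAX, so every reported position is garbage whenever the scale isn't 1.0.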

This might explain why games such as Privateer start with some joysticks stuck in the upper left, and why the calibration routine fails.

I believe the scale needs to be applied to the axis_min value before the offset is calculated, and the raw data value then needs to be scaled before the offset is applied.

Using those rules, the offset works out to roughly 0 for the example joystick and the scale stays the same as above. This makes a raw 0 map to an output of 0.

So, the updated code would be:

Code:

for ( i = 0; i < MAX_AXES; ++i ) {
	if ( (i<2) || (SYS_Joystick[index].wCaps & caps_flags[i-2]) ) {
		joystick->hwdata->transaxis[i].scale =
			(float)(AXIS_MAX - AXIS_MIN) / (axis_max[i] - axis_min[i]);
		joystick->hwdata->transaxis[i].offset =
			AXIS_MIN - (axis_min[i] * joystick->hwdata->transaxis[i].scale);
	} else {
		joystick->hwdata->transaxis[i].offset = 0;
		joystick->hwdata->transaxis[i].scale = 1.0; /* Just in case */
	}
}

and

Code:
value = (int)(((float)pos[i] * transaxis[i].scale) + transaxis[i].offset);

Now… as far as I can tell, all the joysticks I tested in my Win7 x64 environment report a range of 0..65,535, so the current code happens to work (the scale works out to 1.0, and with a scale of 1.0 the two equations give the same result), but I still think that code is buggy and should be changed.
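For what it's worth, here's a quick self-contained check of the reordered math (the device ranges are hypothetical): it maps the device minimum and center correctly for a +/- 10,000 device, and for a 0..65,535 device it reduces to the same pos - 32,768 result the current code produces, which is why those joysticks appear to work today.

```c
#include <assert.h>

#define AXIS_MIN -32768
#define AXIS_MAX  32767

/* Proposed fix: compute the scale first, fold the scaled device
 * minimum into the offset, and apply scale-then-offset to raw input.
 * The device range is a parameter so different devices can be checked. */
static int transform_fixed(int pos, int dev_min, int dev_max)
{
    float scale  = (float)(AXIS_MAX - AXIS_MIN) / (dev_max - dev_min);
    float offset = AXIS_MIN - dev_min * scale;
    return (int)((float)pos * scale + offset);
}
```

One small caveat: because the scale is a float and the final cast truncates toward zero, the device maximum can land one count shy of AXIS_MAX; rounding to nearest instead of truncating would tighten that up.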