GSoC 2008 - SDL 1.3 - Nintendo DS

A brief summary of what’s been done on the gsoc2008_nds branch thus far:

After a wild goose chase searching for a mysterious bug in the graphics
driver, a simple 2D software renderer has been added using the SDL 1.3
graphics driver paradigm. This driver uses the Nintendo DS framebuffer
and is largely unoptimized; its main use for now is to provide some kind
of functional video output while other features are added. From the
looks of it, the design of SDL 1.3 video drivers seems to be consistent
with some of the ideas I’ve had for implementing a DS-optimized driver,
and should make writing such an optimized driver easier than it would
have been with SDL 1.2.

Other future plans regarding work on the NDS video driver include some
way for the OpenGL-ish 3D interface provided by libnds to be usable in
tandem with SDL; this probably means implementing another driver that
uses said GL-ish interface to do the actual rendering, as the existing
OpenGL drivers for SDL already do. (Perhaps I could even simply add
some code to those existing GL drivers instead!)

Beyond video, I’ve duplicated the joystick functionality from the 1.2
port with a few minor changes and fixes so that it works as one would
expect. From there, I used the existing joystick implementation as a
skeleton for a touchscreen API, so that it would be consistent with the
joystick API in terms of use. It’s still rough around the edges and I’m
currently working on refining the way it reports events – anyone who’s
seen the footage of my touch demo (and who has very good eyes to see
through the poor quality) might be able to tell that the SDL event loop
I used didn’t always catch the “down” and “released” actions.

In designing the touch API, I tried to keep multi-touch in mind. I
don’t have any multi-touch capable devices, so I have no real means of
testing my designs, but nonetheless the code allows for an arbitrary
number of simultaneous touches, up to any given maximum. The
coordinates given by the touch API for each touch point are
(x,y,pressure), where x and y are signed integers between -32768 and
32767, and (0,0) is the center of the screen; pressure is an unsigned
integer between 1 and 65535 while the screen is being touched, and 0
when it is not. This should hopefully provide sufficient precision as
larger and more precise touchscreens become available, so long as the
machines can handle 16-bit integer arithmetic.
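Roughly speaking, a touch point in this scheme could be represented
something like the sketch below; the struct and field names are made up
for illustration, not the actual declarations in the branch.

#include <stdint.h>

/* Hypothetical representation of one touch point, as described above. */
typedef struct {
    int16_t  x;         /* -32768..32767, (0,0) at the center of the screen */
    int16_t  y;
    uint16_t pressure;  /* 0 = not touching, 1..65535 = touching */
} NDS_TouchPoint;

/* Given a fixed maximum, any number of simultaneous touches up to that
   maximum can be reported; the DS itself only ever fills slot 0. */
#define NDS_MAX_TOUCHES 8
static NDS_TouchPoint nds_touches[NDS_MAX_TOUCHES];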

I also plan to have some sort of flag that enables “mouse emulation” by
the SDL touchscreen API, where it would report mouse events rather than
touchscreen events. This should allow for better compatibility with
legacy programs. Thoughts on this?
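As a sketch of what that flag could look like (the name here is made up
purely for illustration; nothing like it exists in the branch yet):

/* Hypothetical toggle: when enabled, the touchscreen backend would push
   mouse motion/button events instead of touchscreen events. */
static int nds_touch_emulate_mouse = 0;  /* off by default */

void NDS_SetTouchMouseEmulation(int enable)
{
    nds_touch_emulate_mouse = enable;
}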

Also, here are a couple of (poor quality, sorry!) videos showing some
test programs running on the hardware.
http://youtube.com/watch?v=SBYHTfK9Wec
http://youtube.com/watch?v=7zxyNtDpkqE



Darren
http://lifning.americankryptonite.net/blag/

Is there a specific reason that the center of the screen is the origin (0, 0)? It doesn’t seem right to have to convert your touch coords to app coords using the screen size when the origin could just be the top left. Another thing to note is the ‘center’ of the screen is subpixel since most resolutions are even numbers. This imposes an arbitrary decision of which of the four center pixels represents (0, 0).

Jonny D


Fair point. My reasoning for scaling it to such a system is that the
video resolution can be different from the native touchscreen
resolution; that is, an 800x600 touchscreen device might have an
application running stretched at 400x300 on it… There wasn’t any
strong reason for making the center the origin besides it looking
mathematically appealing at the time, so I’ll certainly consider making
it the top-left and just using the range [0,65535].

Perhaps there should be “application” and “system” coordinates given by
the events, though? Like, one that’s presented in terms of the current
video resolution, and one that’s consistently represented within a fixed
range of integers to give the position in terms of the hardware screen
itself.

Another thing I guess is worth considering is that the touch hardware
doesn’t always give its touch position in terms of pixels. As you said,
often enough the “center” is sub-pixel. The (x,y) coordinates given by
the hardware can be sub-pixel, too… Especially in my case =P The
Nintendo DS’ registers for touch coordinates as presented by libnds give
us two sets of (x,y) coordinates: one that’s in the range of a 256x192
rectangle, and one that’s in the range of a $1000x$1000 rectangle/square
(hex). Both of these, clearly, might be important for different
developers’ needs; one using the touch points in terms of pixels on a
user interface, versus one using the touch motion as a means of
controlling a ball’s acceleration in a video game.
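To make the “application” versus “system” idea above concrete, here is
a rough sketch of how both sets of coordinates could be derived from a
libnds-style touch sample. The field names are meant to mirror libnds’
touchPosition (px/py for pixels, x/y for the raw reading), but treat
the exact names and ranges as approximate.

#include <stdint.h>

/* Stand-in for the libnds touch sample; field meanings are assumptions. */
typedef struct {
    uint16_t x, y;    /* raw readings, roughly 0..0xFFF */
    uint16_t px, py;  /* pixel coordinates, 0..255 and 0..191 */
} touch_sample;

/* "Application" coordinates: just the pixel values on the 256x192 screen. */
static void touch_to_pixels(const touch_sample *t, int *outx, int *outy)
{
    *outx = t->px;
    *outy = t->py;
}

/* "System" coordinates: scale the 12-bit raw reading up to [0,65535],
   keeping the extra sub-pixel precision the hardware provides. */
static void touch_to_normalized(const touch_sample *t,
                                uint16_t *outx, uint16_t *outy)
{
    *outx = (uint16_t)(((uint32_t)t->x * 65535u) / 0xFFFu);
    *outy = (uint16_t)(((uint32_t)t->y * 65535u) / 0xFFFu);
}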

Of course, the “sub-pixel” resolution is often an even number, too.
And my saying that the point (0,0) is the center of
[-32768,32767]x[-32768,32767] is wrong to begin with; the actual center
would be (-0.5,-0.5), which obviously isn’t made of integers. So (0,0)
as the “center” would actually be the bottom-right of the four center
points. It does seem kind of arbitrary, as you pointed out.

Not that it would change terribly much by just adding 32768 to all the
numbers involved; it would just make that “center-ish” point be
(32768,32768) instead. Of course, the “center” of all these coordinates
isn’t all that important; it’s just like asking what the center of an
even-dimensioned video screen is: sub-pixel, obviously. I pretty much
said that (0,0) was the center as a convenient way of explaining the
system.

I guess the code and arithmetic used by the developer would be easier to
deal with if (0,0) were at the top-left, as it is with the video screen
itself, so I’ll probably go ahead and make that change. Thanks for your
input =)
Darren
http://lifning.americankryptonite.net/blag/


Hey Darren,

I don’t believe the DS supports multi touch. I could be wrong but at the
level that I have worked with it, when you touch it in two places you seem
to get the average (kind of…it’s weird and seemed unreliable).

I saw a homebrew DS paint program once that supported pressure
sensitivity; the standard API stuff doesn’t expose that ability, but
there must be a way of getting that data if you are interested in
putting it in!

And lastly, the weird thing about having the DS generate mouse events is
that you can’t have a “mouse position” without a “mouse drag” event
happening if you know what I mean.

If you are touching the screen, that’s like the mouse button is down but
it’s also the only way to have a mouse position.

Maybe this is something you are already considering in your events but just
wanted to point that out just in case.

My 2 cents in case any of it helps (:


Hey Darren,

I don’t believe the DS supports multi touch. I could be wrong but at the
level that I have worked with it, when you touch it in two places you seem
to get the average (kind of…it’s weird and seemed unreliable).

I saw a homebrew DS paint program once that supported pressure
sensitivity; the standard API stuff doesn’t expose that ability, but
there must be a way of getting that data if you are interested in
putting it in!

Oh, no! Of course the DS doesn’t have multi-touch, but we want SDL’s
API to support devices that do, right? Sorry for the confusion there =)
It can be implemented just fine for single-touch devices like the DS,
as I have done; it just only ever reports one touch point.

And lastly, the weird thing about having the DS generate mouse events is
that you can’t have a “mouse position” without a “mouse drag” event
happening if you know what I mean.

If you are touching the screen, that’s like the mouse button is down but
it’s also the only way to have a mouse position.

Maybe this is something you are already considering in your events but just
wanted to point that out just in case.

My 2 cents in case any of it helps (:

Yeah, touchscreen mouse emulation is odd like that: there can’t be a
change in coordinates without a click. Perhaps I’ll adopt the kind of
“double-tap” thing some laptop touchpads do… This is more of an extra
feature than anything, anyhow, for quick support of legacy
applications.
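Something along these lines is what I have in mind for the double-tap
heuristic; the threshold and the millisecond timing source are just
placeholders, not anything that exists in the branch.

#include <stdint.h>

#define DOUBLE_TAP_MS 300  /* placeholder threshold */

typedef struct {
    uint32_t last_down_ms;  /* time of the previous stylus-down */
} tap_state;

/* Returns 1 if this stylus-down follows the previous one closely enough
   to count as a double-tap, i.e. an emulated mouse click. */
static int is_double_tap(tap_state *s, uint32_t now_ms)
{
    int click = (now_ms - s->last_down_ms) <= DOUBLE_TAP_MS;
    s->last_down_ms = now_ms;
    return click;
}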

Thanks for your input!

(The lines quoted above are from Alan Wolfe’s message of Fri,
2008-06-27 at 23:34 -0700.)


Darren
http://lifning.americankryptonite.net/blag/

Perhaps I’ll adopt the kind of
"double-tap" thing some laptop touchpads do…

please don’t, that is so annoying.

matt

Hahaha, I know… I’ve considered other options, too, like letting the
L and R buttons on the DS represent left and right mouse buttons, but
that would effectively sacrifice them as joystick buttons. Also, that
might not translate well to other platforms with a touchscreen but
without any buttons. So as much as I hate that damn double-tap-to-click
paradigm we see on laptops, I don’t know what other options I have
here… Single-tap would arguably be worse =P
Darren
http://lifning.americankryptonite.net/blag/


Hey Darren,

In designing the touch API, I tried to keep multi-touch in mind. I
don’t have any multi-touch capable devices, so I have no real means of
testing my designs, but nonetheless the code allows for an arbitrary
number of simultaneous touches, up to any given maximum. The
coordinates given by the touch API for each touch point are
(x,y,pressure), where x and y are signed integers between -32768 and
32767, and (0,0) is the center of the screen; pressure is an unsigned
integer between 1 and 65535 while the screen is being touched, and 0
when it is not. This should hopefully provide sufficient precision as
larger and more precise touchscreens become available, so long as the
machines can handle 16-bit integer arithmetic.

Definitely keep things in terms of SDL screen coordinates (in
pixels). I would say to use floating point instead of larger numbers.

Also, it’d be good to have a function that tells you if your device is
pressure sensitive. The iPhone’s screen is not (at least to the
programmer) pressure sensitive.

I also plan to have some sort of flag that enables “mouse emulation” by
the SDL touchscreen API, where it would report mouse events rather than
touchscreen events. This should allow for better compatibility with
legacy programs. Thoughts on this?

A mouse emulation flag would definitely be good. For lack of a Touch
API, I currently have Touch events emulated as multiple mice. With
the Touch API fully implemented, of course, you’d want to limit the
emulation to a single mouse, and ignore touches that begin while one
is already active.

In another one of your messages you mentioned the problem of only
having coordinates when a touch is “down” (ie, finger/stylus is on the
screen), and the problem of generating mouseDown events when you just
want to move your emulated mouse. You also mentioned some possible
solutions. Here is what I’m thinking we could do to solve these
problems:

I’m thinking we could have two different mouse emulation modes.

Mode 1 would simulate the mouse in every respect possible, and would
be for the most stringent compatibility cases. In this mode, a
virtual cursor would even be displayed for where the mouse is located
on the screen. Tapping or dragging anywhere on the screen causes the
cursor to move to this position. Tapping the cursor itself emulates a
click, perhaps double-tapping could emulate a right-click.

Mode 2 would be less strict emulation. In this mode, no cursor would
be displayed. Starting a touch would generate a mouseDown event, and
ending one would generate a mouseUp event. Dragging a touch would
cause a mouseMoved event. It would be more elegant than Mode 1, but
it suffers from not having a definite location for the mouse when no
touches are active, and causes “clicking” type events when you may
have wished only to move the mouse around. Mode 2 is basically what
I have on iPhone right now, minus the multi-touch.
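To illustrate the Mode 2 mapping (and the rule of ignoring touches that
begin while one is already driving the emulated mouse), here is a rough
sketch; the event-posting calls are stand-in stubs, not actual SDL
functions.

#include <stdio.h>

/* Stubs standing in for whatever the real event-posting calls would be. */
static void post_mouse_motion(int x, int y) { printf("motion %d,%d\n", x, y); }
static void post_mouse_button(int down)     { printf("button %s\n", down ? "down" : "up"); }

enum touch_phase { TOUCH_BEGIN, TOUCH_MOVE, TOUCH_END };

static int active_touch = -1;  /* id of the touch driving the emulated mouse */

static void mode2_emulate(int touch_id, enum touch_phase phase, int x, int y)
{
    /* Ignore touches that begin while another one is already active. */
    if (active_touch != -1 && touch_id != active_touch)
        return;

    switch (phase) {
    case TOUCH_BEGIN:
        active_touch = touch_id;
        post_mouse_motion(x, y);   /* move the emulated cursor first */
        post_mouse_button(1);      /* then press the button */
        break;
    case TOUCH_MOVE:
        post_mouse_motion(x, y);
        break;
    case TOUCH_END:
        post_mouse_button(0);      /* release; the position stays put */
        active_touch = -1;
        break;
    }
}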

Tell me what you think.

  • Holmes

When you multitouch a DS, I don’t think you get the average. It’s more like a preference for one finger, but you can use it to cheat in certain games! If you put a finger down then start tapping another finger elsewhere, you get super-fast drag movement. This can be used in Mario 64 DS minigames for hitting shells or clearing the screen of shadow. I can’t tell if that’s a hardware effect, though. Could Darren tell us?

Jonny D


That’s the hardware, as far as I can tell. It effectively gives an
average of all the touches, weighted by their pressure. So if you have
both of your fingers on opposite sides of the screen pushing equally
hard, the point will be guessed as the center of the screen. If you
press harder on the right side, the point will creep further to the
right. If you let go of one finger, it will register the touch point as
moving quickly to the other finger.
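As a mental model of that behavior (just a sketch of the effect, not a
claim about what the panel actually computes internally), it acts
roughly like a pressure-weighted average of the two contact points:

/* Assumes at least one contact has nonzero pressure. */
static void blended_touch(int x1, int y1, int p1,
                          int x2, int y2, int p2,
                          int *outx, int *outy)
{
    *outx = (x1 * p1 + x2 * p2) / (p1 + p2);
    *outy = (y1 * p1 + y2 * p2) / (p1 + p2);
}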
Darren
http://lifning.americankryptonite.net/blag/


Holmes Futrell <hfutrell umail.ucsb.edu> writes:

Definitely keep things in terms of SDL screen coordinates (in
pixels). I would say to use floating point instead of larger numbers.

Also, it’d be good to have a function that tells you if your device is
pressure sensitive. The iPhone’s screen is not (at least to the
programmer) pressure sensitive.

Using a double or a float, with a range of 0 to 1.0, could be a good way
of representing pressure values. On a system that has no pressure
sensitivity, the result would simply be 0 or 1; on a system that does
report pressure, it would be a fraction from no pressure up to maximum
pressure.
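A minimal sketch of that mapping, assuming the driver sees some raw
integer reading and knows its maximum (both names here are made up for
illustration):

/* Map a raw pressure reading onto [0.0, 1.0].  On hardware with no real
   pressure sensing, raw would only ever be 0 or raw_max, giving 0 or 1. */
static float normalize_pressure(unsigned raw, unsigned raw_max)
{
    if (raw_max == 0)
        return raw ? 1.0f : 0.0f;
    return (float)raw / (float)raw_max;
}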

Andre


i prefer uint8 or uint16, or whatever the system is. i think messing
with floats is more work.

matt

Agreed. Many systems lack FPUs (thinking back to the ARM- and MIPS-based
Linux handhelds I’ve played with, and to BREW and J2ME mobile development).



-bill!
“Tux Paint” - free children’s drawing software for Windows / Mac OS X / Linux!
Download it today! http://www.tuxpaint.org/

As well as the DS, which is also ARM-based =)


Darren
http://lifning.americankryptonite.net/blag/

Hey,

I’m trying to build the blitter separately from the rest of SDL so I can try to add functionality to it, but I’m having trouble compiling and linking the hermes asm code. I’m using CodeBlocks with Mingw/GCC on WinXP. When I build my project with the asm files included, Mingw doesn’t output the object files…

I get this compilation error:
mingw32-g++.exe: mmx_main.asm: linker input file unused because linking not done

And, of course, this linking error:
mingw32-g++.exe: .objs\mmx_main.o: No such file or directory

How should I be compiling and linking asm code?

Jonny D

Hello !

I’m trying to build the blitter separately from the rest of SDL so I
can try to add functionality to it, but I’m having trouble
compiling and linking the hermes asm code. I’m using CodeBlocks
with Mingw/GCC on WinXP. When I build my project with the asm
files included, Mingw doesn’t output the object files…

I get this compilation error:
mingw32-g++.exe: mmx_main.asm: linker input file unused because
linking not done

And, of course, this linking error:
mingw32-g++.exe: .objs\mmx_main.o: No such file or directory

I think you need nasm or yasm to build object files
from the ASM files and then link the created object files
with MinGW.

CU

Thanks Torsten,

yasm works great. It all built and linked easily.
In case this helps anyone, I built my objects with this:
yasm -f elf -o myfile.o myfile.asm
Then linked with g++.

Jonny D


Hello !

yasm works great. It all built and linked easily.
In case this helps anyone, I built my objects with this:
yasm -f elf -o myfile.o myfile.asm
Then linked with g++.

You can also use MSYS and MinGW
to build SDL easily using the configure scripts.

CU

Darren Alton wrote:

A brief summary of what’s been done on the gsoc2008_nds branch thus far:

Heya!
Just wondering if you have any updates for us? I’ve merely got a passing
interest, nothing hard and fast at the moment. Curiosity has bitten me,
though; so much so that I went to the URL in your sig block thinking you
might have something there, but the domain name fails to resolve for me :(

Cheers!

Pete.

P.S. If this comes through as a double post, my apologies. I forgot to
set the correct From: address, and subsequently my original message is
pending approval. There’s always the chance that approval will happen
before I get this duplicate out, or that the original wrong mail won’t
ever get approved ;)

Hi there! Yes, my blog is dead; my friend’s web hosting plan expired
just the other day.

I just moved up to school for the semester, so things have been kinda
hectic, but things are finally starting to settle down and I’ll have
some interesting updates soon. =)

As far as the current status goes, the video driver is partially
working, pending some more work on the sprite-sized texture handling.
I’ve been very careful to organize my new driver code such that, once
that work is done, adding support for using both screens should be
trivial.

Joystick still works (well, it should), timers work now, and audio
should work soon (just SDL’s built-in audio for now, not SDL_mixer;
that’ll come after built-in SDL audio, I guess).

And now that the DS code is in trunk (as well as the new manymouse API
work), I can get to adding proper touchscreen support (rather than the
proof-of-concept I had before).

Thanks for your curiosity! =)

-Darren
