SDL 1.2 Text Input

Hello Everyone!

I’ve been working on a custom GUI for my projects, and I need to allow user text input. I’ve been having some issues getting this to work correctly.

The most common solution is to enable Unicode, cast event.key.keysym.unicode to a char, and append it to the string (char *). That’s all well and good if I want to use Unicode.

If I do not, the only solution I found was to use iscntrl() on the sym. If it isn’t a control character, cast the sym to a char and append it.

However, when running some tests on the latest Ubuntu, I was getting odd results when using the keypad. Some keys were controls, and some were special characters (they showed up as blocks).

What would be the best, most cross-platform-compliant method of converting keysyms to ANSII chars?

Would a large switch, enabling Unicode, using iscntrl(), or some other method be best?

Performance is important, but stability and correct character reporting are most important.

Thanks in advance!

  • Micah Brening

Sorry if I’m being extremely noob-ish, but when would you want to use something other than Unicode? I assume that by ANSII you mean ASCII.

On Mon, Jul 6, 2009 at 2:54 PM, Micah Brening <micah.brening at gmail.com> wrote:

SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

Jordy D <praesentium gmail.com> writes:

Sorry if I’m being extremely noob-ish, but when would you want to use
something other than Unicode? I assume that by ANSII you mean ASCII.

Yes, you are right, I meant ASCII. I think XKCD mentioned typing something you are saying and it all getting jumbled.

Why would I want to use Unicode for simple keyboard input? For non-English users? Well, let me tell you what they can do… Joking!!!

I guess I could enable Unicode while I need text input, and then disable it during the game, correct?

But is Unicode the best way to go?

But is unicode the best way to go?

yes, it is (I am also using SDL for my Win95 style GUI, btw)


Bastian Spiegel <bs tkscript.de> writes:

But is unicode the best way to go?

yes, it is (I am also using SDL for my Win95 style GUI, btw)

Alright, so enable Unicode, cast the keysym.unicode field to a char, and run with it? And this will be accurate and stable?

Or would it be better, rather than using a char *, to use something more Unicode-compliant? If so, are there any pointers on how to do this (names, tutorials, etc.)?

Or would it be better, rather than using a char *, to use something more Unicode-compliant? If so, are there any pointers on how to do this (names, tutorials, etc.)?

It’s UTF-8, which fits in a char *. There are some subtleties with strlen(), as it counts the number of bytes rather than the number of characters (UTF-8 uses “continuation bytes” to encode non-ASCII characters). Note also that the number of characters is usually not what you want from strlen() anyway, but rather how wide the string is. For example, ideographic characters, like Chinese, are double-width, and combining characters used for accents do not advance the cursor (for a “ç”, you could have a non-advancing combining cedilla, followed by a normal “c”, yielding a “ç” when displayed). See the following for more information:

http://www.cl.cam.ac.uk/~mgk25/unicode.html

In any case, with most Western scripts, your application will work
more or less correctly if you do nothing special, with some text
alignment bugs at worst.

Also, remember that a single keysym.unicode can be more than one byte; when you append it to your buffer, don’t just take the first byte: you have to do a strcpy() or whatever equivalent is appropriate for your code.

On Mon, Jul 6, 2009 at 4:04 PM, Micah Brening <micah.brening at gmail.com> wrote:



I guess I should just say to start: text input is a mess in SDL 1.2.
We’re fixing it for 1.3, but if all you care about is US English, then
it’s good enough.

Enable Unicode. Even if you don’t care about Unicode, enable it at start
up. This won’t prevent you from getting normal keypress events, so you
don’t need to toggle it on and off. Just SDL_EnableUNICODE(1) once,
shortly after SDL_Init().

Then, when you care about text input, check the “unicode” field for keydown events. If it’s not a value between 1 and 255, ignore it (which means some keystrokes for foreign languages will be silently lost, but what can you do?). Otherwise, cast it to (char) and add it to your string. You’ll have to handle things like backspace explicitly, of course.

When you care about game input (the keyboard is a 101-key gamepad, not a
text entry device), use the keysym fields, not the unicode one. Unicode
is only reliable on keydown, but the keysyms will work for key presses
and releases, so you should use them for gameplay.

Fairly uncomplicated Unicode (many European languages) will probably
work, too. The values above 128 will map to Latin1 (I think), so you can
still catch the English-looking things that have accents, etc. Above
character 255, things get complicated.

Complicated Unicode (the rest of the world) is basically a loss in SDL 1.2. Some of it works, but things that need Input Method magic are going to be broken (so entering many of the characters you’d find around Asia is probably not possible).

We have Smart People fixing this for 1.3. But for the needs you’re
describing, this should get you going.

–ryan.

Ryan C. Gordon <icculus icculus.org> writes:

Enable Unicode. Even if you don’t care about Unicode, enable it at start
up. This won’t prevent you from getting normal keypress events, so you
don’t need to toggle it on and off. Just SDL_EnableUNICODE(1) once,
shortly after SDL_Init().

I think I got it. It doesn’t look as though it’ll be as easy as I had hoped, but it gives me a good place to start.

Thanks!

The only question I have left: if I only need text input during a non-time-critical moment (a save dialog, or game setup), would it be advisable to enable it when allocating the GUI and then disable it when starting or returning to a game state? Or is that too expensive?

When you care about game input (the keyboard is a 101-key gamepad, not a
text entry device), use the keysym fields, not the unicode one. Unicode is
only reliable on keydown, but the keysyms will work for key presses and
releases, so you should use them for gameplay.

Right on. A mistake I was a part of, uh, a while ago, was to confuse “text entry” and “pressing/releasing keys”; Ryan’s advice avoids this.

Fairly uncomplicated Unicode (many European languages) will probably work,
too. The values above 128 will map to Latin1 (I think), so you can still
catch the English-looking things that have accents, etc. Above character
255, things get complicated.

Yep, values between 128 and 255 are also non-ASCII, but they share the same mapping as ISO-8859-1/Latin-1 (basically ASCII plus accents for Western scripts like French, Spanish, Portuguese, Icelandic, etc.); if that’s what you happen to be using, it’ll work fine.

Do not confuse it with Windows-1252, which is what Windows uses (ah, they love to be different in Windows-land, don’t they?).

On Mon, Jul 6, 2009 at 4:29 PM, Ryan C. Gordon wrote:


You could use wchar_t. On Windows, this is 16 bits; elsewhere it’s usually 32. Either way, it should handle any Unicode value from SDL just fine.

On Monday, 6 July 2009 16:04:13, Micah Brening wrote:
