Handling input with unicode does not work

Hi, I am having trouble accepting input from the user. This is the handle function I call from the main loop:

Code:
void UserInput::handle_input()
{

//If a key was pressed 
if(g_event.type == SDL_KEYDOWN ) 
{ 
	//Keep a copy of the current version of the string 
	std::string temp = str; 
	//If the string is less than the maximum size 
	if( str.length() <= 16 ) 
	{
		//If the key is a space 
		if( g_event.key.keysym.unicode == (Uint16)' ' ) 
		{ 
			//Append the character 
			str += (char)g_event.key.keysym.unicode; 
		}
		//If the key is a number 
		else if( ( g_event.key.keysym.unicode >= (Uint16)'0' ) && ( g_event.key.keysym.unicode <= (Uint16)'9' ) ) 
		{ 
			//Append the character 
			str += (char)g_event.key.keysym.unicode; 
		} 
		//If the key is an uppercase letter 
		else if( ( g_event.key.keysym.unicode >= (Uint16)'A' ) && ( g_event.key.keysym.unicode <= (Uint16)'Z' ) ) 
		{ 
			//Append the character 
			str += (char)g_event.key.keysym.unicode; 
		} 
		//If the key is a lowercase letter 
		else if( ( g_event.key.keysym.unicode >= (Uint16)'a' ) && ( g_event.key.keysym.unicode <= (Uint16)'z' ) ) 
		{ 
			//Append the character 
			str += (char)g_event.key.keysym.unicode; 
		} 
	}
	//If backspace was pressed and the string isn't blank 
	if( ( g_event.key.keysym.sym == SDLK_BACKSPACE ) && ( str.length() != 0 ) ) 
	{ 
		//Remove a character from the end 
		str.erase( str.length() - 1 ); 
	}
	//If the string was changed 
	if( str != temp ) 
	{ 
		//Free the old surface 
		SDL_FreeSurface( text ); 
		//Render a new text surface 
		text = TTF_RenderText_Solid( font, str.c_str(), textColor ); 
	} 
} 

}

While debugging, I found that when I press a key, it DOES enter that block of code. But if I set a breakpoint inside any of the if statements that test the unicode value, the condition never seems to be true. I took this straight off the tutorial from lazyfoo.net. Thanks for the help.
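
One thing worth checking first: SDL 1.2 leaves keysym.unicode at 0 for every key unless unicode translation has been enabled, which produces exactly this symptom (the KEYDOWN block is entered, but every unicode comparison fails). A minimal sketch of the one-time setup the tutorial relies on, using the stock SDL 1.2 calls:

Code:
//One-time setup during initialization, before the event loop.
//Without this, keysym.unicode stays 0 in SDL 1.2.
SDL_EnableUNICODE( SDL_ENABLE );

//To check the current state without changing it:
if( SDL_EnableUNICODE( SDL_QUERY ) != SDL_ENABLE )
{
	//Unicode translation is off; the handler above can never match
}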

Hey, Akill! Getting homework help?

As for helping you:

You aren't using the unicode value for the string (you're casting it to a char), and you aren't even accepting unicode input (you're only accepting spaces and alphanumeric characters, which are already available in the ASCII character set), so why bother handling unicode input to begin with?

------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/

Akill10 wrote:

Hey, ye, I needed a bit of help :P

Basically, I followed these instructions and took them to be true:

Because the SDLKey definitions don't match up with their ASCII/Unicode values, we enable unicode so the "unicode" member of the Keysym structure matches the unicode value of the character pressed. Enabling unicode also automatically handles shift and caps lock when you want capital letters and symbols.

If you don’t know what unicode is, it’s just an extension of ASCII. Instead of being 8bit, it’s 16bit so it can hold all the international characters.

Here, if the key pressed has the unicode value of the space character, we append it to the string. Since a standard string only uses 8bit ASCII characters, we have to convert it to a char when appending it.

  1. SDLKey definitions match up to the system scancode, I believe, which may indeed be different from the ASCII value. So when checking for them, you will need to compare with the SDLKey value, not the ASCII value (sketched below).
  2. You then need to append the ASCII value to the string.
  3. There's no need for unicode here. LazyFoo was a bit misguided when he wrote that, methinks.

------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/
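
Point 1, sketched against the SDL 1.2 headers: the values of SDLK_SPACE, SDLK_0..SDLK_9, and SDLK_a..SDLK_z happen to coincide with the ASCII codes ' ', '0'..'9', and 'a'..'z', so the cast is safe. Note that this loses the shift/caps handling unicode translation gives you, so letters arrive lowercase.

Code:
//Test the SDLKey value directly; no unicode translation needed
SDLKey sym = g_event.key.keysym.sym;
if( ( sym == SDLK_SPACE ) ||
    ( sym >= SDLK_0 && sym <= SDLK_9 ) ||
    ( sym >= SDLK_a && sym <= SDLK_z ) )
{
	//SDLK_a..SDLK_z equal 'a'..'z', so the cast yields the character
	str += (char)sym;
}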

I tried creating a new project just to test this. It works perfectly there; any ideas what I can do?

Hey, ye, I needed a bit of help :P

Basically, I followed these instructions and took them to be true:

Because the SDLKey definitions don't match up with their ASCII/Unicode values, we enable unicode so the "unicode" member of the Keysym structure matches the unicode value of the character pressed. Enabling unicode also automatically handles shift and caps lock when you want capital letters and symbols.

If you don’t know what unicode is, it’s just an extension of ASCII. Instead of being 8bit, it’s 16bit so it can hold all the international characters.

Here, if the key pressed has the unicode value of the space character, we append it to the string. Since a standard string only uses 8bit ASCII characters, we have to convert it to a char when appending it.

I don’t see a problem with using the unicode field here, especially since
you want normal text input with caps and all that. You can simplify your
testing a lot, though. The clue is that you’ve duplicated essentially the
same code a few times.

Try something like:
Uint16 key = g_event.key.keysym.unicode;
if( 32 <= key && key <= 126 )
    str += (char)key;

You might replace 32 with ' ' (space) and 126 with '~' (tilde). These characters bound the printable ASCII range.

See ya,
Jonny D
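
Folding that range test into the original handler gives something like the sketch below, keeping the poster's names (g_event, str, text, font, textColor) and the 16-character cap:

Code:
void UserInput::handle_input()
{
	if( g_event.type == SDL_KEYDOWN )
	{
		//Keep a copy of the current version of the string
		std::string temp = str;
		Uint16 key = g_event.key.keysym.unicode;

		//One range test covers space, digits, and letters (caps included)
		if( ( str.length() <= 16 ) && ( 32 <= key ) && ( key <= 126 ) )
		{
			str += (char)key;
		}
		//If backspace was pressed and the string isn't blank
		else if( ( g_event.key.keysym.sym == SDLK_BACKSPACE ) && ( str.length() != 0 ) )
		{
			str.erase( str.length() - 1 );
		}

		//Re-render only if the string changed
		if( str != temp )
		{
			SDL_FreeSurface( text );
			text = TTF_RenderText_Solid( font, str.c_str(), textColor );
		}
	}
}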


In my experience, the unicode field is no longer reliable in SDL 1.3.
You should listen for SDL_TEXTINPUT events instead.

On 4/27/2011 07:30, Jonathan Dearborn wrote:

I don’t see a problem with using the unicode field here, especially since
you want normal text input with caps and all that. You can simplify your
testing a lot, though. The clue is that you’ve duplicated essentially the
same code a few times.


Rainer Deyke - rainerd at eldwood.com
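
A minimal sketch of the SDL_TEXTINPUT approach, assuming SDL 1.3/2.0 where SDL_StartTextInput() exists and text events deliver NUL-terminated UTF-8 in event.text.text:

Code:
//During init: ask SDL to start delivering text-input events
SDL_StartTextInput();

//In the event loop:
if( ( g_event.type == SDL_TEXTINPUT ) && ( str.length() <= 16 ) )
{
	//UTF-8 text, with shift/caps/IME composition already applied
	str += g_event.text.text;
}
else if( ( g_event.type == SDL_KEYDOWN ) &&
         ( g_event.key.keysym.sym == SDLK_BACKSPACE ) &&
         ( str.length() != 0 ) )
{
	//Note: this removes one byte, which is only correct for ASCII
	str.erase( str.length() - 1 );
}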

Jonny D wrote:

I don't see a problem with using the unicode field here, especially since you want normal text input with caps and all that. You can simplify your testing a lot, though. The clue is that you've duplicated essentially the same code a few times.

Try something like:
Uint16 key = g_event.key.keysym.unicode;
if( 32 <= key && key <= 126 )
    str += (char)key;

You might replace 32 with ' ' (space) and 126 with '~' (tilde). These characters bound the printable ASCII range.

See ya,
Jonny D

Oh, there's no problem with it, it's just that it's completely unnecessary. He filters out all non-ASCII and also only stores ASCII.

------------------------
EM3 Nathaniel Fries, U.S. Navy

http://natefries.net/