I found that events weren’t being generated for the simple directional swipes on the Apple TV touch remote.
While it’s possible to use the touch press/release/motion events to detect simple swipes of this nature (and I had this working well enough) it struck me that this was re-inventing the wheel.
As such, I have written a patch that recognises the proper swipe up/down/left/right events issued in UIKit and passes these on as directional key presses. This also means that the touch remote operates the same way as the old-style IR remote in this respect.
Code:
diff -r 007dfe83abf8 src/video/uikit/SDL_uikitview.m
--- a/src/video/uikit/SDL_uikitview.m	Wed Oct 19 20:50:33 2016 -0700
+++ b/src/video/uikit/SDL_uikitview.m	Sun Dec 18 14:57:42 2016 +0300
@@ -48,7 +48,20 @@
 #if !TARGET_OS_TV
     self.multipleTouchEnabled = YES;
 #endif
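The patch body is not reproduced in full above. Purely as an illustrative sketch of the approach described (not the actual patch; the method names here are hypothetical, and SDL_SendKeyboardKey() is SDL's internal keyboard-event helper), registering UIKit's swipe recognisers and forwarding them as arrow-key presses might look something like this:

```objc
#if TARGET_OS_TV
/* Sketch only: register one recogniser per direction so that UIKit's
 * built-in swipe detection does the work instead of hand-rolled
 * press/release/motion tracking. */
- (void)addSwipeGestureRecognizers
{
    const UISwipeGestureRecognizerDirection directions[] = {
        UISwipeGestureRecognizerDirectionUp,
        UISwipeGestureRecognizerDirectionDown,
        UISwipeGestureRecognizerDirectionLeft,
        UISwipeGestureRecognizerDirectionRight,
    };
    for (size_t i = 0; i < 4; i++) {
        UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc]
            initWithTarget:self action:@selector(handleSwipe:)];
        swipe.direction = directions[i];
        [self addGestureRecognizer:swipe];
    }
}

/* Forward a recognised swipe as a press/release of the matching arrow
 * key, mirroring what the old IR remote's direction buttons generate. */
- (void)handleSwipe:(UISwipeGestureRecognizer *)gesture
{
    SDL_Scancode scancode;
    switch (gesture.direction) {
    case UISwipeGestureRecognizerDirectionUp:    scancode = SDL_SCANCODE_UP;    break;
    case UISwipeGestureRecognizerDirectionDown:  scancode = SDL_SCANCODE_DOWN;  break;
    case UISwipeGestureRecognizerDirectionLeft:  scancode = SDL_SCANCODE_LEFT;  break;
    default:                                     scancode = SDL_SCANCODE_RIGHT; break;
    }
    SDL_SendKeyboardKey(SDL_PRESSED, scancode);
    SDL_SendKeyboardKey(SDL_RELEASED, scancode);
}
#endif /* TARGET_OS_TV */
```

Each recogniser fires once per discrete swipe, which is why a single press/release pair per gesture is enough to drive menu navigation.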
I just tested on the tvOS Simulator and the key press events work for me with the latest SDL code from hg.

> On Dec 18, 2016, at 10:45 AM, Alex Szpakowski <@Alex_Szpakowski> wrote:
The “tap” actions map pretty cleanly to button presses, in terms of what their behaviour describes. SDL doesn’t really have analogous user-facing APIs that describe OS-handled swipe gestures in the same manner; however, you can access the Remote’s touchpad as joystick axes in order to get a more granular description of what the user is doing with it.

> On Dec 18, 2016, at 11:55 AM, oviano wrote:
OK, I’ll try the simulator later, but you’re testing the swipe gestures, correct?
The up/down/left/right swipe gestures are the primary way to navigate button/menu systems on the Apple TV, so assuming some SDL apps will have button or menu systems that require the user to move from one control to another, it’s far better to hook into these gestures.
Otherwise you are expecting users to re-implement them themselves, and inevitably there will be subtle differences in “feel” compared to the official ones, leading to a substandard user experience.
Given that key presses can’t represent swipe strength, which is one of the main purposes of a swipe, and that the direction key presses are already used by the tap gestures, how do you propose swipes should be exposed in SDL’s APIs?

> On Dec 18, 2016, at 12:32 PM, oviano wrote:
> The up/down/left/right swipe gestures are the primary way to navigate button/menu systems on the Apple TV, so assuming some SDL apps will have button or menu systems that require the user to move from one control to another, it’s far better to hook into these gestures.
> Otherwise you are expecting users to re-implement them themselves, and inevitably there will be subtle differences in “feel” compared to the official ones, leading to a substandard user experience.
Well, the swipe gesture I’m talking about hooking into, and which my patch uses, is just the simple standard swipe with direction only; there is no concept of strength.
So it maps pretty easily to the arrow key presses.
It’s funny: I actually had no idea (despite owning a touch-based Apple TV since they became available) that you could tap to produce the direction “buttons”. I’d always swiped, which is why this felt like such a big omission; I simply had no way to navigate within the UI of my application (a video player).
Thanks for making the change - I haven’t tested your slight variation on my code yet but I will at some point and let you know if I encounter any problems.