[PATCH] Update android port to build with current head

There were some API changes that the port hadn’t been updated for.

Meanwhile, is this port alive or considered dead?

lu
--

Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

-------------- next part --------------
A non-text attachment was scrubbed…
Name: sdl-android-update.diff
Type: text/x-patch
Size: 1436 bytes
Desc: not available
URL: http://lists.libsdl.org/pipermail/sdl-libsdl.org/attachments/20101015/ec137176/attachment.bin

There were some API changes that the port hadn’t been updated for.

Meanwhile, is this port alive or considered dead? The audio subsystem is
barely sketched out, and, at least for me, the video subsystem isn’t
producing images, while other ports at least show images and output sound,
however badly.

lu
--

Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

-------------- next part --------------
A non-text attachment was scrubbed…
Name: sdl-android-update.diff
Type: text/x-patch
Size: 1437 bytes
Desc: not available
URL: http://lists.libsdl.org/pipermail/sdl-libsdl.org/attachments/20101015/dcaf5d6d/attachment.bin

http://libsdl-android.sourceforge.net/ - a more stable and tested Android
libSDL port.

Again, sorry for the shameless advertising.


I used your port already; while it is working, it does have the
following shortcomings:

  • heavily tangled (so it’s hard to retarget to other uses)
  • high overhead (my quick ffplay port gets swamped in the YUV Unlock and
    Display calls, for reasons I haven’t tracked down yet; see the sketch
    after this list)
  • the audio subsystem exists, but it is another source of severe
    slowdowns.
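For reference, this is roughly the per-frame path ffplay takes through the SDL 1.2 overlay API, which is where the time goes on this port; the overlay creation, plane pointers and sizes here are placeholders, not the actual ffplay code:

    #include <SDL.h>
    #include <string.h>

    /* bmp was created earlier with
     * SDL_CreateYUVOverlay(w, h, SDL_YV12_OVERLAY, screen); */
    static void show_frame(SDL_Overlay *bmp, const Uint8 *y, const Uint8 *u,
                           const Uint8 *v, int w, int h)
    {
        SDL_Rect rect = { 0, 0, (Uint16)w, (Uint16)h };
        int i;

        SDL_LockYUVOverlay(bmp);
        for (i = 0; i < h; i++)           /* Y plane */
            memcpy(bmp->pixels[0] + i * bmp->pitches[0], y + i * w, w);
        for (i = 0; i < h / 2; i++) {     /* V and U planes (YV12 order) */
            memcpy(bmp->pixels[1] + i * bmp->pitches[1], v + i * (w / 2), w / 2);
            memcpy(bmp->pixels[2] + i * bmp->pitches[2], u + i * (w / 2), w / 2);
        }
        SDL_UnlockYUVOverlay(bmp);          /* one of the two slow calls */
        SDL_DisplayYUVOverlay(bmp, &rect);  /* the other one */
    }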

I’m getting back to the main libsdl looking for ideas about this
situation, and while I was at it I thought it might be nice to send back at
least this minor fix.

lu


Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

Hi

I’d be interested in seeing your ffplay port. I am a major contributor to ffmpeg4iphone, and we made a lot of ffplay optimizations that you might be able to port to Android.

As for these SDL ports, I have to agree with you, and I am a little shocked that Sam would even consider the GSoC 2010 port to be anywhere near complete.

It’s barely usable; the only code that seems to work is the lesson05.c sample included, and that is not very portable.

The minimal test cases from the other projects build but fail to produce anything usable.

As for the other port, well, if you build a game that essentially clones the same logic as alienblaster with variations, then the author is correct that it works, but the code is so specific that it is not portable.

Again, it is pretty much impossible to build any of the test cases.

And porting any of the demos, from the simplest to the most complex, would be impossible.

To be useful as a reusable port, it should prove the ability to build all the SDL test cases, the graywin.c sample, and the simple sound samples. Then it would be worth looking at.


Hi

I’d be interested in seeing your ffplay port,
I am a major contributor to ffmpeg4iphone, and we made a lot
of ffplay optimizations that you might be able to port to android.

Nice =). My ffplay port is pretty much “replace lesson05.c with ffplay.c
and add the needed libs” (since that’s all you need to do).

As for these SDL ports, I have to agree with you, and I am a little shocked
that Sam would even consider the GSoC 2010 port to be anywhere near complete.

Agreed.

It’s barely usable; the only code that seems to work is the lesson05.c sample
included, and that is not very portable.

The key issue is that you somehow need to wrap your app with the Dalvik
bridge, and that makes your life not that easy (ant + ndk-build).
Additionally, Android.mk isn’t exactly easy to bend to your will.
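Concretely, the Dalvik bridge boils down to some JNI glue along these lines; the package, class and method names below are made up for illustration, and the GSoC port and libsdl-android each have their own variant of it:

    #include <jni.h>

    /* Called from the Java Activity (after System.loadLibrary() plus a
     * matching native method declaration on the Java side); this is where
     * the native side would set things up and eventually enter the app's
     * own main loop. */
    JNIEXPORT void JNICALL
    Java_org_example_sdlapp_SDLActivity_nativeInit(JNIEnv *env, jobject thiz)
    {
        (void)env;
        (void)thiz;
        /* initialize SDL, asset paths, etc., then run the app */
    }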

As for the other port, well, if you build a game that essentially clones the
same logic as alienblaster with variations, then the author is correct that it works,
but the code is so specific that it is not portable.

And from my current experience the overhead kills performance, and the
audio subsystem isn’t exactly nice, but the main fault here is partially
on the Android side.

To be useful as a reusable port, it should prove the ability
to build all the SDL test cases, the graywin.c sample, and the simple sound samples.

Agreed.

lu

Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

Unfortunately we don’t have the luxury of time, and there is some major investment wrapped up in this project. So, using the work we did with the iPhone as a starting point, I am looking at ways we can make the GSoC 2010 version work. It’s not really a bad port, it’s just very incomplete.

Our application is maybe a million lines or so of C code (best guess), and it would be a major endeavor to convert it to Java.

Sam is actually very helpful, and we have the advantage that he is familiar with our project. So there is hope.

I am wondering if using the toolchain rather than the NDK makes sense, although I’ve managed to build our classes using the NDK. Of course, getting code to compile is only half the battle.

The included sample program helps a little; at least it shows how to set up OpenGL from the Java side.

Both ports have issues, so it’s pick one and go with it, and I think I am better off with the more “official” one. But this is all very much groping through dark caves, and I could easily change my mind.

It does help to go back through the Mercurial archive; it gives you a picture of where the developers are coming from. Some of the code is very similar to what was done in the iPhone version.

If you like, we can bounce ideas off each other; if we come up with a decent patch set I am sure we can get it pushed into the source tree.

(Right, Sam? :) )


Unfortunately we don’t have the luxury of time and there is some major investment wrapped
up in this project. So using the work we did with the iPhone as a starting point I am
looking at ways we can make the GSoC 2010 version work.

From today’s investigation it seems that:

  • Android_RenderPresent seems incomplete

It’s not really a bad port, it’s just very incomplete.

Agreed…

Our application is maybe a million lines or so of C code (best guess)
and it would be a major endeavor to convert it to Java.

I’m thinking about just replacing the YUV surface with a GLES surface
and using that one directly.
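A rough sketch of what “use a GLES surface directly” could look like on GLES 1.x, assuming the frame has already been converted to RGB565; texture handling and sizes are illustrative only (GLES 1.x also wants power-of-two texture dimensions, so the real thing needs padding):

    #include <GLES/gl.h>

    static GLuint frame_tex;

    /* Allocate the streaming texture once. */
    static void frame_tex_create(int w, int h)
    {
        glGenTextures(1, &frame_tex);
        glBindTexture(GL_TEXTURE_2D, frame_tex);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                     GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
    }

    /* Upload one decoded frame (already RGB565), then draw a quad and swap. */
    static void frame_tex_update(const void *rgb565_pixels, int w, int h)
    {
        glBindTexture(GL_TEXTURE_2D, frame_tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_RGB, GL_UNSIGNED_SHORT_5_6_5, rgb565_pixels);
    }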

Sam is actually very helpful and we have the advantage he is familiar
with our project. So there is hope.

I am wondering if using the toolchain rather than the ndk makes sense,
although using the ndk I’ve managed to build our classes.

I experienced at least three different issues with the NDK:

  • It is inconsistent across the three platforms, with Linux working the best;
    Mac OS X and Windows have problems even trying to build ffmpeg with NEON
    enabled.
  • The produced ffmpeg segfaults on NEON code (I should double-check whether
    the problem is unrelated).
  • The provided headers have errors (like using asm() instead of __asm__();
    see the snippet after this list).
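If I read the (garbled) sentence right, this is the usual asm-keyword problem; a tiny illustration, assuming the issue is that plain asm is rejected under -std=c99 while the double-underscore spelling always works (the CLZ instruction and the guard macro are just for the example):

    /* Using plain `asm` in a header breaks anything built with -std=c99 or
     * -ansi, because `asm` is a GNU extension keyword; `__asm__` is always
     * accepted by GCC. */
    static inline int count_leading_zeros(int x)
    {
        int r;
    #ifdef USE_PLAIN_ASM                          /* hypothetical switch */
        asm("clz %0, %1" : "=r"(r) : "r"(x));     /* fails under -std=c99 */
    #else
        __asm__("clz %0, %1" : "=r"(r) : "r"(x)); /* portable GCC spelling */
    #endif
        return r;
    }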

Of course getting code to compile is only half the battle.

I’m already at this point (sort of)

The sample program included helps a little, at least it shows how to set
up the open gl from the java side.

From the NDK sample san-angeles, it looks like if you want to take over the whole
surface you will need even less contact with Java than I’d expect.

both ports have issues so its pick one and go with it and I think I am better off
with the more “official” one. But this is all big time groping through dark caves
and I could easily change my mind.

it does help to go back through the Mercurial archive, gives you a picture of where
the developers are coming from. Some of the code is very similar to what was done
in the iPhone version.

If you like we can bounce ideas off each other, if we come up with a decent patch
set I am sure we can get this pushed into the source tree.

Would be great =)

lu

Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

Hi

Thanks for all your helpful comments.

Where did you get the version of ffplay you are using? Most I’ve found are badly flawed. Please take a look at the ffplay at http://code.google.com/p/ffmpeg4iphone/ ; we spent a long time making it work.

I suspect you will find that the segfaults have little to do with Android (although I may be wrong, not having tested on that platform) and are more the fault of too many developers’ hands in the original ffmpeg codebase.


Hi

Thanks for all your helpful comments.

Where did you get the version of ffplay you are using.

Main FFmpeg svn or git…

most I’ve found are flawed badly. Please take a look at ffplay at
http://code.google.com/p/ffmpeg4iphone/ we spent a long time making this work.

Doing now.

I suspect you will find that the segfaults have little to do with Android
(although I may be wrong, not having tested on that platform) and are more the fault
of too many developers’ hands in the original ffmpeg codebase.

Being ffmpeg upstream myself, I’d be glad if you filed the bugs you found on
roundup.ffmpeg.org. The ARM port is one of the most tested (second only to the
x86/amd64 one), so I’m not that sure a bug slipped in there, but you never know ^^;

lu

Luca Barbato
Gentoo/linux
http://dev.gentoo.org/~lu_zero

And from my current experience the overhead kills performance, and the audio
subsystem isn’t exactly nice, but the main fault here is partially on the
Android side.

Okay, what overhead? Does audio lag, or is video slow for you? Do you have
any ideas how to make it better? Or do you just dislike the overcomplicated
build system?
If you don’t specify clearly what you dislike (or submit a bug report) you will
never get it fixed.

I’m thinking about just replacing the YUV surface with a GLES surface
and use that one directly.

A YUV surface means that SDL will be converting it to an RGB565 surface anyway,
and then filling a GLES texture with the pixel data. Don’t do that.
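To make the cost concrete: something along these lines has to run in software for every pixel of every frame before the data can even reach a GLES texture (a standard BT.601-style integer approximation, not SDL’s actual code):

    #include <stdint.h>

    static uint16_t yuv_to_rgb565(int y, int u, int v)
    {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = (298 * c + 409 * e + 128) >> 8;
        int g = (298 * c - 100 * d - 208 * e + 128) >> 8;
        int b = (298 * c + 516 * d + 128) >> 8;

        if (r < 0) r = 0; else if (r > 255) r = 255;
        if (g < 0) g = 0; else if (g > 255) g = 255;
        if (b < 0) b = 0; else if (b > 255) b = 255;

        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }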

Quote:

Sam is actually very helpful and we have the advantage he is familiar
with our project. So there is hope.

I am wondering if using the toolchain rather than the ndk makes sense,
although using the ndk I’ve managed to build our classes.

I experienced at least three different issues with the NDK:

  • It is inconsistent across the three platforms, with Linux working the best;
    Mac OS X and Windows have problems even trying to build ffmpeg with NEON
    enabled.
  • The produced ffmpeg segfaults on NEON code (I should double-check whether
    the problem is unrelated).
  • The provided headers have errors (like using asm() instead of __asm__()).

The NDK is the official tool, and I’ll stick with it rather than using any
custom toolchain. Also, there is still hope that Google will fix their NDK
over the years.

The sample program included helps a little, at least it shows how to set
up the open gl from the java side.

From the ndk sample san-angeles, if you want to take over the whole
surface you will need even less contact with java than I’d expect.

I’ve seen some code that used a hack to get eglSwapBuffers() called
directly from C code, avoiding Java altogether (maybe here:
http://github.com/drodin/TuxRider , or maybe in the Kwaak3 port, I don’t
remember).
I’ve also used that demo as a starting point.
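The hack in question is presumably along these lines: once the Java side (a GLSurfaceView or similar) has created the EGL context and made it current on the rendering thread, native code can pick up the current display and surface and present frames without calling back into Java. A minimal sketch, assuming the context is already current on this thread:

    #include <EGL/egl.h>

    static void native_swap(void)
    {
        EGLDisplay dpy  = eglGetCurrentDisplay();
        EGLSurface surf = eglGetCurrentSurface(EGL_DRAW);

        if (dpy != EGL_NO_DISPLAY && surf != EGL_NO_SURFACE)
            eglSwapBuffers(dpy, surf);   /* on failure, check eglGetError() */
    }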

Quote:

both ports have issues so it’s pick one and go with it and I think I am better off
with the more “official” one. But this is all big time groping through dark caves
and I could easily change my mind.

it does help to go back through the Mercurial archive, gives you a picture of where
the developers are coming from. Some of the code is very similar to what was done
in the iPhone version.

If you like we can bounce ideas off each other, if we come up with a decent patch
set I am sure we can get this pushed into the source tree.

Would be great =)

Well, if you manage to get my changes into SDL HG I’ll be happy; however,
my port needs major cleanup for that to happen.

I have a task to write an app; by any standards it’s a very large and complex app: streaming TV. We’ve already done ports for the iOS devices and are now focused on Android.
Sam’s input has been greatly appreciated on those projects; he’s been a great help. If we come up with a reasonable patch I am sure it will find its way into SDL. ( :) I have not forgotten the iPhone patches, Sam, you know how busy I am.)

  1. Pelya, you’ve done a lot of work and it’s appreciated, but you’ve taken a radical deviation from the GSoC 2010 port and I am not sure how to reconcile the two. I have yet to find a way to port our app to your implementation, as our app currently uses SDL 1.3’s way of doing audio and video.

  2. The book Pro Android Games has been of immense help; it shows how to initialize the OpenGL context on the Java side, then swap buffers and continue on the C side. This is not far from how I had to do it in the iPhone port.

  3. Luca, you gave me a good idea. :)

I think I can make SDL work just fine for the simple textures we use for our UI throughout the app, and the Theora/Ogg format can be supported by ffmpeg, so perhaps using ffmpeg to handle the video and audio is the best approach. Licensing should not be the issue it is on iPhone, because we can use shared libraries, and apps like RockPlayer have proved the performance part of the equation.

The project I am working on is closed and proprietary, but the SDL parts are open and intended to be shared; please let me know if you would like me to keep you abreast of our progress in this forum.


Ok, now I have a saner fix baking. SDL’s GLES renderer works, aside from a
BGR vs RGB issue, so what’s enough to get ffplay working is:

  • Use the GLES renderer
  • Force 16bit as video format
  • Make it use RGB (GLES bug?)

What’s missing then is fixing the orientation, but that’s something that can
be addressed later.
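Spelled out with today’s SDL2 names (the 1.3 tree at the time named some of these differently, so treat this purely as a sketch of the three points above): pick the GLES render driver and stream 16-bit RGB565 frames instead of using a YUV overlay.

    #include <SDL.h>

    static int init_video(SDL_Window **win, SDL_Renderer **ren,
                          SDL_Texture **tex, int w, int h)
    {
        /* 1. ask for the GLES renderer */
        SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengles");

        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return -1;

        *win = SDL_CreateWindow("ffplay", 0, 0, w, h, SDL_WINDOW_FULLSCREEN);
        *ren = SDL_CreateRenderer(*win, -1, 0);
        *tex = NULL;
        if (*ren)
            /* 2. 16-bit and 3. RGB: a streaming RGB565 texture */
            *tex = SDL_CreateTexture(*ren, SDL_PIXELFORMAT_RGB565,
                                     SDL_TEXTUREACCESS_STREAMING, w, h);
        return (*win && *ren && *tex) ? 0 : -1;
    }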

Patchset following soon

lu


Okay, what overhead?

SDL_UnlockYUVOverlay takes ages, same for SDL_DisplayYUVOverlay

Does audio lag or video is slow for you?

The audio subsystem triggers the “CPU pegged” warning (so it goes silent
for 2 seconds and then triggers the message again).

Do you have any ideas how to make it better?

For the audio, probably using the same structure you have for ALSA might
help; putting a crude sleep on the Java side to avoid having the buffer
overrun helps.

Or you don’t like overcomplicated build system?

The build system is neat. I’ll ask you for help, since I’d like to have
an Android.mk that calls configure with the Android cflags and compiler
definition for ffmpeg.

If you won’t specify clearly what you dislike (or submit bugreport) you will
never get it fixed.

Sorry for not having stated the issues with your port; the whole thread
started with a minor patch to make the official half-port work.

lu

On 10/19/10 10:47 AM, Sergiy Pylypenko wrote:

Okay, what overhead?

SDL_UnlockYUVOverlay takes ages, same for SDL_DisplayYUVOverlay

Yes, GLES does not support YUV surfaces as far as I know, so SDL has to
convert the pixels on the fly in software.

Does audio lag or video is slow for you?

The audio subsystem triggers the cpu pegged warning (so it get silent for
2s and then triggers again the message).

What device are you using? I’ve seen the “CPU pegged” stuff mainly on the ADP1 with
Android 1.6. Also, the CPU might be loaded with graphics work so it cannot run
the audio thread in time; however, in Android 2.0+ they’ve set the correct priority
for the audio thread, so it should work okay even under load. You may notice that
in Audio.java, in initAudioThread(), I’m setting the highest thread priority, yet
it fails on the ADP1 and requires the user to select a huge audio buffer so the audio
won’t become choppy.

Do you have any ideas how to make it better?

For the audio probably using the same structure you have from alsa might
help, putting a crude sleep in the java side to avoid to have the buffer
overrun helps

I made some modifications to that last week: now the user can select an even
smaller audio buffer, at least on the SDL side; Java will get a bigger internal
buffer and fill it up anyway. I’m also planning to make a hack on the Java
side to watch the AudioTrack.getPlaybackHeadPosition() value
(http://developer.android.com/reference/android/media/AudioTrack.html#getPlaybackHeadPosition())
and forcefully sleep for 10 ms if it’s too big, so the Java internal
buffer will play out whatever it has.

I fear ALSA is not an option, since you can access the hardware only
through Java (or you need a rooted phone, certainly not an option if you want
to distribute the app through the Market). If you find a working implementation
that uses ALSA directly, bypassing the Java layer, I’ll try to implement it, with
some fallback-to-Java mechanism.

Or you don’t like overcomplicated build system?

The build system is neat, I’ll ask you for help since I’d like to have an
Android.mk calling configure with the android cflags and compiler definition
for ffmpeg

Thanks.
There is a script, launchConfigure.sh, which will help you generate config.h
at least; however, the chances that you will fail to link are high, especially
if you’re using ranlib.
Please look into the lbreakout2 application and the AndroidBuild.sh script inside it
to see how to use old-school configure/make inside my build system (lbreakout2
itself compiles but does not work, because I started adding HW acceleration and
then dropped it; however, you may just follow the instructions in readme.txt and
use the unmodified lbreakout2 sources - it works okay, but the FPS is low. Also,
you need to uncomment lines 30-40 inside launchConfigure.sh to rename the
libraries to what the configure script expects).

On Tue, Oct 15, 2010 at 7:03 PM, Luca Barbato <lu_zero at gentoo.org> wrote:
On 10/19/10 10:47 AM, Sergiy Pylypenko wrote:

Okay, what overhead?

SDL_UnlockYUVOverlay takes ages, same for SDL_DisplayYUVOverlay

Yes, GLES does not support YUV surfaces as far as I know so SDL have to
convert pixels on the fly in software.

It seems to be more related to the Lock/UnlockTexture calls than to the yuv2rgb16
conversion.

Hacking SDL to use the GLES renderer made the thing work in a more or
less decent way.

What device are you using?

Reproduced on a Nexus One and a Galaxy S running 2.1 and 2.2…

I’ve seen CPU pegged stuff mainly on ADP1 with
Android 1.6. Also CPU might be loaded with graphical stuff so it cannot run
audio thread in time, however in Android 2.0+ they’ve set correct priority
for audio thread so it should work okay even under load. You may notice that
in Audio.java in initAudioThread() I’m setting highest thread priority, yet
it fails on ADP1 and requires user to select huge audio buffer so audio
won’t become choppy.

Putting a sleep made it somehow work.

I fear Alsa is not an option, since you can have access hardware only
through Java (or you need rooted phone, certainly not an option if you want
to distribute app through Market). If you’ll find working implementation
that uses alsa directly bypassing Java layer I’ll try to implement it, with
some fallback-to-Java mechanism.

I meant the idea of filling the buffer little by little instead of a
single large write.
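That is, something like the following on the native side, where audio_write_chunk() is a hypothetical stand-in for whatever actually pushes samples out (here that would be AudioTrack reached through JNI): write small chunks and yield between them instead of issuing one big blocking write.

    #include <stddef.h>
    #include <stdint.h>
    #include <unistd.h>

    /* Hypothetical backend call that outputs `count` 16-bit samples. */
    extern int audio_write_chunk(const int16_t *samples, size_t count);

    static void audio_play_buffer(const int16_t *buf, size_t total_samples)
    {
        const size_t chunk = 512;          /* small chunk, in samples */
        size_t done = 0;

        while (done < total_samples) {
            size_t n = total_samples - done;
            if (n > chunk)
                n = chunk;
            if (audio_write_chunk(buf + done, n) < 0)
                break;                     /* device error, give up */
            done += n;
            usleep(1000);                  /* crude pacing, ~1 ms */
        }
    }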

There is script launchConfigure.sh which will help you to generate config.h
at least, however the chances that you fail to link it are high, especially
if you’re using ranlib.

I’ll have a look, thank you.

lu

On 10/19/10 6:54 PM, Sergiy Pylypenko wrote:

On Tue, Oct 19, 2010 at 7:03 PM, Luca Barbato wrote:

On 10/19/10 10:47 AM, Sergiy Pylypenko wrote:

My idea was to get the cflags and compiler from the Android.mk directly.
I think it’s feasible even if their names are nonstandard.

On 10/19/10 7:04 PM, Luca Barbato wrote:

There is script launchConfigure.sh which will help you to generate config.h
at least, however the chances that you fail to link it are high, especially
if you’re using ranlib.

I’ll have a look, thank you.


My idea was to get the cflags and compiler from the Android.mk directly. I
think it’s feasible even if their names are nonstandard.

That’s what I did to create launchConfigure.sh. If you come up with a
shell script or some other automated solution that launches ndk-build, or
parses all the Android.mk files around and takes the compiler flags out of them, I’ll
put it into my Git instead of launchConfigure.sh; currently
launchConfigure.sh is the best way I came up with (but when Google
releases another NDK I’ll have to update it).