
Renamed the Android package and added instructions on how to replace it with an application-specific package name.
Updated documentation.

Lasse Öörni · 13 years ago · commit 02738d055e

+ 1 - 1
Android/AndroidManifest.xml

@@ -1,6 +1,6 @@
 <?xml version="1.0" encoding="utf-8"?>
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"
-    package="org.libsdl.app"
+    package="com.googlecode.urho3d"
     android:versionCode="1"
     android:versionName="1.0">
     <uses-permission android:name="android.permission.INTERNET" />

+ 1 - 6
Android/res/layout/main.xml

@@ -1,13 +1,8 @@
 <?xml version="1.0" encoding="utf-8"?>
 <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
-    android:orientation="vertical"
+    android:orientation="horizontal"
     android:layout_width="fill_parent"
     android:layout_height="fill_parent"
     >
-<TextView  
-    android:layout_width="fill_parent" 
-    android:layout_height="wrap_content" 
-    android:text="Hello World, SDLActivity"
-    />
 </LinearLayout>
 

+ 1 - 1
Android/src/org/libsdl/app/SDLActivity.java → Android/src/com/googlecode/urho3d/SDLActivity.java

@@ -1,6 +1,6 @@
 // Modified by Lasse Oorni for Urho3D
 
-package org.libsdl.app;
+package com.googlecode.urho3d;
 
 import javax.microedition.khronos.egl.EGL10;
 import javax.microedition.khronos.egl.EGLConfig;

+ 26 - 12
Docs/GettingStarted.dox

@@ -14,6 +14,16 @@ Although all required third-party libraries are included as source code, there a
 
 - For Android, the Android SDK and Android NDK need to be installed.
 
+To run Urho3D, the minimum system requirements are:
+
+- Windows: CPU with SSE instructions support, Windows XP or newer, DirectX 9.0c, GPU with %Shader %Model 2 support (%Shader %Model 3 recommended.)
+
+- Linux & Mac OS X: GPU with OpenGL 2.0 support and EXT_framebuffer_object, EXT_packed_depth_stencil and EXT_texture_filter_anisotropic extensions.
+
+- Android: OS version 2.2 or newer, OpenGL ES 2 capable GPU.
+
+For Windows, SSE and Windows XP requirements can be eliminated by disabling SSE, crash dump support and file watcher from the root CMakeLists.txt. Windows 2000 will then be the absolute minimum.
+
 \section Building_Desktop Desktop build process
 
 Urho3D uses CMake (http://www.cmake.org) to build. The process has two steps:
@@ -30,23 +40,27 @@ To run from the Visual Studio debugger, set the Urho3D project as the startup pr
 
 To actually make Urho3D.exe do something useful, it must be supplied with the name of the script file it should load and run. You can try for example the following arguments: Scripts/TestScene.as -w
 
-
 \section Building_Android Android build process
 
-First build Urho3D for desktop OpenGL to make sure the GLSL shaders are generated. Then copy Bin/Data and Bin/CoreData directories to the Android/assets directory. Finally execute the following commands in the Android directory:
+First build Urho3D for desktop OpenGL to make sure the GLSL shaders are generated. For Windows this requires forcing OpenGL mode from the root CMakeLists.txt. Then copy Bin/Data and Bin/CoreData directories to the Android/assets directory. Finally execute the following commands in the Android directory:
 
 - android update project -p . (only needed on the first time)
 - ndk-build
-- ant debug (or ant release, but then you will have to sign the APK)
+- ant debug
 
-The APK should now have been generated to the Android/bin directory, from where you can install it on a device or an emulator.
+Note that ndk-build builds Urho3D twice, once without hardware floating point instructions, and once with them. After the commands finish successfully, the APK should have been generated to the Android/bin directory, from where it can be installed on a device or an emulator.
 
-Note that ndk-build builds Urho3D twice, once without hardware floating point instructions, and once with them.
+For a release build, use the "ant release" command instead of "ant debug" and follow the Android SDK instructions on how to sign your APK properly.
+
+By default the Android package for Urho3D is com.googlecode.urho3d. For a real application you must replace this with your own package name. Unfortunately the name has to be replaced in several files:
+
+- Android/AndroidManifest.xml
+- Android/src/com/googlecode/urho3d/SDLActivity.java (rename directories also)
+- ThirdParty/SDL/include/SDL_config_android.h, look for the NATIVE_FUNCTION macro
 
 
-\page Running Running Urho3D
 
-For Windows & Direct3D9 mode, Urho3D requires Windows XP or newer, DirectX 9.0c, and a display adapter with SM2.0 support. SM3.0 is highly recommended. For OpenGL mode, an OpenGL 2.0 capable display adapter with EXT_framebuffer_object, EXT_packed_depth_stencil and EXT_texture_filter_anisotropic extensions is required. On Android OS version 2.2 or newer and OpenGL ES 2 support is required.
+\page Running Running Urho3D
 
 The main executable Urho3D.exe in the Bin directory contains all the engine runtime functionality. However, it does not contain any inbuilt logic or application, and therefore must be supplied with the name of the application script file it should run:
 
@@ -54,7 +68,7 @@ Urho3D.exe <scriptfilename> [options]
 
 The scripting language used is AngelScript (http://www.angelcode.com/angelscript); the script files have .as extension and need to be placed under either the Data or CoreData subdirectories so that Urho3D.exe can find them. An application script is required to have the function void Start(), which will be executed before starting the engine main loop. It is this function's responsibility to initialize the application and to hook up to any necessary \ref Events "events", such as the update that happens every frame.
 
-On Android there are no command line options, so running the NinjaSnowWar example is hardcoded. This can be changed from Urho3D.cpp.
+On Android there are no command line options, so running the NinjaSnowWar example is hardcoded. This can be changed from the file Urho3D/Urho3D.cpp.
 
 Currently, five example application scripts exist:
 
@@ -101,6 +115,10 @@ F4          Toggle octree debug geometry
 
 NinjaSnowWar also supports client/server multiplayer. To start the server, run the command NinjaSnowWar.bat server (-headless switch can optionally given so that the server will not open a graphics window.) To connect to a server, specify the server address on the command line, for example NinjaSnowWar.bat 127.0.0.1
 
+\section Running_Editor Editor
+
+%Scene editor application written in script. To start, run Editor.bat, or use the command Urho3D.exe Scripts/Editor.as
+
 \section Running_Chat Chat
 
 Simple client-server chat test application. To start, run Chat.bat or ChatServer.bat in the Bin directory, or use the command Urho3D.exe Scripts/Chat.as
@@ -110,10 +128,6 @@ either press return or click "Send" to send them. Press ESC to exit.
 
 To connect automatically, the server address can also be given on the command line, for example Chat.bat 127.0.0.1
 
-\section Running_Editor Editor
-
-%Scene editor application written in script. To start, run Editor.bat, or use the command Urho3D.exe Scripts/Editor.as
-
 For details on how to use the editor, see \ref EditorInstructions "Editor instructions."
 
 \section Running_LightTest LightTest

+ 33 - 13
Docs/Reference.dox

@@ -377,8 +377,8 @@ Screen resolution, fullscreen/windowed, vertical sync and hardware multisampling
 
 When setting the initial screen mode, Graphics does a few checks:
 
-- For Direct3D9, the supported shader model is checked. 2.0 is minimum, but 3.0 will be used if available. %Shader model 2.0 can be forced by calling \ref Graphics::SetForceSM2 "SetForceSM2()".
-- For OpenGL, version 2.0 with EXT_framebuffer_object, EXT_packed_depth_stencil, EXT_texture_compression_s3tc and EXT_texture_filter_anisotropic extensions is checked for.
+- For Direct3D9, the supported shader model is checked. 2 is minimum, but 3 will be used if available. SM2 can be forced by calling \ref Graphics::SetForceSM2 "SetForceSM2()".
+- For OpenGL, version 2.0 with EXT_framebuffer_object, EXT_packed_depth_stencil and EXT_texture_filter_anisotropic extensions is checked for.
 - Are hardware shadow maps supported? Both ATI & NVIDIA style shadow maps can be used. If neither are available, no shadows will be rendered.
 - Are light pre-pass and deferred rendering modes supported? These require sufficient multiple rendertarget support, and either R32F texture format or readable hardware depth.
 
@@ -432,12 +432,20 @@ The following techniques will be used to reduce the amount of CPU and GPU work w
 
 - Software rasterized occlusion: after the octree has been queried for visible objects, the objects that are marked as occluders are rendered on the CPU to a small hierarchical-depth buffer, and it will be used to test the non-occluders for visibility. Use \ref Renderer::SetMaxOccluderTriangles "SetMaxOccluderTriangles()" and \ref Renderer::SetOccluderSizeThreshold "SetOccluderSizeThreshold()" to configure the occlusion rendering.
 
-- Hardware instancing (Direct3D9 SM3.0 only): rendering operations with the same geometry, material and light will be grouped together and performed as one draw call. Objects with a large amount of triangles will not be rendered as instanced, as that could actually be detrimental to performance. Use \ref Renderer::SetMaxInstanceTriangles "SetMaxInstanceTriangles()" to set the threshold. Note that even when instancing is not available, or the triangle count of objects is too large, they still benefit from the grouping, as render state only needs to be set once before rendering each group, reducing the CPU cost.
+- Hardware instancing (Direct3D9 SM3 only): rendering operations with the same geometry, material and light will be grouped together and performed as one draw call. Objects with a large amount of triangles will not be rendered as instanced, as that could actually be detrimental to performance. Use \ref Renderer::SetMaxInstanceTriangles "SetMaxInstanceTriangles()" to set the threshold. Note that even when instancing is not available, or the triangle count of objects is too large, they still benefit from the grouping, as render state only needs to be set once before rendering each group, reducing the CPU cost.
 
 - %Light stencil masking: in forward rendering, before objects lit by a spot or point light are re-rendered additively, the light's bounding shape is rendered to the stencil buffer to ensure pixels outside the light range are not processed.
 
 Note that many more optimization opportunities are possible at the content level, for example using geometry & material LOD, grouping many static objects into one object for less draw calls, minimizing the amount of subgeometries (submeshes) per object for less draw calls, using texture atlases to avoid render state changes, using compressed (and smaller) textures, and setting maximum draw distances for objects, lights and shadows.
 
+\section Rendering_GPUResourceLoss Handling GPU resource loss
+
+On Direct3D9 and Android OpenGL ES 2 it is possible to lose the rendering context (and therefore GPU resources) due to the application window being minimized to the background. Also, to work around possible GPU driver bugs the desktop OpenGL context will be voluntarily destroyed and recreated when changing screen mode or toggling between fullscreen and windowed. Therefore, on all graphics APIs one must be prepared for losing GPU resources.
+
+Textures that have been loaded from a file, as well as vertex & index buffers that have shadowing enabled will restore their contents automatically, the rest have to be restored manually. On Direct3D9 non-dynamic (managed) textures and buffers will never be lost, as the runtime automatically backs them up to system memory.
+
+See \ref VertexBuffer::IsDataLost "IsDataLost()" function in VertexBuffer, IndexBuffer, Texture2D and TextureCube classes for detecting data loss. Inbuilt classes such as Model, BillboardSet and Font already handle data loss for their internal GPU resources, so checking for it is only necessary for custom buffers and textures. Watch out especially for trying to render with an index buffer that has uninitialized data after a loss, as this can cause a crash inside the GPU driver due to referencing non-existent (garbage) vertices.
+
 \section Rendering_Further Further details
 
 See also \ref Materials "Materials", \ref Lights "Lights and shadows", \ref SkeletalAnimation "Skeletal animation", \ref Particles "Particle systems", \ref Postprocessing "Post-processing", \ref Zones "Zones", and \ref AuxiliaryViews "Auxiliary views".
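As a minimal sketch of the manual restore described in the new resource loss section above, assuming a custom, non-shadowed vertex buffer whose contents the application keeps in its own memory (customBuffer_, vertexCount_ and vertexData_ are hypothetical application members, not engine API):

    // Check for GPU data loss and re-upload the application-side copy.
    if (customBuffer_->IsDataLost())
    {
        customBuffer_->SetSize(vertexCount_, MASK_POSITION | MASK_NORMAL);
        customBuffer_->SetData(vertexData_);
    }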
@@ -449,7 +457,7 @@ See \ref APIDifferences "Differences between Direct3D9 and OpenGL" for what to w
 
 \page RenderingModes Rendering modes
 
-Urho3D implements both forward, light pre-pass and deferred rendering modes. Where they differ is how per-pixel lighting is calculated for opaque objects; transparent objects always use forward rendering.
+Urho3D implements both forward, light pre-pass and deferred rendering modes. Where they differ is how per-pixel lighting is calculated for opaque objects; transparent objects always use forward rendering. Note that on OpenGL ES 2 only forward rendering is available.
 
 \section RenderingModes_Forward Forward rendering
 
@@ -496,6 +504,12 @@ Finally note that due to OpenGL framebuffer object limitations an extra framebuf
 
 \page APIDifferences Differences between Direct3D9 and OpenGL
 
+These differences need to be observed when using the low-level rendering functionality directly. The high-level rendering architecture, including the Renderer and UI subsystems and the Drawable subclasses already handle most of them transparently to the user.
+
+- The post-projection depth range is (0,1) for Direct3D9 and (-1,1) for OpenGL. The Camera can be queried either for an API-specific or API-independent (Direct3D9 convention) projection matrix.
+
+- To render with 1:1 texel-to-pixel mapping, on Direct3D9 UV coordinates have to be shifted a half-pixel to the right and down, or alternatively vertex positions can be shifted a half-pixel left and up.
+
 - On Direct3D9 the depth-stencil surface can be equal or larger in size than the color rendertarget. On OpenGL the sizes must always match. Furthermore, OpenGL can not use the backbuffer depth-stencil surface when rendering to a texture. To overcome these limitations, Graphics will create correctly sized depth-stencil surfaces on demand whenever a texture is set as a color rendertarget, and a null depth-stencil is specified.
 
 - On Direct3D9 the viewport will be reset to full size when the first color rendertarget is changed. On OpenGL this does not happen. To ensure correct operation on both APIs, always use this sequence: first set the rendertargets, then the depth-stencil surface and finally the viewport.
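To make the half-pixel rule added above concrete, a small sketch (the variable names are assumed for illustration, not engine API):

    // Direct3D9: shift UVs half a texel right and down for 1:1 texel-to-pixel mapping.
    // textureWidth and textureHeight are the rendertarget dimensions in pixels.
    float u = baseU + 0.5f / textureWidth;
    float v = baseV + 0.5f / textureHeight;
    // OpenGL needs no such offset, so the adjustment should only be applied on Direct3D9.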
@@ -506,14 +520,17 @@ Finally note that due to OpenGL framebuffer object limitations an extra framebuf
 
 - %Shader resources are stored in different locations depending on the API: CoreData/Shaders/SM2 or CoreData/Shaders/SM3 for Direct3D9, and CoreData/Shaders/GLSL for OpenGL.
 
-- On OpenGL there is never a "device lost" condition, which would cause dynamic textures or vertex/index buffers to lose their contents. However, when the screen mode is changed, the context (along with all GPU resources) will be manually destroyed and recreated. This would be strictly necessary only when changing the multisampling mode, but as bugs may otherwise occur with some GPU drivers, it is best to do for any mode change.
-
 - At least for now, instancing is not supported for OpenGL. It still benefits from the instance group rendering loop, which only changes the model transform for each object with the same material and light, instead of setting the whole renderstate.
 
 - To ensure similar UV addressing for render-to-texture viewports on both APIs, on OpenGL texture viewports will be rendered upside down.
 
-Note that these differences only need to be observed when writing custom rendering functionality and accessing Graphics directly. When using Renderer and the Drawable components, they are taken care of automatically.
+OpenGL ES 2 has further limitations:
+
+- Only DXT1 compressed textures will be uploaded as compressed if the EXT_texture_compression_dxt1 extension is present. Other compressed texture formats will be uploaded as uncompressed RGBA. Mobile hardware specific compression formats such as ETC or PVRTC are not yet supported.
+
+- %Light pre-pass and deferred rendering are not supported due to missing multiple rendertarget support, and limited rendertarget formats.
 
+- Due to texture unit limit (usually 8), point light shadow maps are not supported.
 
 \page Materials Materials
 
@@ -759,13 +776,16 @@ The Input subsystem provides keyboard and mouse input via both a polled interfac
 
 The input events include:
 
-- E_MOUSEBUTTONUP: a mouse button has been released.
-- E_MOUSEBUTTONDOWN: a mouse button has been pressed.
-- E_MOUSEMOVE: the mouse has been moved.
-- E_MOUSEWHEEL: the mouse wheel has been moved.
-- E_KEYUP: a key has been released.
-- E_KEYDOWN: a key has been pressed.
+- E_MOUSEBUTTONUP: a mouse button was released.
+- E_MOUSEBUTTONDOWN: a mouse button was pressed.
+- E_MOUSEMOVE: the mouse moved.
+- E_MOUSEWHEEL: the mouse wheel moved.
+- E_KEYUP: a key was released.
+- E_KEYDOWN: a key was pressed.
 - E_CHAR: translation of a keypress to Unicode charset for text entry. This is currently the only way to get translated key input.
+- E_TOUCHBEGIN: a finger touched the screen.
+- E_TOUCHEND: a finger was lifted from the screen.
+- E_TOUCHMOVE: a finger moved on the screen.
 
 The input polling API differentiates between the initiation of a key/mouse button press, and holding the key or button down. \ref Input::GetKeyPress "GetKeyPress()" and \ref Input::GetMouseButtonPress "GetMouseButtonPress()" return true only for one frame (the initiation) while \ref Input::GetKeyDown "GetKeyDown()" and \ref Input::GetMouseButtonDown "GetMouseButtonDown()" return true as long as the key or button is held down.
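A short sketch of the press-versus-down distinction described above, assuming the usual KEY_SPACE and MOUSEB_LEFT constants; the surrounding handler and the StartJump/FireWeapon calls are hypothetical application code:

    // Polled input, typically inside a per-frame update handler.
    Input* input = GetSubsystem<Input>();
    if (input->GetKeyPress(KEY_SPACE))          // true only on the frame the key went down
        StartJump();
    if (input->GetMouseButtonDown(MOUSEB_LEFT)) // true on every frame the button is held
        FireWeapon();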
 
 

+ 3 - 1
Engine/Graphics/OpenGL/OGLGraphics.cpp

@@ -291,9 +291,11 @@ bool Graphics::SetMode(int width, int height, bool fullscreen, bool vsync, bool
             Release(true, true);
             return false;
         }
-        #endif
         
         compressedTextureSupport_ = CheckExtension("EXT_texture_compression_s3tc");
+        #else
+        compressedTextureSupport_ = CheckExtension("EXT_texture_compression_dxt1");
+        #endif
     }
     
     // Set vsync

+ 36 - 10
Readme.txt

@@ -91,6 +91,20 @@ successfully:
 
 - For Android, the Android SDK and Android NDK need to be installed.
 
+To run Urho3D, the minimum system requirements are:
+
+- Windows: CPU with SSE instructions support, Windows XP or newer, DirectX 9.0c,
+  GPU with Shader Model 2 support (Shader Model 3 recommended.)
+
+- Linux & Mac OS X: GPU with OpenGL 2.0 support and EXT_framebuffer_object,
+  EXT_packed_depth_stencil and EXT_texture_filter_anisotropic extensions.
+
+- Android: OS version 2.2 or newer, OpenGL ES 2 capable GPU.
+
+For Windows, SSE and Windows XP requirements can be eliminated by disabling SSE,
+crash dump support and file watcher from the root CMakeLists.txt. Windows 2000
+will then be the absolute minimum.
+
 
 Desktop build process
 ---------------------
@@ -132,20 +146,32 @@ following arguments: Scripts/TestScene.as -w
 Android build process
 ---------------------
 
-First build Urho3D for desktop OpenGL to make sure the GLSL shaders are
-generated. Then copy Bin/Data and Bin/CoreData directories to the
-Android/assets directory. Finally execute the following commands in the 
+First build Urho3D for desktop OpenGL to make sure the GLSL shaders are 
+generated. For Windows this requires forcing OpenGL mode from the root 
+CMakeLists.txt. Then copy Bin/Data and Bin/CoreData directories to the
+Android/assets directory. Finally execute the following commands in the
 Android directory:
 
-android update project -p . (only needed on the first time)
-ndk-build
-ant debug (or ant release, but then you will have to sign the APK)
-
-The APK should now have been generated to the Android/bin directory, from where 
-you can install it on a device or an emulator.
+- android update project -p . (only needed on the first time)
+- ndk-build
+- ant debug
 
 Note that ndk-build builds Urho3D twice, once without hardware floating point
-instructions, and once with them.
+instructions, and once with them. After the commands finish successfully, the
+APK should have been generated to the Android/bin directory, from where it can
+be installed on a device or an emulator.
+
+For a release build, use the "ant release" command instead of "ant debug" and
+follow the Android SDK instructions on how to sign your APK properly.
+
+By default the Android package for Urho3D is com.googlecode.urho3d. For a real
+application you must replace this with your own package name. Unfortunately the
+name has to be replaced in several files:
+
+- Android/AndroidManifest.xml
+- Android/src/com/googlecode/urho3d/SDLActivity.java (rename directories also)
+- ThirdParty/SDL/include/SDL_config_android.h, look for the NATIVE_FUNCTION
+  macro
 
 
 History

+ 5 - 0
ThirdParty/SDL/include/SDL_config_android.h

@@ -19,6 +19,8 @@
   3. This notice may not be removed or altered from any source distribution.
 */
 
+// Modified by Lasse Öörni for Urho3D
+
 #ifndef _SDL_config_android_h
 #define _SDL_config_android_h
 
@@ -130,4 +132,7 @@
 #define SDL_VIDEO_RENDER_OGL_ES	1
 #define SDL_VIDEO_RENDER_OGL_ES2	1
 
+/* Define Java package/class name here */
+#define NATIVE_FUNCTION(name) Java_com_googlecode_urho3d_SDLActivity_ ## name
+
 #endif /* _SDL_config_minimal_h */
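The macro builds the JNI export names from the Java package and class, which is why it must match the renamed package. A hypothetical redefinition for an application package com.example.mygame (illustration only, not from the codebase) would be:

    /* Hypothetical application-specific package name */
    #define NATIVE_FUNCTION(name) Java_com_example_mygame_SDLActivity_ ## name

    /* NATIVE_FUNCTION(nativeInit) then expands to
       Java_com_example_mygame_SDLActivity_nativeInit, which is the symbol the
       Java VM looks up for the native method nativeInit() of
       com.example.mygame.SDLActivity. */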

+ 11 - 11
ThirdParty/SDL/src/core/android/SDL_android.cpp

@@ -147,7 +147,7 @@ extern "C" void SDL_Android_Init(JNIEnv* env, jclass cls, jstring filesDir)
 }
 
 // Resize
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeResize(
+extern "C" void NATIVE_FUNCTION(onNativeResize)(
                                     JNIEnv* env, jclass jcls,
                                     jint width, jint height, jint format)
 {
@@ -155,21 +155,21 @@ extern "C" void Java_org_libsdl_app_SDLActivity_onNativeResize(
 }
 
 // Keydown
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeKeyDown(
+extern "C" void NATIVE_FUNCTION(onNativeKeyDown)(
                                     JNIEnv* env, jclass jcls, jint keycode)
 {
     Android_OnKeyDown(keycode);
 }
 
 // Keyup
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeKeyUp(
+extern "C" void NATIVE_FUNCTION(onNativeKeyUp)(
                                     JNIEnv* env, jclass jcls, jint keycode)
 {
     Android_OnKeyUp(keycode);
 }
 
 // Touch
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeTouch(
+extern "C" void NATIVE_FUNCTION(onNativeTouch)(
                                     JNIEnv* env, jclass jcls,
                                     jint touch_device_id_in, jint pointer_finger_id_in,
                                     jint action, jfloat x, jfloat y, jfloat p)
@@ -178,7 +178,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_onNativeTouch(
 }
 
 // Accelerometer
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeAccel(
+extern "C" void NATIVE_FUNCTION(onNativeAccel)(
                                     JNIEnv* env, jclass jcls,
                                     jfloat x, jfloat y, jfloat z)
 {
@@ -189,7 +189,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_onNativeAccel(
 }
 
 // Quit
-extern "C" void Java_org_libsdl_app_SDLActivity_nativeQuit(
+extern "C" void NATIVE_FUNCTION(nativeQuit)(
                                     JNIEnv* env, jclass cls)
 {    
     // Inject a SDL_QUIT event
@@ -202,7 +202,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_nativeQuit(
 }
 
 // Pause
-extern "C" void Java_org_libsdl_app_SDLActivity_nativePause(
+extern "C" void NATIVE_FUNCTION(nativePause)(
                                     JNIEnv* env, jclass cls)
 {
     SDL_Event event;
@@ -221,7 +221,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_nativePause(
 }
 
 // Resume
-extern "C" void Java_org_libsdl_app_SDLActivity_nativeResume(
+extern "C" void NATIVE_FUNCTION(nativeResume)(
                                     JNIEnv* env, jclass cls)
 {
     SDL_Event event;
@@ -239,7 +239,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_nativeResume(
     }
 }
 
-extern "C" void Java_org_libsdl_app_SDLActivity_nativeRunAudioThread(
+extern "C" void NATIVE_FUNCTION(nativeRunAudioThread)(
                                     JNIEnv* env, jclass cls)
 {
     /* This is the audio thread, with a different environment */
@@ -249,7 +249,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_nativeRunAudioThread(
 }
 
 // Surface destroyed
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeSurfaceDestroyed(
+extern "C" void NATIVE_FUNCTION(onNativeSurfaceDestroyed)(
                                     JNIEnv* env, jclass cls)
 {
     if (Android_Window) {
@@ -258,7 +258,7 @@ extern "C" void Java_org_libsdl_app_SDLActivity_onNativeSurfaceDestroyed(
 }
 
 // Surface created
-extern "C" void Java_org_libsdl_app_SDLActivity_onNativeSurfaceCreated(
+extern "C" void NATIVE_FUNCTION(onNativeSurfaceCreated)(
                                     JNIEnv* env, jclass cls)
 {
     if (Android_Window) {

+ 1 - 1
ThirdParty/SDL/src/main/android/SDL_android_main.cpp

@@ -16,7 +16,7 @@
 extern "C" void SDL_Android_Init(JNIEnv* env, jclass cls, jstring filesDir);
 extern "C" void SDL_Android_Init(JNIEnv* env, jclass cls, jstring filesDir);
 
 
 // Start up the SDL app
 // Start up the SDL app
-extern "C" void Java_org_libsdl_app_SDLActivity_nativeInit(JNIEnv* env, jclass cls, jstring filesDir)
+extern "C" void NATIVE_FUNCTION(nativeInit)(JNIEnv* env, jclass cls, jstring filesDir)
 {
 {
     /* This interface could expand with ABI negotiation, calbacks, etc. */
     /* This interface could expand with ABI negotiation, calbacks, etc. */
     SDL_Android_Init(env, cls, filesDir);
     SDL_Android_Init(env, cls, filesDir);