  1. /**
  2. \page Containers Container types
  3. Urho3D implements its own string type and template containers instead of using STL. The rationale for this consists of the following:
  4. - Increased performance in some cases, for example when using the PODVector class.
  5. - Reduced size of each string or vector instance compared to the MSVC STL implementations.
  6. - Reduced compile time.
  7. - Straightforward naming and implementation that aids in debugging and profiling.
  8. - Convenient member functions can be added, for example String::Split() or Vector::Compact().
  9. - Consistency with the rest of the classes, see \ref CodingConventions "Coding conventions".
  10. The classes in question are String, Vector, PODVector, List, Set, Map, HashSet and HashMap. PODVector is only to be used when the elements of the vector need no construction or destruction and can be moved with a block memory copy.
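For illustration, typical container usage might look like the following sketch; the exact member signatures (for example String::Split() taking a single character separator) are assumptions that should be checked against the container headers:
\code
Vector<String> words = String("a b c").Split(' ');  // convenience function mentioned above
PODVector<int> numbers;                             // POD elements only, moved with a block memory copy
numbers.Push(1);
numbers.Push(2);
HashMap<String, int> wordCounts;
wordCounts["hello"] = 1;                            // inserts the key if it does not exist yet
\endcode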
  11. The list, set and map classes use a fixed-size allocator internally. This can also be used by the application, either by using the procedural functions AllocatorInitialize(), AllocatorUninitialize(), AllocatorReserve() and AllocatorFree(), or through the template class Allocator.
  12. In script, the String class is exposed as it is. The template containers can not be directly exposed to script, but instead a template Array type exists, which behaves like a Vector, but does not expose iterators. In addition the VariantMap is available, which is a Map<ShortStringHash, Variant>.
  13. \page ObjectTypes %Object types and factories
  14. Classes that derive from Object contain type-identification, they can be created through object factories, and they can send and receive \ref Events "events". Examples of these are all Component, Resource and UIElement subclasses. To be able to be constructed by a factory, they need to have a constructor that takes a Context pointer as the only parameter.
  15. %Object factory registration and object creation through factories are directly accessible only in C++, not in script.
  16. The definition of an Object subclass must contain the OBJECT(className) macro. Type identification is available both as text (GetTypeName() or GetTypeNameStatic()) and as a 16-bit hash of the type name (GetType() or GetTypeStatic()).
  17. In addition the OBJECTTYPESTATIC(className) macro must appear in a .cpp file to actually define the type identification data. The reason for this instead of defining the data directly inside the OBJECT macro as function-static data is thread safety: function-static data is initialized on the first use, and if the first call to an object's GetTypeStatic() or GetTypeNameStatic() happened on several threads simultaneously, the results would be undefined.
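Putting these together, a minimal Object subclass might be declared as in the sketch below (everything beyond the macros and the Context constructor is illustrative):
\code
// MyClass.h
class MyClass : public Object
{
    OBJECT(MyClass);                // type name and type name hash for this class

public:
    MyClass(Context* context) : Object(context) {}
};

// MyClass.cpp
OBJECTTYPESTATIC(MyClass);          // defines the static type identification data
\endcode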
  18. To register an object factory for a specific type, call the \ref Context::RegisterFactory "RegisterFactory()" template function on Context. You can get its pointer from any Object either via the \ref Object::context_ "context_" member variable, or by calling \ref Object::GetContext "GetContext()". An example:
  19. \code
  20. context_->RegisterFactory<MyClass>();
  21. \endcode
  22. To create an object using a factory, call Context's \ref Context::CreateObject "CreateObject()" function. This takes the 16-bit hash of the type name as a parameter. The created object (or null if there was no matching factory registered) will be returned inside a SharedPtr<Object>. For example:
  23. \code
  24. SharedPtr<Object> newComponent = context_->CreateObject(type);
  25. \endcode
  26. \page Subsystems Subsystems
  27. Any Object can be registered to the Context as a subsystem, by using the function \ref Context::RegisterSubsystem "RegisterSubsystem()". They can then be accessed by any other Object inside the same context by calling \ref Object::GetSubsystem "GetSubsystem()". Only one instance of each object type can exist as a subsystem.
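For example, a custom subsystem (here a hypothetical MySubsystem class derived from Object) could be registered and later retrieved like this:
\code
context_->RegisterSubsystem(new MySubsystem(context_));
MySubsystem* mySubsystem = GetSubsystem<MySubsystem>();
\endcode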
  28. After Engine initialization, the following subsystems will always exist:
  29. - Time: manages frame updates, frame number and elapsed time counting, and controls the frequency of the operating system low-resolution timer.
  30. - WorkQueue: executes background tasks in worker threads.
  31. - FileSystem: provides directory operations.
  32. - Log: provides logging services.
  33. - ResourceCache: loads resources and keeps them cached for later access.
  34. - Network: provides UDP networking and scene replication.
  35. - Input: handles keyboard and mouse input. Will be inactive in headless mode.
  36. - UI: the graphical user interface. Will be inactive in headless mode.
  37. - Audio: provides sound output. Will be inactive if sound is disabled.
  38. - Engine: creates the other subsystems and controls the main loop iteration and framerate limiting.
  39. The following subsystems are optional, so GetSubsystem() may return null if they have not been created:
  40. - Profiler: Provides hierarchical function execution time measurement using the operating system performance counter. Exists if profiling has been compiled in (configurable from the root CMakeLists.txt)
  41. - Graphics: Manages the application window, the rendering context and resources. Exists if not in headless mode.
  42. - Renderer: Renders scenes in 3D and manages rendering quality settings. Exists if not in headless mode.
  43. - Script: Provides the AngelScript execution environment. Created by calling \ref Engine::InitializeScripting "InitializeScripting()".
  44. - Console: provides an interactive AngelScript console and log display. Created by calling \ref Engine::CreateConsole "CreateConsole()".
  45. - DebugHud: displays rendering mode information, statistics and profiling data. Created by calling \ref Engine::CreateDebugHud "CreateDebugHud()".
  46. In script, the subsystems are available through the following global properties:
  47. time, fileSystem, log, cache, network, input, ui, audio, engine, graphics, renderer, script, console, debugHud. Note that WorkQueue and Profiler are not available to script due to their low-level nature.
  48. \page Events Events
  49. The Urho3D event system allows for data transport and function invocation without the sender and receiver having to explicitly know of each other. It supports both broadcast and targeted events. Both the event sender and receiver must derive from Object. An event receiver must subscribe to each event type it wishes to receive: one can either subscribe to the event coming from any sender, or from a specific sender. The latter is useful for example when handling events from the user interface elements.
  50. Events themselves do not need to be registered. They are identified by 32-bit hashes of their names. Event parameters (the data payload) are optional and are contained inside a VariantMap, identified by 16-bit parameter name hashes. For the inbuilt Urho3D events, event type (E_UPDATE, E_KEYDOWN, E_MOUSEMOVE etc.) and parameter hashes (P_TIMESTEP, P_DX, P_DY etc.) are defined as constants inside include files such as CoreEvents.h or InputEvents.h.
  51. When subscribing to an event, a handler function must be specified. In C++ these must have the signature void HandleEvent(StringHash eventType, VariantMap& eventData). The HANDLER(className, function) macro helps in defining the required class-specific function pointers. For example:
  52. \code
  53. SubscribeToEvent(E_UPDATE, HANDLER(MyClass, MyEventHandler));
  54. \endcode
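The handler itself is an ordinary member function with the signature described above. As a sketch, a handler that reads the Update event's timestep parameter might look like this (Variant::GetFloat() is assumed as the accessor for a float parameter):
\code
void MyClass::MyEventHandler(StringHash eventType, VariantMap& eventData)
{
    using namespace Update;                             // brings P_TIMESTEP into scope
    float timeStep = eventData[P_TIMESTEP].GetFloat();  // read the event parameter
    // ... per-frame logic here
}
\endcode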
  55. In script, events are identified by their string names instead of name hashes (though these are internally converted to hashes.) Script event handlers can either have the same signature as in C++, or a simplified signature void HandleEvent() when the event type and parameters are not required. The same event subscription would look like:
  56. \code
  57. SubscribeToEvent("Update", "MyEventHandler");
  58. \endcode
  59. In C++, events must always be handled by a member function. In script, procedural event handling is also possible; in this case the ScriptFile where the event handler function is located becomes the event receiver. See \ref Scripting "Scripting" for more details.
  60. To send an event, fill the event parameters (if necessary) and call \ref Object::SendEvent "SendEvent()". For example, this (in C++) is how the Time subsystem sends the Update event. Note how for the inbuilt Urho3D events, the parameter name hashes are always put inside a namespace (the event's name) to prevent name clashes:
  61. \code
  62. using namespace Update;
  63. VariantMap eventData;
  64. eventData[P_TIMESTEP] = timeStep_;
  65. SendEvent(E_UPDATE, eventData);
  66. \endcode
  67. In script, event parameters, like event types, are referred to with strings, so the same code would look like:
  68. \code
  69. VariantMap eventData;
  70. eventData["TimeStep"] = timeStep;
  71. SendEvent("Update", eventData);
  72. \endcode
  73. Events can also be unsubscribed from. See \ref Object::UnsubscribeFromEvent "UnsubscribeFromEvent()" for details.
  74. \page MainLoop Main loop and frame update
  75. The main loop iteration (also called a frame) is driven by the Engine. In contrast it is the program's (for example Urho3D.exe) responsibility to continuously loop this iteration. The iteration consists of the Engine calling the Time subsystem's \ref Time::BeginFrame "BeginFrame()" and \ref Time::EndFrame "EndFrame()" functions, and in between sending various update events. The event order is:
  76. - E_BEGINFRAME: signals the beginning of the new frame. Input and Network react to this to check for operating system window messages and arrived network packets.
  77. - E_UPDATE: application-wide logic update event. By default each active Scene reacts to this and triggers the scene update (more on this below.)
  78. - E_POSTUPDATE: application-wide logic post-update event. The UI subsystem updates its logic here.
  79. - E_RENDERUPDATE: Renderer updates its viewports here to prepare for rendering, and the UI generates render commands necessary to render the user interface.
  80. - E_POSTRENDERUPDATE: by default nothing hooks to this. This can be used to implement logic that requires the rendering views to be up-to-date (for example to do accurate raycasts.) Scenes may not be modified at this point; in particular, scene objects may not be deleted, or crashes may occur.
  81. - E_ENDFRAME: signals the end of the frame. Before this, rendering the frame and measuring the next frame's timestep will have occurred.
  82. The update of each Scene causes further events to be sent:
  83. - E_SCENEUPDATE: variable timestep scene update. This is a good place to implement any scene logic that does not need to happen at a fixed step.
  84. - E_SCENESUBSYSTEMUPDATE: update scene-wide subsystems. Currently only the PhysicsWorld component listens to this, which causes it to step the physics simulation and send the following two events for each simulation step:
  85. - E_PHYSICSPRESTEP: called before the simulation iteration. Happens at a fixed rate (the physics FPS.) If fixed timestep logic updates are needed, this is a good event to listen to.
  86. - E_PHYSICSPOSTSTEP: called after the simulation iteration. Happens at the same rate as E_PHYSICSPRESTEP.
  87. - E_SCENEPOSTUPDATE: variable timestep scene post-update. ParticleEmitter and AnimationController update themselves as a response to this event.
  88. Variable timestep logic updates are preferable to fixed timestep, because they are only executed once per frame. In contrast, if the rendering framerate is low, several physics world simulation steps will be performed on each frame to keep up the apparent passage of time, and if this also causes a lot of logic code to be executed for each step, the program may bog down further if the CPU can not handle the load.
  89. \page SceneModel %Scene model
  90. Urho3D's scene model can be described as a component-based scene graph. The Scene consists of a hierarchy of scene nodes, starting from the root node, which also represents the whole scene. Each Node has a 3D transform (position, rotation and scale), a name and an ID, and a freeform VariantMap for \ref Node::GetVars "user variables", but no other functionality.
  91. \section SceneModel_Components Components
  92. Rendering 3D objects, sound playback, physics and scripted logic updates are all enabled by creating different \ref Component "Components" into the nodes by calling \ref Node::CreateComponent "CreateComponent()". As with events, in C++ components are identified by type name hashes, and template forms of the component creation and retrieval functions exist for convenience. For example:
  93. \code
  94. Light* light = lightNode->CreateComponent<Light>();
  95. \endcode
  96. In script, strings are used to identify component types instead, so the same code would look like:
  97. \code
  98. Light@ light = lightNode.CreateComponent("Light");
  99. \endcode
  100. Because components are created using \ref ObjectTypes "object factories", a factory must be registered for each component type.
  101. Components created into the Scene itself have a special role: to implement scene-wide functionality. They should be created before all other components, and include the following:
  102. - Octree: implements spatial partitioning and accelerated visibility queries. Without this 3D objects can not be rendered.
  103. - PhysicsWorld: implements physics simulation. Physics components such as RigidBody or CollisionShape can not function properly without this.
  104. - DebugRenderer: implements debug geometry rendering.
  105. "Ordinary" components like Light, Camera or StaticModel should not be created directly into the Scene, but rather into child nodes.
  106. \section SceneModel_Identification Identification and scene hierarchy
  107. Unlike nodes, components do not have names; components inside the same node are only identified by their type, and index in the node's component list, which is filled in creation order. See the various overloads of \ref Node::GetComponent "GetComponent()" or \ref Node::GetComponents "GetComponents()" for details.
  108. When created, both nodes and components get scene-global integer IDs. They can be queried from the Scene by using the functions \ref Scene::GetNodeByID "GetNodeByID()" and \ref Scene::GetComponentByID "GetComponentByID()". This is much faster than for example doing recursive name-based scene node queries.
  109. There is no inbuilt concept of an entity or a game object; rather it is up to the programmer to decide the node hierarchy, and in which nodes to place any scripted logic. Typically, free-moving objects in the 3D world would be created as children of the root node. Nodes can be created either with or without a name, see \ref Node::CreateChild "CreateChild()". Uniqueness of node names is not enforced.
  110. Whenever there is some hierarchical composition, it is recommended (and in fact necessary, because components do not have their own 3D transforms) to create a child node. For example if a character was holding an object in his hand, the object should have its own node, which would be parented to the character's hand bone (also a Node.) The exception is the physics CollisionShape, which can be offset and rotated individually in relation to the node. See \ref Physics "Physics" for more details.
  111. %Scene nodes can be freely reparented. In contrast components are always created to the node they belong to, and can not be moved between nodes. Both child nodes and components are stored using SharedPtr containers; this means that detaching a child node from its parent or removing a component will also destroy it, if no other references to it exist. Both Node & Component provide the \ref Node::Remove() "Remove()" function to accomplish this without having to go through the parent. Note that no operations on the node or component in question are safe after calling that function.
  112. It is also legal to create a Node that does not belong to a scene. This is particularly useful with cameras, because then the camera will not be serialized along with the actual scene, which is perhaps not always wanted.
  113. \section SceneModel_Update Scene updates and serialization
  114. A Scene can be either active or inactive (paused.) Active scenes will be automatically updated on each main loop iteration.
  115. Scenes can be loaded and saved in either binary or XML format; see \ref Serialization "Serialization" for details.
  116. \section SceneModel_FurtherInformation Further information
  117. For more information on the component-based scene model, see for example http://cowboyprogramming.com/2007/01/05/evolve-your-heirachy/.
  118. \page Resources Resources
  119. Resources include most things in Urho3D that are loaded from mass storage during initialization or runtime:
  120. - Animation
  121. - Image
  122. - Model
  123. - Material
  124. - ScriptFile
  125. - Shader
  126. - Sound
  127. - Technique
  128. - Texture2D
  129. - TextureCube
  130. - XMLFile
  131. They are managed and loaded by the ResourceCache subsystem. Like with all other \ref ObjectTypes "typed objects", resource types are identified by 16-bit type name hashes (C++) or type names (script). An object factory must be registered for each resource type.
  132. The resources themselves are identified by their file paths, relative to the registered resource directories or \ref PackageFile "package files". By default, Urho3D.exe registers the resource directories Data and CoreData, or the packages Data.pak and CoreData.pak if they exist.
  133. If loading a resource fails, an error will be logged and a null pointer is returned.
  134. A typical C++ example of requesting a resource from the cache, in this case a texture for a UI element, is shown below. Note the use of a convenience template argument to specify the resource type, instead of using the type hash.
  135. \code
  136. healthBar->SetTexture(GetSubsystem<ResourceCache>()->GetResource<Texture2D>("Textures/HealthBarBorder.png"));
  137. \endcode
  138. The same in script would look like this (note the use of a property instead of a setter function):
  139. \code
  140. healthBar.texture = cache.GetResource("Texture2D", "Textures/HealthBarBorder.png");
  141. \endcode
  142. Resources can also be created manually and stored to the resource cache as if they had been loaded from disk.
  143. Memory budgets can be set per resource type: if resources consume more memory than allowed, the oldest resources will be removed from the cache if not in use anymore. By default the memory budgets are set to unlimited.
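As a hedged sketch, and assuming ResourceCache exposes SetMemoryBudget() and AddManualResource() for these two features (verify the exact names and signatures against the class), this could look like:
\code
ResourceCache* cache = GetSubsystem<ResourceCache>();
// Assumed API: limit the memory used by Texture2D resources to 64 MB
cache->SetMemoryBudget(Texture2D::GetTypeStatic(), 64 * 1024 * 1024);
// Assumed API: store a procedurally created resource as if it had been loaded from disk
Image* image = new Image(context_);
image->SetName("Procedural/MyImage");   // illustrative resource name
cache->AddManualResource(image);
\endcode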
  144. \page Scripting Scripting
  145. There are three ways to interact with the AngelScript language in Urho3D:
  146. \section Scripting_Immediate Immediate execution
  147. Immediate execution takes one line of AngelScript, compiles it, and executes it. This is not recommended for anything that needs high performance, but can be used for example to implement a developer console. Call the Script subsystem's \ref Script::Execute "Execute()" function to use it. For example:
  148. \code
  149. GetSubsystem<Script>()->Execute("Print(\"Hello World!\");");
  150. \endcode
  151. It may be useful to be able to access a specific scene or a script file while executing immediate script code. These can be set on the Script subsystem by calling \ref Script::SetDefaultScene "SetDefaultScene()" and \ref Script::SetDefaultScriptFile "SetDefaultScriptFile()".
  152. \section Scripting_Procedural Calling a function from a script file
  153. This requires a successfully loaded ScriptFile resource, whose \ref ScriptFile::Execute "Execute()" function will be used. To identify the function to be called, its full declaration is needed. Parameters are passed in a VariantVector. For example:
  154. \code
  155. ScriptFile* file = GetSubsystem<ResourceCache>()->GetResource<ScriptFile>("Scripts/MyScript.as");
  156. VariantVector parameters;
  157. parameters.Push(Variant(100)); // Add an int parameter
  158. file->Execute("void MyFunction(int)", parameters); // Execute
  159. \endcode
  160. \ref ScriptFile::Execute "Execute()" also has an overload which takes a function pointer instead of querying by declaration. Using a pointer is naturally faster than a query, but note that the query results are also stored to an internal cache, so repeated queries for the same declaration do not need to go to the AngelScript module level each time. Storing function pointers is risky in case the ScriptFile resource is reloaded, because then the pointers will be invalidated.
  161. \section Scripting_Object Instantiating a script object
  162. The component ScriptInstance can be used to instantiate a specific class from within a script file. After this the script object can respond to scene updates, \ref Events "events" and \ref Serialization "serialization" much like a component written in C++ would do, if it has the appropriate methods implemented. For example:
  163. \code
  164. ScriptInstance* instance = node->CreateComponent<ScriptInstance>();
  165. instance->CreateObject(GetSubsystem<ResourceCache>()->GetResource<ScriptFile>("Scripts/MyClass.as"), "MyClass");
  166. \endcode
  167. The class must implement the empty interface ScriptObject, so that the object can also be accessed from script using ScriptInstance's \ref ScriptInstance::GetScriptObject "GetScriptObject()" function.
  168. The following methods that implement the component behaviour will be checked for. None of them are required.
  169. - void Start()
  170. - void Stop()
  171. - void Update(float)
  172. - void PostUpdate(float)
  173. - void FixedUpdate(float)
  174. - void FixedPostUpdate(float)
  175. - void Save(Serializer&)
  176. - void Load(Deserializer&)
  177. - void ApplyAttributes()
  178. The update methods above correspond to the variable timestep scene update and post-update, and the fixed timestep physics world update and post-update. The application-wide update events are not handled by default.
  179. The Start() and Stop() methods do not have direct counterparts in C++ components. Start() is called just after the script object has been created. Stop() is called just before the script object is destroyed. This happens when the ScriptInstance is destroyed, or if the script class is changed.
  180. Subscribing to \ref Events "events" in script behaves differently depending on whether \ref Object::SubscribeToEvent "SubscribeToEvent()" is called from a script object's method, or from a procedural script function. If called from an object method, the ScriptInstance becomes the event receiver on the C++ side, and forwards the events to the script object. If called from a function, the ScriptFile will be the event receiver.
  181. The script object's active/inactive state can be controlled through the \ref ScriptInstance::SetActive "SetActive()" function. When inactive, the scripted update methods or event handlers will not be called. This can be used to reduce CPU load in a large or densely populated scene.
  182. There are shortcut methods on the script side for creating and accessing a node's script object: node.CreateScriptObject() and node.GetScriptObject() (alternatively, if the node has only one ScriptInstance, and a specific class is not needed, the node's scriptObject property can also be used.) These are not actual Node member functions on the C++ side. CreateScriptObject() takes the script file name (or alternatively, a ScriptFile object handle) and class name as parameters and creates a ScriptInstance component automatically, then creates the script object. For example:
  183. \code
  184. ScriptObject@ object = node.CreateScriptObject("Scripts/MyClass.as", "MyClass");
  185. \endcode
  186. \section Script_ScriptAPI The script API
  187. Most of the Urho3D classes are exposed to script; however, things that require low-level access or high performance (like direct vertex buffer access) are not. Also, for scripting convenience some things have been changed from the C++ API:
  188. - The template array and string classes are exposed as Array<type> and String.
  189. - Public member variables are exposed without the trailing underscore. For example x, y and z in Vector3.
  190. - Whenever only a single parameter is needed, setter and getter functions are replaced with properties. Such properties start with a lowercase letter. If an index parameter is needed, the property will be indexed. Indexed properties are in plural.
  191. - The element count property of arrays and other dynamic structures such as VariantMap and ResourceRefList is called "length", though the corresponding C++ function is usually Size().
  192. - Subsystems exist as global properties: time, fileSystem, log, cache, network, input, ui, audio, engine, graphics, renderer, script, console, debugHud.
  193. - Additional global properties exist for accessing the script object's node, the scene and the scene-wide components: node, scene, octree, physicsWorld, debugRenderer. When an object method is not executing, these are null. An exception: when the default scene for immediate execution has been set by calling \ref Script::SetDefaultScene "SetDefaultScene()", it is always available as "scene".
  194. - The currently executing script object's ScriptInstance component is available through the global property self.
  195. - The currently executing script file is available through the global property scriptFile.
  196. - The first script object created to a node is available as its scriptObject property.
  197. - Printing raw output to the log is simply called Print(). The rest of the logging functions are accessed by calling log.Debug(), log.Info(), log.Warning() and log.Error().
  198. - Functions that would take a StringHash or ShortStringHash parameter usually take a string instead. For example sending events, requesting resources and accessing components.
  199. - Most of StringUtils have been exposed as methods of the string class. For example String.ToBool().
  200. - Template functions for getting components or resources by type are not supported. Instead automatic type casts are performed as necessary.
  201. \section Scripting_Limitations Limitations
  202. There are some complexities of the scripting system one has to watch out for:
  203. - During the execution of the script object's constructor, the object is not yet associated with the ScriptInstance, and therefore subscribing to events, or trying to access the node or scene will fail. The use of the constructor is best reserved for initializing member variables only.
  204. - There is a maximum allowed nesting level (currently 32) for execution that moves between C++ & AngelScript. Nested execution typically occurs if you send an event to another ScriptInstance from a scripted event handler. If the nesting level is exceeded, an error will be logged and the script code that would have required the extra nesting level will not be executed.
  205. - When the resource request for a particular ScriptFile is initially made, the script file and the files it includes are compiled into an AngelScript script module. Each script module has its own class hierarchy that is not usable from other script modules, unless the classes are declared shared. See AngelScript documentation for more details.
  206. - If a ScriptFile resource is reloaded, all the script objects created from it will be destroyed, then recreated. They will lose any stored state as their constructors and Start() methods will be run again. This is rarely useful when running an actual game, but may be helpful during development.
  207. \section Scripting_Modifications AngelScript modifications
  208. The following changes have been made to AngelScript in Urho3D:
  209. - For performance reasons and to guarantee immediate removal of expired objects, AngelScript garbage collection has been disabled for script classes and the Array type. This has the downside that circular references will not be detected. Therefore, whenever you have object handles in your script, think of them as if they were C++ shared pointers and avoid creating circular references with them.
  210. - %Object handle assignment can be done without the @ symbol if the object in question does not support value assignment. All exposed Urho3D C++ classes that derive from RefCounted never support value assignment. For example, when assigning the Model and Material of a StaticModel component:
  211. \code
  212. object.model = cache.GetResource("Model", "Models/Mushroom.mdl");
  213. object.material = cache.GetResource("Material", "Materials/Mushroom.xml");
  214. \endcode
  215. In unmodified AngelScript, this would have to be written as:
  216. \code
  217. @object.model = cache.GetResource("Model", "Models/Mushroom.mdl");
  218. @object.material = cache.GetResource("Material", "Materials/Mushroom.xml");
  219. \endcode
  220. \page Rendering Rendering
  221. Much of the rendering functionality in Urho3D is built on two subsystems, Graphics and Renderer, contained within the %Graphics library.
  222. \section Rendering_Graphics Graphics
  223. Graphics implements the low-level functionality:
  224. - Creating the window and the rendering context
  225. - Setting the screen mode
  226. - Keeping track of GPU resources
  227. - Keeping track of rendering context state (current rendertarget, vertex and index buffers, textures, shaders and renderstates)
  228. - Handling lost device
  229. - Performing primitive rendering operations
  230. Screen resolution, fullscreen/windowed, vertical sync and hardware multisampling level are all set at once by calling Graphics's \ref Graphics::SetMode "SetMode()" function.
  231. When setting the initial screen mode, Graphics does a few checks:
  232. - For Direct3D9, the supported shader model is checked. 2.0 is minimum, but 3.0 will be used if available. %Shader model 2.0 can be forced by calling \ref Graphics::SetForceSM2() "SetForceSM2()".
  233. - For OpenGL, version 2.0 with EXT_framebuffer_object and EXT_packed_depth_stencil extensions is checked for.
  234. - Hardware shadow map support is checked. Both ATI & NVIDIA style shadow maps can be used. If neither is available, a fallback mode using a color texture and minimal shadow filtering will be chosen instead.
  235. \section Rendering_Renderer Renderer
  236. Renderer implements the actual rendering of 3D views each frame, and controls global settings such as texture quality, material quality, specular lighting and shadow map base resolution.
  237. To render, it needs a Scene with an Octree component, and a Camera that does not necessarily have to belong to the scene. The octree stores all visible components (derived from Drawable) to allow querying for them in an accelerated manner. The scene, camera and screen rectangle to use are set with Renderer's \ref Renderer::SetViewport "SetViewport()" function.
  238. By default there is one viewport, but the number can be increased with the function \ref Renderer::SetNumViewports "SetNumViewports()". The viewport(s) should cover the entire screen, or otherwise hall-of-mirrors artifacts may occur. By specifying a zero screen rectangle the whole window will be used automatically. The viewports will be rendered in ascending order, so if you want for example to have a small overlay window on top of the main viewport, use viewport index 0 for the main view, and 1 for the overlay.
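As a sketch, setting up the default viewport could look like the following, assuming a Viewport constructor that takes the scene and camera (and optionally the screen rectangle) as described above, and that the scene and cameraNode variables already exist:
\code
Renderer* renderer = GetSubsystem<Renderer>();
// Omitting the screen rectangle (leaving it zero) makes the viewport cover the whole window
renderer->SetViewport(0, Viewport(scene, cameraNode->GetComponent<Camera>()));
\endcode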
  239. The steps for rendering each viewport on each frame are roughly the following:
  240. - Query the octree for visible objects and lights in the camera's view frustum.
  241. - Check the influence of each visible light on the objects. If the light casts shadows, query the octree for shadowcaster geometries.
  242. - Construct render operations (batches) for the visible objects.
  243. - Perform these render operations during the rendering step at the end of the frame.
  244. The rendering operations are divided into passes in the following order:
  245. - Opaque geometry ambient pass.
  246. - Opaque geometry lighting passes. For shadow casting lights, the shadow map is rendered first.
  247. - Post-opaque or "extra" rendering pass for materials that define that.
  248. - Transparent geometry rendering pass. Transparent, alpha-blended objects are sorted according to distance and rendered back-to-front to ensure correct blending.
  249. \section Rendering_Drawable Rendering components
  250. The rendering-related components defined by the %Graphics library are:
  251. - Octree: spatial partitioning of Drawables for accelerated visibility queries. Needs to be created to the Scene (root node.)
  252. - Camera: describes a viewpoint for rendering, including projection parameters (FOV, near/far distance, perspective/orthographic)
  253. - Drawable: Base class for anything visible.
  254. - StaticModel: non-skinned geometry. Can LOD transition according to distance.
  255. - Skybox: a subclass of StaticModel that appears to always stay in place.
  256. - AnimatedModel: skinned geometry that can do skeletal and vertex morph animation.
  257. - AnimationController: drives AnimatedModel's animations forward automatically and controls animation fade-in/out.
  258. - BillboardSet: a group of camera-facing billboards, which can have varying sizes, rotations and texture coordinates.
  259. - ParticleEmitter: a subclass of BillboardSet that emits particle billboards.
  260. - Light: illuminates the scene. Can optionally cast shadows.
  261. - Zone: defines ambient light and fog settings for objects inside the zone volume.
  262. \section Rendering_Optimizations Optimizations
  263. The following techniques will be used to reduce the amount of CPU and GPU work when rendering. By default they are all on:
  264. - Software rasterized occlusion: after the octree has been queried for visible objects, the objects that are marked as occluders are rendered on the CPU to a small hierarchical-depth buffer, and it will be used to test the non-occluders for visibility. Use \ref Renderer::SetMaxOccluderTriangles() "SetMaxOccluderTriangles()" and \ref Renderer::SetOccluderSizeThreshold() "SetOccluderSizeThreshold()" to configure the occlusion rendering.
  265. - Hardware instancing (Direct3D9 SM3.0 only): rendering operations with the same geometry, material and light will be grouped together and performed as one draw call. Objects with a large number of triangles will not be rendered as instanced, as that could actually be detrimental to performance. Use \ref Renderer::SetMaxInstanceTriangles() "SetMaxInstanceTriangles()" to set the threshold. Note that even when instancing is not available, or the triangle count of objects is too large, they still benefit from the grouping, as render state only needs to be set once before rendering each group, reducing the CPU cost.
  266. - %Light stencil masking: before rendering objects lit by a spot or point light, the light's bounding shape is rendered to the stencil buffer to ensure pixels outside the light range are not processed.
  267. Note that many more optimization opportunities are possible at the content level, for example using geometry & material LOD, grouping many static objects into one object for fewer draw calls, minimizing the number of subgeometries (submeshes) per object for fewer draw calls, using texture atlases to avoid render state changes, using compressed (and smaller) textures, and setting maximum draw distances for objects, lights and shadows.
  268. \section Rendering_Further Further details
  269. See also \ref Materials "Materials", \ref Lights "Lights and shadows", \ref Particles "Particle systems" and \ref AuxiliaryViews "Auxiliary views".
  270. For details on how Direct3D9 and OpenGL rendering differs, see \ref APIDifferences "Differences between Direct3D9 and OpenGL".
  271. \page APIDifferences Differences between Direct3D9 and OpenGL
  272. - On Direct3D9 shader uniform parameters are global. On OpenGL they are shader-specific. To ensure correct operation also on OpenGL, first set the shaders, then query Graphics whether each shader parameter is needed, and set the corresponding parameter if needed. This includes also frame-global parameters, which on Direct3D9 need to be set only once per frame, if the uniform register assignments are fixed.
  273. - On OpenGL vertex attribute bindings also depend on the currently set shaders. To ensure correct operation, first set the shaders, then the vertex buffers.
  274. - On Direct3D9 the depth stencil surface can be equal or larger in size than the color render target. On OpenGL the sizes must always match. Furthermore, OpenGL can not use the system depth stencil buffer when rendering to a texture. To overcome these limitations, Graphics will create correctly sized depth stencil buffers on demand whenever a texture is set as a color render target, and a null depth stencil is specified.
  275. - On Direct3D9 setting the first color render target resets the viewport dimensions. On OpenGL there is no such mechanism, but as sometimes (for example in shadow rendering) only a depth stencil buffer is set for rendering, Graphics will instead reset the viewport when the depth stencil buffer is set. To ensure correct operation on both APIs, first set the render targets, then the depth stencil buffer, and finally the viewport, if you need it to be less than the whole render target.
  276. - On OpenGL modifying a texture will cause it to be momentarily set on the first texture unit. If another texture was set there, the assignment will be lost. Graphics performs a check to not assign textures redundantly, so it is safe and recommended to always set all needed textures before rendering.
  277. - Modifying an index buffer on OpenGL will similarly cause the existing index buffer assignment to be lost. Therefore, always set the vertex and index buffers before rendering.
  278. - Shader resources are stored in different locations depending on the API: CoreData/Shaders/SM2 or CoreData/Shaders/SM3 on Direct3D9, and CoreData/Shaders/GLSL for OpenGL.
  279. - On OpenGL there is never a "device lost" condition, which would cause dynamic textures or vertex/index buffers to lose their contents. However, when the screen mode is changed, the context (along with all GPU resources) will be manually destroyed and recreated. This would be strictly necessary only when changing the multisampling mode, but as bugs may otherwise occur with some GPU drivers, it is best to do so for any mode change.
  280. - At least for now, instancing is not supported for OpenGL. It still benefits from the instance group rendering loop, which only changes the model transform for each object with the same material and light, instead of setting the whole renderstate.
  281. Note that these differences only need to be observed when writing custom rendering functionality and accessing Graphics directly. When using Renderer and the Drawable components, they are taken care of automatically.
  282. \page Materials Materials
  283. Material and Technique resources define how to render 3D scene geometry. On the disk, they are XML data. By default, materials exist in the CoreData/Materials & Data/Materials subdirectories, and techniques exist in the CoreData/Techniques subdirectory.
  284. A material defines the textures, shader parameters and culling mode to use, and refers to techniques. A technique defines the actual rendering passes, the shaders to use in each, and all other rendering states such as depth test, depth write, and blending.
  285. A material definition looks like this:
  286. \code
  287. <material>
  288. <technique name="TechniqueName" quality="q" loddistance="d" sm3="true|false" />
  289. <texture unit="diffuse|normal|specular|detail|environment|emissive" name="TextureName" />
  290. <texture ... />
  291. <parameter name="name" value="x y z w" />
  292. <parameter ... />
  293. <cull value="cw|ccw|none" />
  294. <shadowcull value="cw|ccw|none" />
  295. </material>
  296. \endcode
  297. %Technique quality levels are specified from 0 (low) to 2 (high). When rendering, the highest available technique that does not exceed the Renderer's material quality setting will be chosen, see \ref Renderer::SetMaterialQuality() "SetMaterialQuality()". If a technique requires SM3.0-only shaders, it can be marked as such by the "sm3" attribute.
  298. When a material defines several techniques for different LOD levels and quality settings, they must appear in a specific order:
  299. - Most distant & highest quality
  300. - ...
  301. - Most distant & lowest quality
  302. - Second most distant & highest quality
  303. - ...
  304. %Material shader parameters can be floats or vectors up to 4 components. Matrix parameters are not supported.
  305. Default culling mode is counterclockwise. The shadowcull element specifies the culling mode to use in the shadow pass.
  306. \section Materials_Textures Material textures
  307. Diffuse maps specify the surface color in the RGB channels. Optionally they can use the alpha channel for blending and alpha testing. They should preferably be compressed to DXT1 (no alpha or 1-bit alpha) or DXT5 (smooth alpha) format.
  308. Normal maps encode the tangent-space surface normal for normal mapping. They need to be stored as xGxR, i.e. the Y-component in the green channel, and the X-component in the alpha (Z will be reconstructed in the pixel shader.) This encoding lends itself well to DXT5 compression. To convert normal maps to this format, you can use AMD's The Compressonator utility, see http://developer.amd.com/tools/compressonator/pages/default.aspx
  309. Specular maps use only the G channel to specify specular intensity. DXT1 format should suit these well.
  310. \section Materials_Techniques Techniques and passes
  311. A technique definition looks like this:
  312. \code
  313. <technique>
  314. <pass name="base|litbase|light|prealpha|postalpha|shadow" vs="VertexShaderName" ps="PixelShaderName"
  315. alphatest="true|false" blend="replace|add|multiply|alpha|addalpha|premulalpha|invdestalpha"
  316. depthtest="always|equal|less|lessequal|greater|greaterequal" depthwrite="true|false" />
  317. <pass ... />
  318. <pass ... />
  319. </technique>
  320. \endcode
  321. The passes are:
  322. - base: renders the ambient light and fog.
  323. - litbase: renders both the ambient light and the first light affecting the object for optimization.
  324. - light: renders one light's contribution additively.
  325. - prealpha: custom rendering pass after opaque geometry. Can be used for example to render the skybox.
  326. - postalpha: custom rendering pass after transparent geometry.
  327. - shadow: shadow map rendering pass. Renders depth only.
  328. By default draw calls within passes are sorted by render state, but transparent base and light passes, as well as the postalpha pass, are sorted by distance back to front.
  329. Note that the technique does not need to enumerate shaders used for different geometry types (non-skinned, skinned, instanced, billboard) and light types (directional, point and spot, specular and no specular, shadowed and non-shadowed.) Instead specific hardcoded shader variations are assumed to exist. See the file Forward.xml in either SourceAssets/HLSLShaders or SourceAssets/GLSLShaders for all the shader variations.
  330. \page Lights Lights and shadows
  331. Lights in Urho3D can be directional, point, or spot lights. Shadow mapping is supported for all light types.
  332. A directional light's position has no effect, as it's assumed to be infinitely far away, only its rotation matters. It casts orthographically projected shadows. For increasing the shadow quality, cascaded shadow mapping (splitting the view into several shadow maps along the Z-axis) can be used.
  333. Point lights are spherical in shape. When a point light casts shadows, it will be internally split into 6 spot lights with a 90 degree FOV each. This is very expensive rendering-wise, so shadow casting point lights should be used sparingly.
  334. Spot lights have FOV & aspect ratio values like cameras to define the shape of the light cone.
  335. Both point and spot lights use an attenuation ramp texture to determine how the intensity varies with distance. In addition they have a shape texture, 2D for spot lights, and an optional cube texture for point lights. It is important that the spot light's shape texture has black at the borders, and has mipmapping disabled, otherwise there will be "bleeding" artifacts at the edges of the light cone.
  336. \section Lights_LightCulling Light culling
  337. When occlusion is used, a light will automatically be culled if its bounding box is fully behind an occluder. However, directional lights have an infinite bounding box, and can not be culled this way.
  338. It is possible to limit which objects are affected by each light, by calling \ref Drawable::SetLightMask "SetLightMask()" on both the light and the objects. The lightmasks of the light and objects are ANDed to check whether the light should have effect: the light will only illuminate an object if the result is nonzero. By default objects and lights have all bits set in their lightmask, thus passing this test always.
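For example, to restrict a light to one group of objects (the bit values are illustrative):
\code
light->SetLightMask(0x1);          // light affects only lightmask group 1
insideObject->SetLightMask(0x1);   // 0x1 & 0x1 != 0 -> illuminated
outsideObject->SetLightMask(0x2);  // 0x1 & 0x2 == 0 -> not illuminated
\endcode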
  339. \ref Zone "Zones" can also be used for light culling. When an object is inside a zone, its lightmask will be ANDed with the zone's lightmask before testing it against the lights' lightmasks. Using this mechanism, objects can change their accepted light set dynamically as they move through the scene.
  340. Care must be taken when doing light culling with lightmasks, because they easily create situations where a light's influence is cut off unnaturally. However, they can be helpful in preventing light spill into undesired areas, for example lights inside one room bleeding into another, without having to resort to shadow-casting lights.
  341. \section Lights_ShadowedLights Shadowed lights
  342. Shadow rendering is easily the most complex aspect of using lights, and therefore a wide range of per-light parameters exists for controlling the shadows:
  343. - BiasParameters: define constant and slope-scaled depth bias values for preventing self-shadowing artifacts. In practice, these need to be determined experimentally. Orthographic (directional) and projective (point and spot) shadows may require rather different bias values. Another way of fighting self-shadowing issues is to render shadowcaster backfaces, see \ref Rendering_Materials "Materials".
  344. - CascadeParameters: these have effect only for directional lights. They specify the far clip distance of each of the cascaded shadow map splits (maximum 4), and the fade start point relative to the maximum shadow range. Unused splits can be set to far clip 0.
  345. - FocusParameters: these have effect for directional and spot lights, and control techniques to increase shadow map resolution. They consist of focus enable flag (allows focusing the shadow camera on the visible shadow casters & receivers), nonuniform scale enable flag (allows better resolution), automatic size reduction flag (reduces shadow map resolution when the light is far away), and quantization & minimum size parameters for the shadow camera view.
  346. Additionally there are shadow fade distance, shadow intensity, shadow resolution and shadow near/far ratio parameters:
  347. - If both shadow distance and shadow fade distance are greater than zero, shadows start to fade at the shadow fade distance, and vanish completely at the shadow distance.
  348. - Shadow intensity defines how dark the shadows are, between 0.0 (maximum darkness, the default) and 1.0 (fully lit.)
  349. - The shadow resolution parameter scales the global shadow map size set in Renderer to determine the actual shadow map size. Maximum is 1.0 (full size) and minimum is 0.125 (one eighth size.) Choose according to the size and importance of the light; smaller shadow maps will be less performance hungry.
  350. - The shadow near/far ratio controls shadow camera near clip distance for point & spot lights. The default ratio is 0.002, which means a light with range 100 would have its shadow camera near plane set at the distance of 0.2. Set this as high as you can for better shadow depth resolution, but note that the bias parameters will likely have to be adjusted as well.
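A hedged sketch of tuning these per-light values follows; the setter names and the BiasParameters constructor arguments are assumptions to be verified against the Light class:
\code
light->SetShadowBias(BiasParameters(0.0002f, 0.5f)); // constant and slope-scaled bias
light->SetShadowIntensity(0.25f);                    // 0.0 = maximum darkness, 1.0 = fully lit
light->SetShadowResolution(0.5f);                    // half of the global shadow map size
light->SetShadowNearFarRatio(0.01f);                 // shadow camera near clip = 1% of light range
\endcode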
  351. Finally, there are global settings for the shadow map base resolution and shadow map depth (16 or 24 bit) & filtering quality (1 or 4 samples) in Renderer.
  352. \section Lights_ShadowCulling Shadow culling
  353. Similarly to light culling with lightmasks, shadowmasks can be used to select which objects should cast shadows with respect to each light. See \ref Drawable::SetShadowMask "SetShadowMask()". A potential shadow caster's shadow mask will be ANDed with the light's lightmask to see if it should be rendered to the light's shadow map. Also, when an object is inside a zone, its shadowmask will be ANDed with the zone's shadowmask as well. By default all bits are set in the shadowmask.
  354. For an example of shadow culling, imagine a house (which itself is a shadow caster) containing several objects inside, and a shadowed directional light shining in from the windows. In that case shadow map rendering can be avoided for objects already in shadow by clearing the respective bit from their shadowmasks.
  355. \section Lights_ShadowMapReuse Shadow map reuse
  356. The Renderer can be configured to either reuse shadow maps, or not. To reuse is the default, use \ref Renderer::SetReuseShadowMaps "SetReuseShadowMaps()" to change.
  357. When reuse is enabled, only one shadow texture of each shadow map size (full, half and quarter) needs to be reserved, and shadow maps are rendered "on the fly" before rendering a single shadowed light's contribution onto opaque geometry. This has the downside that shadow maps are no longer available during transparent geometry rendering, so transparent objects will not receive shadows.
  358. When reuse is disabled, all shadow maps are rendered before the actual scene rendering. Now multiple shadow textures need to be reserved based on the desired number of simultaneous shadow casting lights. See the function \ref Renderer::SetNumShadowMaps "SetNumShadowMaps()". If there are not enough shadow textures, they will be assigned to the closest/brightest lights, and the rest will be rendered unshadowed. Now more texture memory is needed, but the advantage is that also transparent objects can receive shadows. The exception is shadowed point lights: they need stencil masking to split into the 6 shadow map sides, which conflicts with the need to render transparent objects back-to-front, instead of rendering per light.
  359. \page Particles %Particle systems
  360. The ParticleEmitter class derives from BillboardSet to implement a particle system that updates automatically.
  361. The particle system's properties can be set through a XML description file, see \ref ParticleEmitter::LoadParameters "LoadParameters()".
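For example, assuming LoadParameters() accepts the description as an XMLFile resource (the file path below is illustrative):
\code
ParticleEmitter* emitter = node->CreateComponent<ParticleEmitter>();
emitter->LoadParameters(GetSubsystem<ResourceCache>()->GetResource<XMLFile>("Particles/Fire.xml"));
\endcode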
  362. Most of the parameters can take either a single value, or minimum and maximum values to allow for random variation. See below for all supported parameters:
  363. \code
  364. <particleemitter>
  365. <material name="MaterialName" />
  366. <sorting enable="true|false" />
  367. <updateinvisible enable="true|false" />
  368. <relative enable="true|false" />
  369. <emittertype value="point|box|sphere" />
  370. <emittersize value="x y z" />
  371. <direction min="x1 y1 z1" max="x2 y2 z2" />
  372. <constantforce value="x y z" />
  373. <dampingforce value="x" />
  374. <activetime value="t" />
  375. <inactivetime value="t" />
  376. <interval min="t1" max="t2" />
  377. <particlesize min="x1 y1" max="x2 y2" />
  378. <timetolive min="t1" max="t2" />
  379. <velocity min="x1" max="x2" />
  380. <rotation min="x1" max="x2" />
  381. <rotationspeed min="x1" max="x2" />
  382. <sizedelta add="x" mul="y" />
  383. <color value="r g b a" />
  384. <colorfade color="r g b a" time="t" />
  385. </particleemitter>
  386. \endcode
  387. Note: zero active or inactive time period means infinite. Instead of defining a single color element, several colorfade elements can be defined in time order to describe how the particles change color over time.
  388. \page AuxiliaryViews Auxiliary views
  389. Auxiliary views are viewports defined into a RenderSurface. These will be rendered whenever the texture containing the surface is visible, and can be typically used to implement for example reflections. The texture in question must have been created in rendertarget mode, see Texture's \ref Texture2D::SetSize "SetSize()" function.
390. The viewport is not assigned directly to the texture because of cube map support: a cubic rendertarget has 6 render surfaces, and this way a different camera can be assigned to each.
  391. A "backup texture" can be assigned to the rendertarget texture: because it is illegal to sample a texture that is also being simultaneously rendered to (in cases where the rendertarget texture becomes "recursively" visible in the auxiliary view), the backup texture can be used to specify which texture should be used in place instead.
  392. Rendering detailed auxiliary views can easily have a large performance impact. Some things you can do for optimization with the auxiliary view camera:
  393. - Set the far clip distance as small as possible.
  394. - Set the camera's viewmask to for example VIEW_REFLECTION, then clear that viewmask bit from objects you don't need rendered.
  395. - Use the camera's \ref Camera::SetViewOverrideFlags "SetViewOverrideFlags()" function to disable shadows, to disable occlusion, or force the lowest material quality.
  396. \page Input %Input
  397. The Input subsystem provides keyboard and mouse input via both a polled interface and events. It is always instantiated, even in headless mode, but is active only once the application window has been created. Once active, the subsystem takes over the application mouse cursor. It will be hidden, so the UI should be used to render a software cursor if necessary.
  398. The input events include:
  399. - E_MOUSEBUTTONUP: a mouse button has been released.
  400. - E_MOUSEBUTTONDOWN: a mouse button has been pressed.
  401. - E_MOUSEMOVE: the mouse has been moved.
  402. - E_MOUSEWHEEL: the mouse wheel has been moved.
  403. - E_KEYUP: a key has been released.
  404. - E_KEYDOWN: a key has been pressed.
  405. - E_CHAR: translation of a keypress to Latin-1 charset for text entry. This is currently the only way to get translated key input.
  406. The input polling API differentiates between the initiation of a key/mouse button press, and holding the key or button down. \ref Input::GetKeyPress "GetKeyPress()" and \ref Input::GetMouseButtonPress "GetMouseButtonPress()" return true only for one frame (the initiation) while \ref Input::GetKeyDown "GetKeyDown()" and \ref Input::GetMouseButtonDown "GetMouseButtonDown()" return true as long as the key or button is held down.
  407. From the input subsystem you can also query whether the application is active/inactive, or minimized.
  408. In script, the polling API is accessed via properties: input.keyDown[], input.keyPress[], input.mouseButtonDown[], input.mouseButtonPress[], input.mouseMove.
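For example, in C++ the polled interface could be used in a frame update handler like this (MoveForward() and Fire() stand in for application code):
\code
Input* input = GetSubsystem<Input>();
// Held down: true on every frame while the key is held.
if (input->GetKeyDown('W'))
    MoveForward(timeStep);
// Press: true only on the frame the button was actually pressed.
if (input->GetMouseButtonPress(MOUSEB_LEFT))
    Fire();
\endcode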
  409. \page Audio %Audio
  410. The Audio subsystem implements an audio output stream using DirectSound. DirectSound requires a window handle, so sound can not be played back before the application window has been opened. Once playing, the following operations are supported:
  411. - Playing raw audio, Ogg Vorbis or WAV Sound resources using the SoundSource component. This allows manual stereo panning of mono sounds; stereo sounds will be output with their original stereo mix.
  412. - Playing the above sound formats in pseudo-3D using the SoundSource3D component. It has stereo positioning and distance attenuation, but does not (at least yet) filter the sound depending on the direction.
  413. For pseudo-3D positional sounds, the listener position and rotation have to be updated by calling \ref Audio::SetListenerPosition() "SetListenerPosition()" and \ref Audio::SetListenerRotation() "SetListenerRotation()".
414. The output is software-mixed, allowing an unlimited number of simultaneous sounds. Ogg Vorbis sounds are decoded on the fly, and decoding them can be memory- and CPU-intensive, so WAV files are recommended when a large number of short sound effects need to be played.
  415. For purposes of volume control, each SoundSource is classified into one of three categories:
  416. - %Sound effects
  417. - Music
  418. - Voice
  419. A master gain category also exists that affects the final output level. To control the category volumes, use \ref Audio::SetMasterGain "SetMasterGain()".
  420. The SoundSource components support automatic removal from the node they belong to, once playback is finished. To use, call \ref SoundSource::SetAutoRemove "SetAutoRemove()" on them. This may be useful when a game object plays several "fire and forget" sound effects.
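For example, playing a one-shot positional sound effect could look like this (the resource name and explosionPosition are hypothetical):
\code
ResourceCache* cache = GetSubsystem<ResourceCache>();
Sound* sound = cache->GetResource<Sound>("Sounds/Explosion.wav");

Node* soundNode = scene->CreateChild("Explosion");
soundNode->SetPosition(explosionPosition);
SoundSource3D* soundSource = soundNode->CreateComponent<SoundSource3D>();
// Remove the component automatically once playback finishes ("fire and forget").
soundSource->SetAutoRemove(true);
soundSource->Play(sound);
\endcode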
  421. \section Audio_Parameters Sound parameters
422. A standard WAV file can not tell whether it should loop, and raw audio does not contain any header information. Parameters for the Sound resource can optionally be specified through an XML file that has the same name as the sound, but with the .xml extension. The possible elements and attributes are described below:
  423. \code
  424. <sound>
  425. <format frequency="x" sixteenbit="true|false" stereo="true|false" />
  426. <loop enable="true|false" start="x" end="x" />
  427. </sound>
  428. \endcode
  429. The frequency is in Hz, and loop start and end are bytes from the start of audio data. If a loop is enabled without specifying the start and end, it is assumed to be the whole sound. Ogg Vorbis compressed sounds do not support specifying the loop range, only whether whole sound looping is enabled or disabled.
  430. The Audio subsystem is always instantiated, but in headless mode it is not active. In headless mode the playback of sounds is simulated, taking the sound length and frequency into account. This allows basing logic on whether a specific sound is still playing or not, even in server code.
  431. \page Physics Physics
  432. The %Physics library in Urho3D implements rigid body physics simulation using Open Dynamics %Engine.
  433. To use, a PhysicsWorld component must first be created to the Scene.
434. The physics simulation has its own fixed update rate, 60Hz by default. For higher rendering frame rates, physics motion is interpolated so that it always appears smooth. The update rate can be changed with the \ref PhysicsWorld::SetFps "SetFps()" function. The physics update rate also determines the frequency of fixed timestep scene logic updates.
  435. The other physics components are:
  436. - CollisionShape: defines physics collision geometry. The supported shapes are sphere, box, cylinder, capsule, triangle mesh, convex hull and heightfield.
  437. - RigidBody: this component is necessary to create moving physics objects. Its parameters include mass and linear/angular velocities.
  438. - Joint: connects two RigidBodies together, or one RigidBody to a static point in the world. Currently ball and hinge joints are supported.
  439. Triangle meshes, convex hulls and heightfields are created by specifying a Model resource.
  440. Several collision shapes may exist in the same scene node to create compound shapes. An offset position and rotation relative to the node's transform can be specified for each. The shape (instead of RigidBody) also contains collision behaviour parameters: which other objects to collide with (see \ref CollisionShape::SetCollisionLayer "SetCollisionLayer()" and \ref CollisionShape::SetCollisionMask "SetCollisionMask()"), the friction coefficient, and the bounce coefficient.
  441. Note that static physics objects such as unmoving level geometry should not have RigidBody components at all.
  442. The physics simulation does all calculations in world space. Therefore nodes containing a RigidBody component should only be parented to the Scene (root node) to operate correctly. Hierarchically parented rigid bodies might be supported in the future.
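For illustration, creating a simple dynamic physics object could look roughly like this (a sketch; the box shape setter is an assumption and may differ by engine version):
\code
// The scene needs a PhysicsWorld component before physics components can function.
scene->CreateComponent<PhysicsWorld>();

Node* boxNode = scene->CreateChild("Box");
boxNode->SetPosition(Vector3(0.0f, 10.0f, 0.0f));
RigidBody* body = boxNode->CreateComponent<RigidBody>();
body->SetMass(1.0f);
CollisionShape* shape = boxNode->CreateComponent<CollisionShape>();
shape->SetBox(Vector3(1.0f, 1.0f, 1.0f)); // assumed helper for defining a box shape
\endcode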
  443. The physics world sends 3 types of events during its update step:
  444. - E_PHYSICSPRESTEP before the simulation is stepped.
  445. - E_PHYSICSCOLLISION (and E_NODECOLLISION to the participating scene nodes) for each collision during the simulation step.
  446. - E_PHYSICSPOSTSTEP after the simulation has been stepped.
  447. Note that if the rendering framerate is high, the physics might not be stepped at all on each frame: in that case those events will not be sent.
  448. \page UI User interface
  449. Urho3D implements a simple, hierarchical user interface system based on rectangular elements. The elements provided by default are:
  450. - BorderImage: a texture image with an optional border
  451. - Button: a pushbutton
  452. - CheckBox: a button that can be toggled on/off
  453. - Cursor: a mouse cursor
  454. - DropDownList: shows a vertical list of items (optionally scrollable) as a popup
  455. - LineEdit: a single-line text editor
  456. - ListView: shows a scrollable vertical list of items
  457. - Menu: a button which can show a popup element
  458. - ScrollBar: a slider with back and forward buttons
  459. - ScrollView: a scrollable view of child elements
  460. - Slider: a horizontal or vertical slider bar
  461. - Text: static text that can be multiline
  462. - UIElement: container for other elements, renders nothing by itself
  463. - Window: a movable and resizable window
  464. From the UI subsystem you can query the root element, which is an empty canvas (UIElement) as large as the application window, into which other elements can be added.
465. Elements are added into each other similarly to scene nodes, using the \ref UIElement::AddChild "AddChild()" and \ref UIElement::RemoveChild "RemoveChild()" functions. Each UI element also has a \ref UIElement::GetVars "user variables" VariantMap for storing custom data.
  466. \section UI_Defining Defining UI elements in XML
467. Each UI element knows how to load its properties from an XML file. There are two distinct use cases for this: defining just the UI element style and leaving the actual position and dimensions to be filled in later, or fully defining a set of UI elements. For an example of defining element styles, see the file Data/UI/DefaultStyle.xml.
  468. The function \ref UI::LoadLayout "LoadLayout()" in UI will take an XML file and instantiate the elements defined in it. To be valid XML, there should be one root UI element. An optional style XML file can be specified; the idea is to first read the element's style from that file, then fill in the rest from the actual layout XML file. This way the layout file can be relatively simple, as the majority of the data is already defined.
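For example, instantiating a layout with the default style file could look like this (the layout file name is hypothetical):
\code
ResourceCache* cache = GetSubsystem<ResourceCache>();
UI* ui = GetSubsystem<UI>();
XMLFile* style = cache->GetResource<XMLFile>("UI/DefaultStyle.xml");
XMLFile* layout = cache->GetResource<XMLFile>("UI/MyWindow.xml");
// Instantiate the elements defined in the layout, reading element styles from the style file,
// then add them under the UI root element.
SharedPtr<UIElement> window = ui->LoadLayout(layout, style);
ui->GetRoot()->AddChild(window);
\endcode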
469. The XML data for each UI element is described below. Everything is optional; defaults are used for missing values. Note that there are several redundant ways to define element size. Also note the element class hierarchy: for example, a Button derives from BorderImage, and all elements derive from UIElement. See the comments in the elements' header files for descriptions of each property.
  470. \subsection UI_UIElement UIElement
  471. \code
  472. <element name="ElementName" type="UIElement" >
  473. <position value="x y" />
  474. <size value="x y" />
  475. <width value="x" />
  476. <height value="y" />
  477. <minsize value="x y" />
  478. <minwidth value="x" />
  479. <minheight value="y" />
  480. <maxsize value="x y" />
  481. <maxwidth value="x" />
  482. <maxheight value="y" />
  483. <fixedsize value="x y" />
  484. <fixedwidth value="x" />
  485. <fixedheight value="y" />
  486. <alignment horizontal="left|center|right" vertical="top|center|bottom" />
  487. <clipborder value="l t r b" />
  488. <priority value="p" />
  489. <opacity value="o" />
  490. <color value="r g b a" | topleft="r g b a" topright="r g b a" bottomleft="r g b a" bottomright="r g b a" />
  491. <bringtofront enable="true|false" />
  492. <bringtoback enable="true|false" />
  493. <clipchildren enable="true|false" />
  494. <enabled enable="true|false" />
  495. <selected enable="true|false" />
  496. <visible enable="true|false" />
  497. <focusmode value="notfocusable|resetfocus|focusable|focusabledefocusable" />
  498. <layout mode="free|horizontal|vertical" spacing="s" border="l t r b" />
  499. <vars>
  500. <variant name="n" type="t" value="v" />
  501. ...
  502. </vars>
  503. </element>
  504. \endcode
  505. \subsection UI_BorderImage BorderImage
  506. \code
  507. <element type="BorderImage">
  508. <texture name="TextureName" />
  509. <imagerect value="l t r b" />
  510. <border value="l t r b" />
  511. <hoveroffset value="x y" />
  512. </element>
  513. \endcode
  514. \subsection UI_Button Button
  515. \code
  516. <element type="Button">
  517. <pressedoffset value="x y" />
  518. <labeloffset value="x y" />
  519. <repeat delay="d" rate="r" />
  520. </element>
  521. \endcode
  522. \subsection UI_Checkbox Checkbox
  523. \code
  524. <element type="Checkbox">
  525. <checkedoffset value="x y" />
  526. </element>
  527. \endcode
  528. \subsection UI_Cursor Cursor
  529. \code
  530. <element type="Cursor">
  531. <shape name="normal|resizevertical|resizediagonal_topright|resizehorizontal|resizediagonal_topleft|acceptdrop|rejectdrop"
  532. texture="TextureName" imagerect="l t r b" hotspot="x y" />
  533. ...
  534. </element>
  535. \endcode
  536. \subsection UI_Menu Menu
  537. If a popup element is specified, it will be searched for by name from the UI element hierarchy.
  538. \code
  539. <element type="Menu">
  540. <popup name="ElementName" />
  541. <popupoffset value="x y" />
  542. </element>
  543. \endcode
  544. \subsection UI_Text Text
  545. \code
  546. <element type="Text">
  547. <font name="FontName" size="s" />
  548. <text value="..." />
  549. <textalignment value="left|center|right" />
  550. <rowspacing value="s" />
  551. <selection start="s" length="l" />
  552. <selectioncolor value="r g b a" />
  553. <hovercolor value="r g b a" />
  554. </element>
  555. \endcode
  556. \subsection UI_Window Window
  557. \code
  558. <element type="Window">
  559. <resizeborder value="l t r b" />
  560. <movable enable="true|false" />
  561. <resizable enable="true|false" />
  562. </element>
  563. \endcode
  564. \subsection UI_DropDownList DropDownList
  565. The styles of the listview, popup and placeholder sub-elements can be specified within the respective XML elements. The listview can be pre-filled by specifying popup items; they will be searched for by name from the UI element hierarchy.
  566. \code
  567. <element type="DropDownList">
  568. <selection value="s" />
  569. <resizepopup enable="true|false" />
  570. <listview />
  571. <popup />
  572. <placeholder />
  573. <popupitem name="ElementName" />
  574. ...
  575. </element>
  576. \endcode
  577. \subsection UI_LineEdit LineEdit
578. The style of the cursor sub-element can be specified with the "cursor" XML element.
  579. \code
  580. <element type="LineEdit">
  581. <maxlength value="l" />
  582. <cursormovable enable="true|false" />
  583. <textselectable enable="true|false" />
  584. <textcopyable enable="true|false" />
  585. <text value="..." />
  586. <cursorposition value="p" />
  587. <cursorblinkrate value="r" />
  588. <echocharacter value="c" />
  589. <cursor />
  590. </element>
  591. \endcode
  592. \subsection UI_Slider Slider
593. The style of the knob sub-element can be specified with the "knob" XML element.
  594. \code
  595. <element type="Slider">
  596. <orientation value="horizontal|vertical" />
  597. <range max="m" value="v" />
  598. <knob />
  599. </element>
  600. \endcode
  601. \subsection UI_ScrollBar ScrollBar
  602. The styles of the back button, forward button and the slider can be specified with the respective XML elements. Note the buttons' nonstandard imagerect element, which specifies the image to use for both a horizontal and a vertical button.
  603. \code
  604. <element type="ScrollBar">
  605. <orientation value="horizontal|vertical" />
  606. <range max="m" value="v" />
  607. <scrollstep value="s" />
  608. <stepfactor value="f" />
  609. <backbutton>
  610. <imagerect horizontal="l t r b" vertical="l t r b" />
  611. </backbutton>
  612. <forwardbutton>
  613. <imagerect horizontal="l t r b" vertical="l t r b" />
  614. </forwardbutton>
  615. <slider />
  616. </element>
  617. \endcode
  618. \subsection UI_ScrollView ScrollView
  619. The styles of the horizontal and vertical scrollbars can be specified with the respective XML elements. If a content element is specified, it will be searched for by name from the UI element hierarchy.
  620. \code
  621. <element type="ScrollView">
  622. <viewposition value="x y" />
  623. <scrollstep value="s" />
  624. <pagestep value="p" />
  625. <horizontalscrollbar />
  626. <verticalscrollbar />
  627. <contentelement name="ElementName" />
  628. </element>
  629. \endcode
  630. \subsection UI_ListView ListView
  631. \code
  632. <element type="ListView">
  633. <selection value="s" />
  634. <highlight value="never|focus|always" />
  635. <multiselect enable="true|false" />
  636. <hierarchy enable="true|false" />
  637. <clearselection enable="true|false" />
  638. <doubleclickinterval value="i" />
  639. </element>
  640. \endcode
  641. \section UI_Layouts UI element layout
  642. By default %UI elements operate in a "free" layout mode, where child elements' positions can be specified relative to any of the parent element corners, but they are not automatically positioned or resized.
643. To create automatically adjusting layouts, the layout mode can be switched to either "horizontal" or "vertical". The child elements are then positioned left to right or top to bottom, based on the order in which they were added. They are preferably resized to fit the parent element, taking into account their minimum and maximum sizes; failing that, the parent element is resized instead.
  644. Left, top, right & bottom border widths and spacing between elements can also be specified for the layout. A grid layout is not directly supported, but it can be manually created with a horizontal layout inside a vertical layout, or vice versa.
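For example, a vertical layout with spacing and borders could be set from code as follows (window stands for any UIElement):
\code
// Arrange children top to bottom with 6 pixel spacing and a 6 pixel border on each side.
window->SetLayout(LM_VERTICAL, 6, IntRect(6, 6, 6, 6));
\endcode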
  645. \page Serialization Serialization
  646. Classes that derive from Serializable can perform automatic serialization to binary or XML format by defining \ref AttributeInfo "attributes". Attributes are stored to the Context per class. %Scene load/save and network replication are both implemented by having the Node and Component classes derive from Serializable.
  647. The supported attribute types are all those supported by Variant. Attributes can either define a direct memory offset into the object, or setter & getter functions. Zero-based enumerations are also supported, so that the enum values can be stored as text into XML files instead of just numbers. For editing, the attributes also have human-readable names.
  648. To implement side effects to attributes, for example that a Node needs to dirty its world transform whenever the local transform changes, the default attribute access functions in Serializable can be overridden. See \ref Serializable::OnSetAttribute "OnSetAttribute()" and \ref Serializable::OnGetAttribute "OnGetAttribute()".
  649. Each attribute can have a combination of the following flags:
650. - AM_FILE: used for file serialization (load/save).
651. - AM_NET: used for network replication.
652. - AM_LATESTDATA: frequently changing data for network replication, where only the latest values matter. Used for motion and animation.
653. - AM_NOEDIT: an internal attribute that is not shown for editing.
  654. The default flags are AM_FILE and AM_NET.
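As an illustration, attributes are typically defined when the class is registered to the Context. The sketch below uses assumed attribute definition macros; the exact macro names and parameters may differ by engine version:
\code
void MyComponent::RegisterObject(Context* context)
{
    context->RegisterFactory<MyComponent>();

    // Direct memory offset attribute (assumed ATTRIBUTE macro.)
    ATTRIBUTE(MyComponent, VAR_FLOAT, "Health", health_, 100.0f, AM_FILE | AM_NET);
    // Setter & getter based attribute (assumed ACCESSOR_ATTRIBUTE macro.)
    ACCESSOR_ATTRIBUTE(MyComponent, VAR_VECTOR3, "Target", GetTarget, SetTarget, Vector3::ZERO, AM_NET | AM_LATESTDATA);
}
\endcode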
  655. \page Network Networking
  656. The Network library provides reliable and unreliable UDP messaging using kNet. A server can be created that listens for incoming connections, and client connections can be made to the server. After connecting, code running on the server can assign the client into a scene to enable scene replication, provided that when connecting, the client specified a blank scene for receiving the updates.
  657. %Scene replication is one-directional: the server always has authority and sends scene updates to the client at a fixed update rate, by default 25 FPS. The client responds by sending controls updates (buttons, yaw and pitch + possible extra data) also at a fixed rate.
  658. Bidirectional communication between the server and the client can happen either using raw network messages, which are binary-serialized data, or remote events, which operate like ordinary events, but are processed on the receiving end only. Code on the server can send messages or remote events either to one client, all clients assigned into a particular scene, or to all connected clients. In contrast the client can only send messages or remote events to the server, not directly to other clients.
  659. Note that if a particular networked application does not need scene replication, network messages (and remote events that are not targeted) can also be transmitted without assigning the client to a scene. The Chat example does just that: it does not create a scene either on the server or the client.
  660. \section Network_Connecting Connecting to a server
  661. Starting the server and connecting to it both happen through the Network subsystem. See \ref Network::StartServer "StartServer()" and \ref Network::Connect "Connect()". A UDP port must be chosen; the examples use the port 1234.
662. Note the scene (to be used for replication) and the identity VariantMap supplied as parameters when connecting. The identity data can contain for example the user name or credentials; its content is completely application-specified. The identity data is sent right after connecting and causes the E_CLIENTIDENTITY event to be sent on the server when received. By subscribing to this event, server code can examine incoming connections and accept or deny them. The default is to accept all connections.
  663. After connecting successfully, client code can get the Connection object representing the server connection, see \ref Network::GetServerConnection "GetServerConnection()". Likewise, on the server a Connection object will be created for each connected client, and these can be iterated through. This object is used to send network messages or remote events to the remote peer, to assign the client into a scene (on the server only), or to disconnect.
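For example (the scene_ variable and the identity contents are application-specific):
\code
Network* network = GetSubsystem<Network>();

// On the server: start listening on UDP port 1234.
network->StartServer(1234);

// On the client: connect, supplying a blank scene for replication and identity data.
VariantMap identity;
identity["UserName"] = "JohnDoe";
network->Connect("localhost", 1234, scene_, identity);
\endcode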
  664. \section Network_Replication Scene replication
665. %Network replication of scene content is implemented in a straightforward manner, using \ref Serialization "attributes". Nodes and components that have not been created in local mode - see the CreateMode parameter of \ref Node::CreateChild "CreateChild()" or \ref Node::CreateComponent "CreateComponent()" - are automatically replicated. Note that a replicated component created into a local node will not be replicated, as the node's locality is checked first.
666. The CreateMode translates into two different node and component ID ranges - replicated IDs range from 0x1 to 0xffffff, while local IDs range from 0x1000000 to 0xffffffff. This means there is a maximum of 16777215 replicated nodes or components in a scene.
  667. If the scene was originally loaded from a file on the server, the client will also load the scene from the same file first. In this case all predefined, static objects such as the world geometry should be defined as local nodes, so that they are not needlessly retransmitted through the network during the initial update, and do not exhaust the more limited replicated ID range.
  668. The server can be made to transmit needed resource \ref PackageFile "packages" to the client. This requires attaching the package files to the Scene by calling \ref Scene::AddRequiredPackageFile "AddRequiredPackageFile()". On the client, a cache directory for the packages must be chosen before receiving them is possible: see \ref Network::SetPackageCacheDir "SetPackageCacheDir()".
  669. There are some things to watch out for:
  670. - After connecting to a server, the client should not create, update or remove non-local nodes or components on its own. However, to create client-side special effects and such, the client can freely manipulate local nodes.
  671. - A node's \ref Node::GetVars "user variables" VariantMap will be automatically replicated on a per-variable basis. This can be useful in transmitting data shared by several components, for example the player's score or health.
  672. - To implement interpolation, exponential smoothing of the nodes' rendering transforms is enabled on the client. It can be controlled by two properties of the Scene, the smoothing constant and the snap threshold. Snap threshold is the distance between network updates which, if exceeded, causes the node to immediately snap to the end position, instead of moving smoothly. See \ref Scene::SetSmoothingConstant "SetSmoothingConstant()" and \ref Scene::SetSnapThreshold "SetSnapThreshold()".
  673. - Position and rotation are Node attributes, while linear and angular velocities are RigidBody attributes. To cut down on the needed network bandwidth the physics components can be created as local on the server: in this case the client will not see them at all, and will only interpolate motion based on the node's transform changes. Replicating the actual physics components allows the client to extrapolate using its own physics simulation, and to also perform collision detection, though always non-authoritatively.
  674. - AnimatedModel does not replicate animation by itself. Rather, AnimationController will replicate its command state (such as "fade this animation in, play that animation at 1.5x speed.") To turn off animation replication, create the AnimationController as local. To ensure that also the first animation update will be received correctly, always create the AnimatedModel component first, then the AnimationController.
675. - Networked attributes can be in either delta update or latest data mode. Delta updates are small incremental changes and must be applied in order, which may cause increased latency if there is a stall in network message delivery, e.g. due to packet loss. High volume data such as position, rotation and velocities are transmitted as latest data, which does not need ordering; instead this mode simply discards any old data received out of order. Note that node and component creation (when initial attributes need to be sent) and removal are also considered delta updates and are therefore applied in order.
  676. - The server update logic orders replication messages so that parent nodes are created and updated before their children. Remote events are queued and only sent after the replication update to ensure that if they target a newly created node, it will already exist on the receiving end. However, it is also possible to specify unordered transmission for a remote event, in which case that guarantee does not hold.
  677. - Nodes have the concept of the \ref Node::SetOwner "owner connection" (for example the player that is controlling a specific game object), which can be set in server code. This property is not replicated to the client. Messages or remote events can be used instead to tell the players what object they control.
  678. - At least for now, there is no built-in client-side prediction.
  679. \section Network_InterestManagement Interest management
680. %Scene replication includes a simple, distance-based interest management mechanism for reducing bandwidth use. To use it, create a NetworkPriority component to each Node you wish to apply interest management to. The component can be created as local, as it is not important to the clients.
  681. This component has three parameters for controlling the update frequency: \ref NetworkPriority::SetBasePriority "base priority", \ref NetworkPriority::SetDistanceFactor "distance factor", and \ref NetworkPriority::SetMinPriority "minimum priority".
  682. A current priority value is calculated on each server update as "base priority - distance factor * distance." Additionally, it can never go lower than the minimum priority. This value is then added to an update accumulator. Whenever the update accumulator reaches 100.0, the attribute changes to the node and its components are sent, and the accumulator is reset.
  683. The default values are base priority 100.0, distance factor 0.0, and minimum priority 0.0. This means that by default an update is always sent (which is also the case if the node has no NetworkPriority component.) Additionally, there is a rule that the node's owner connection always receives updates at full frequency. This rule can be controlled by calling \ref NetworkPriority::SetAlwaysUpdateOwner "SetAlwaysUpdateOwner()".
  684. Calculating the distance requires the client to tell its current observer position (typically, either the camera's or the player character's world position.) This is accomplished by the client code calling \ref Connection::SetPosition "SetPosition()" on the server connection.
  685. For now, creation and removal of nodes is always sent immediately, without consulting interest management. This is based on the assumption that nodes' motion updates consume the most bandwidth.
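For example (the parameter values are arbitrary, and droneNode, cameraNode and serverConnection are hypothetical):
\code
// On the server: create as local, as the component itself need not be replicated.
NetworkPriority* priority = droneNode->CreateComponent<NetworkPriority>(LOCAL);
priority->SetBasePriority(50.0f);
priority->SetDistanceFactor(0.1f);
priority->SetMinPriority(5.0f);

// On the client: report the observer position for the distance calculation.
serverConnection->SetPosition(cameraNode->GetWorldPosition());
\endcode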
  686. \section Network_Controls Client controls update
  687. The Controls structure is used to send controls information from the client to the server, by default also at 25 FPS. This includes held down buttons, which is an application-defined 32-bit bitfield, floating point yaw and pitch, and possible extra data (for example the currently selected weapon) stored within a VariantMap.
688. It is up to the client code to ensure the controls are kept up to date, by calling \ref Connection::SetControls "SetControls()" on the server connection. The event E_NETWORKUPDATE is sent to remind of the impending update, and the event E_NETWORKUPDATESENT is sent after the update. The controls can then be inspected on the server side by calling \ref Connection::GetControls "GetControls()".
  689. The controls update message also includes the client's observer position for interest management.
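For example, client code could fill in and send its controls each frame like this (the button bit definitions and extra data are application-defined):
\code
static const unsigned CTRL_FORWARD = 1;
static const unsigned CTRL_FIRE = 2;

Controls controls;
controls.Set(CTRL_FORWARD, input->GetKeyDown('W'));
controls.Set(CTRL_FIRE, input->GetMouseButtonDown(MOUSEB_LEFT));
controls.yaw_ = yaw_;
controls.pitch_ = pitch_;
controls.extraData_["Weapon"] = currentWeapon_;

serverConnection->SetControls(controls);
\endcode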
  690. \section Network_Messages Raw network messages
691. All network messages have an integer ID. The first ID you can use for custom messages is 22 (lower IDs are reserved for kNet's and the %Network library's internal use.) Messages can be sent either unreliably or reliably, in-order or unordered. The data payload is simply raw binary data that can be crafted by using for example VectorBuffer.
  692. To send a message to a Connection, use its \ref Connection::SendMessage "SendMessage()" function. On the server, messages can also be broadcast to all client connections by calling the \ref Network::BroadcastMessage "BroadcastMessage()" function.
  693. When a message is received, and it is not an internal protocol message, it will be forwarded as the E_NETWORKMESSAGE event. See the Chat example for details of sending and receiving.
  694. For high performance, consider using unordered messages, because for in-order messages there is only a single channel within the connection, and all previous in-order messages must arrive first before a new one can be processed.
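For example, sending a custom message (the message ID and payload are application-defined):
\code
static const int MSG_CHAT = 22; // first free custom message ID

VectorBuffer msg;
msg.WriteString("Hello!");
// Client to server: send reliably and in-order.
serverConnection->SendMessage(MSG_CHAT, true, true, msg);
// Server only: broadcast to all client connections.
network->BroadcastMessage(MSG_CHAT, true, true, msg);
\endcode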
  695. \section Network_RemoteEvents Remote events
  696. A remote event consists of its event type (name hash), a flag that tells whether it is to be sent in-order or unordered, and the event data VariantMap. It can optionally target a specific Node in the receiver's scene. This is different from ordinary events, which can optionally target any Object within the execution context.
  697. To send a remote event to a Connection, use its \ref Connection::SendRemoteEvent "SendRemoteEvent()" function. To broadcast remote events to several connections at once (server only), use Network's \ref Network::BroadcastRemoteEvent "BroadcastRemoteEvent()" function.
698. For safety, allowed remote event types should be registered, so that a client can not, for example, trigger an internal render update event on the server. See \ref Network::RegisterRemoteEvent "RegisterRemoteEvent()". As with allowed file paths, as long as no remote event types are registered, all are allowed.
  699. Like with ordinary events, in script event types are strings instead of name hashes for convenience.
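For example (the event type and data are application-defined):
\code
static const StringHash E_PLAYERSPAWNED("PlayerSpawned");

// On the receiving end, register the event type as allowed.
GetSubsystem<Network>()->RegisterRemoteEvent(E_PLAYERSPAWNED);

// On the sending end, transmit it in-order together with event data.
VariantMap eventData;
eventData["Position"] = spawnPosition;
connection->SendRemoteEvent(E_PLAYERSPAWNED, true, eventData);
\endcode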
  700. \page Multithreading Multithreading
701. Urho3D uses a task-based multithreading model. The WorkQueue subsystem can be supplied with tasks described by the WorkItem structure, by calling \ref WorkQueue::AddWorkItem "AddWorkItem()". These are executed in background worker threads. The function \ref WorkQueue::Complete "Complete()" completes all currently pending tasks, also executing them in the main thread to make them finish faster.
702. On single-core systems no worker threads are created. On systems with more cores, worker threads are created for up to half of the CPU cores, minus the one core that is reserved for the main thread.
703. The work items include a function pointer to call, with the signature "void WorkFunction(const WorkItem* item, unsigned threadIndex)". The thread index ranges from 0 to n, where 0 represents the main thread and n is the number of worker threads created; its purpose is to aid in splitting work into per-thread data structures that need no locking. The work item also contains three void pointers: start, end and aux, which can be used to describe a range of sub-work items and an auxiliary data structure, which may for example be the object that originally queued the work.
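A rough sketch of a work function and of queuing a work item follows; the WorkItem field names and the AddWorkItem() signature are assumptions and may differ by engine version:
\code
// Work function: scales a range of floats. No locking is needed because each
// work item covers a distinct range.
void ScaleValues(const WorkItem* item, unsigned threadIndex)
{
    float* start = static_cast<float*>(item->start_);
    float* end = static_cast<float*>(item->end_);
    for (float* i = start; i != end; ++i)
        *i *= 2.0f;
}

// Queuing (assumed field names):
WorkQueue* queue = GetSubsystem<WorkQueue>();
WorkItem item;
item.workFunction_ = ScaleValues;
item.start_ = values;
item.end_ = values + numValues;
item.aux_ = 0;
queue->AddWorkItem(item);
// Finish all pending work items, with the main thread participating as well.
queue->Complete();
\endcode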
  704. Multithreading is so far not exposed to scripts, and is currently used only in a limited manner: to speed up the preparation of rendering views including lit object and shadow caster queries, occlusion tests, particle system updates and animation and skinning updates. Raycasts into the Octree are also threaded, but physics raycasts are not.
  705. Note that as the Profiler currently manages only a single hierarchy tree, profiling blocks may only appear in main thread code, not in the work functions.
  706. \page Tools Tools
  707. \section Tools_AssetImporter AssetImporter
708. A tool that loads the various 3D formats supported by the Open Asset Import Library (http://assimp.sourceforge.net/) and saves Urho3D model, animation, material and scene files out of them. For the list of supported formats, see http://assimp.sourceforge.net/main_features_formats.html.
  709. Usage:
  710. \verbatim
  711. AssetImporter <command> <input file> <output file> [options]
  712. Commands:
  713. model Export a model and animations
  714. scene Export a scene and its models
  715. dumpnodes Dump scene node structure. No output file is generated
  716. lod Combine several Urho3D models as LOD levels of the output model
  717. Syntax: lod <dist0> <mdl0> <dist1> <mdl1> ... <output file>
  718. Options:
  719. -b Save scene in binary format, default format is XML
  720. -i Use local IDs for scene nodes
  721. -na Do not export animations
  722. -ne Do not create Octree & PhysicsWorld extensions to the scene
  723. -nm Do not export materials
  724. -pX Use base path X for resources in the scene file
  725. -rX Use scene node X as root node
  726. -t Generate tangents to model(s)
  727. \endverbatim
  728. \section Tools_GLShaderProcessor GLShaderProcessor
729. GLShaderProcessor creates final GLSL source code for vertex and pixel shaders, and enumerates the possible shader combinations. Unlike with \ref Tools_ShaderCompiler "ShaderCompiler", actually compiling the shaders is left to runtime.
  730. Usage:
  731. \verbatim
  732. GLShaderProcessor <definitionfile> <outputpath> [define1] [define2]
  733. GLSL files will be loaded from definition file directory, and finalized GLSL +
  734. XML files are saved to the output path, preserving the subdirectory structure.
  735. \endverbatim
  736. \section Tools_OgreImporter OgreImporter
737. A tool that loads OGRE .mesh.xml and .skeleton.xml files and saves them as Urho3D .mdl (model) and .ani (animation) files. For other 3D formats and whole-scene importing, see AssetImporter instead; note, however, that AssetImporter does not handle the OGRE formats as completely as this tool does.
  738. Usage:
  739. \verbatim
  740. OgreImporter <input file> <output file> [options]
  741. Options:
  742. -a Export animations
  743. -m Export morphs
  744. -r Export only rotations from animations
  745. -s Split each submesh into own vertex buffer
  746. -t Generate tangents
  747. \endverbatim
748. Note: exporting only bone rotations may help when using an animation in a different model, but if bone position changes have been used for effect, the animation may become less lively. Unpredictable mutilations might result from using an animation in a model it was not originally intended for, as Urho3D does not specifically attempt to retarget animations.
  749. \section Tools_PackageTool PackageTool
  750. PackageTool examines a directory recursively for files and subdirectories, and creates a PackageFile. The package file can be added to the ResourceCache and used as if the files were on a (read-only) filesystem.
  751. Usage:
  752. \verbatim
  753. PackageTool <directory to process> <package name> [basepath]
  754. \endverbatim
755. When PackageTool runs, it goes inside the source directory and looks for subdirectories and files. Paths inside the package are by default relative to the source directory, but if an extra path prefix is desired, it can be specified with the optional basepath argument.
756. For example, the following command converts all the resource files inside the Urho3D Data directory into a package called Data.pak (execute the command from the Bin directory):
  757. \verbatim
  758. PackageTool Data Data.pak
  759. \endverbatim
  760. \section Tools_RampGenerator RampGenerator
  761. RampGenerator creates 1D and 2D ramp textures for use in light attenuation and spotlight spot shapes.
  762. Usage:
  763. \verbatim
  764. RampGenerator <output file> <width> <power> [dimensions]
  765. \endverbatim
766. The output is saved in PNG format. The power parameter is fed into the pow() function to determine the ramp shape; a higher value gives more brightness and a more abrupt fade at the edge.
  767. The texconv tool from the DirectX SDK needs to be available through the system PATH.
  768. \section Tools_ShaderCompiler ShaderCompiler
  769. This tool generates HLSL shader permutations using an XML definition file that describes the permutations, and their associated HLSL preprocessor defines.
  770. The output consists of shader bytecode for each permutation, as well as information of the constant parameters and texture units used. See \ref FileFormats_Shader "Binary shader format" for details.
  771. Usage:
  772. \verbatim
  773. ShaderCompiler <definitionfile> <outputpath> [SM3] [define1] [define2] ..
  774. HLSL files will be loaded from definition file directory, and binary files will
  775. be output to the output path, preserving the subdirectory structure.
  776. \endverbatim
777. It is possible to give additional defines on the command line; these will then be present in each permutation. SM3 is a special define which enables compilation of VS3.0 and PS3.0 code; otherwise VS2.0 and PS2.0 code is generated.
  778. The D3DX library from the DirectX runtime or SDK needs to be installed.
  779. \page FileFormats Custom file formats
  780. Urho3D tries to use existing file formats whenever possible, and define custom file formats only when absolutely necessary. Currently used custom file formats are:
  781. \section FileFormats_Model Binary model format (.mdl)
  782. \verbatim
  783. Model geometry and vertex morph data
  784. byte[4] Identifier "UMDL"
  785. uint Number of vertex buffers
  786. For each vertex buffer:
  787. uint Vertex count
  788. uint Vertex element mask (determines vertex size)
  789. uint Morphable vertex range start index
  790. uint Morphable vertex count
  791. byte[] Vertex data (vertex count * vertex size)
  792. uint Number of index buffers
  793. For each index buffer:
  794. uint Index count
  795. uint Index size (2 for 16-bit indices, 4 for 32-bit indices)
  796. byte[] Index data (index count * index size)
  797. uint Number of geometries
  798. For each geometry:
  799. uint Number of bone mapping entries
  800. uint[] Bone mapping data, Maps geometry bone indices to global bone indices for HW skinning.
  801. May be empty, in this case identity mapping will be used.
  802. uint Number of LOD levels
  803. For each LOD level:
  804. float LOD distance
  805. uint Primitive type (0 = triangle list, 1 = line list)
  806. uint Vertex buffer index, starting from 0
  807. uint Index buffer index, starting from 0
  808. uint Draw range: index start
  809. uint Draw range: index count
  810. uint Number of vertex morphs (may be 0)
  811. For each vertex morph:
  812. cstring Name of morph
  813. uint Number of affected vertex buffers
  814. For each affected vertex buffer:
  815. uint Vertex buffer index, starting from 0
  816. uint Vertex element mask for morph data. Only positions, normals & tangents are supported.
  817. uint Vertex count
  818. For each vertex:
  819. uint Vertex index
  820. Vector3 Position (if included in the mask)
  821. Vector3 Normal (if included in the mask)
  822. Vector3 Tangent (if included in the mask)
  823. Skeleton data
  824. uint Number of bones (may be 0)
  825. For each bone:
  826. cstring Bone name
  827. uint Parent bone index starting from 0. Same as own bone index for the root bone
  828. Vector3 Initial position
  829. Quaternion Initial rotation
  830. Vector3 Initial scale
  831. float[12] 4x3 offset matrix for skinning
  832. byte Bone collision info bitmask. 1 = bounding sphere 2 = bounding box
  833. If bounding sphere data included:
  834. float Bone radius
  835. If bounding box data included:
  836. Vector3 Bone bounding box minimum
  837. Vector3 Bone bounding box maximum
  838. Bounding box data
  839. Vector3 Model bounding box minimum
  840. Vector3 Model bounding box maximum
  841. Geometry center data
  842. For each geometry:
  843. Vector3 Geometry center
  844. \endverbatim
  845. \section FileFormats_Animation Binary animation format (.ani)
  846. \verbatim
  847. byte[4] Identifier "UANI"
  848. cstring Animation name
  849. float Length in seconds
  850. uint Number of tracks
  851. For each track:
  852. cstring Track name (practically same as the bone name that should be driven)
  853. byte Mask of included animation data. 1 = bone positions 2 = bone rotations 4 = bone scaling
  854. uint Number of keyframes
  855. For each keyframe:
  856. float Time position in seconds
  857. Vector3 Position (if included in data)
  858. Quaternion Rotation (if included in data)
  859. Vector3 Scale (if included in data)
  860. \endverbatim
  861. Note: animations are stored using absolute bone transformations. Therefore only lerp-blending between animations is supported; additive pose modification is not.
  862. \section FileFormats_Shader Direct3D9 binary shader format (.vs2, .ps2, .vs3, .ps3)
  863. \verbatim
  864. byte[4] Identifier "USHD"
  865. short Shader type (0 = vertex, 1 = pixel)
  866. short Shader model (2 or 3)
  867. uint Number of constant parameters
  868. For each constant parameter:
  869. cstring Parameter name
  870. uint Number of texture units
  871. For each texture unit:
  872. cstring Texture unit name
  873. uint Number of shader variations
  874. For each variation:
  875. cstring Variation name
  876. uint Number of constant parameters in use
  877. For each constant parameter in use:
  878. StringHash Parameter name hash
  879. byte Register index
  880. byte Number of registers
  881. uint Number of texture units in use
  882. For each texture unit in use:
  883. StringHash Texture unit name hash
  884. byte Sampler index
  885. uint Bytecode size
  886. byte[] Bytecode
  887. \endverbatim
  888. \page CodingConventions Coding conventions
  889. - Class and struct names are in camelcase beginning with an uppercase letter. They should be nouns. For example DebugRenderer, FreeTypeLibrary, Graphics.
  890. - Functions are likewise in upper-camelcase. For example CreateComponent, SetLinearRestThreshold.
  891. - Variables are in lower-camelcase. Member variables have an underscore appended. For example numContacts, randomSeed_.
  892. - Constants and enumerations are in uppercase. For example Vector3::ZERO or PASS_SHADOW.
  893. - Pointers and references append the * or & symbol to the type without a space in between. For example Drawable* drawable, Serializer& dest.
  894. - Class definitions proceed in the following order:
  895. - public constructors and the destructor
  896. - public virtual functions
  897. - public non-virtual member functions
  898. - public static functions
  899. - public member variables
  900. - public static variables
  901. - repeat all of the above in order for protected definitions, and finally private
  902. - Header files are commented using one-line comments beginning with /// to mark them for Doxygen.
  903. - Inline functions are defined inside the class definitions where possible, without using the inline keyword.
  904. */