
Merge branch 'master' into 3.2

Rémi Verschelde, 5 years ago
Parent commit: 223570eb13

+ 12 - 5
getting_started/scripting/c_sharp/c_sharp_features.rst

@@ -222,10 +222,12 @@ Full list of defines
 
 * One of ``GODOT_64`` or ``GODOT_32`` is defined depending on if the architecture is 64-bit or 32-bit.
 
-* One of ``GODOT_X11``, ``GODOT_WINDOWS``, ``GODOT_OSX``, ``GODOT_ANDROID``, ``GODOT_HTML5``,
-  or ``GODOT_SERVER`` depending on the OS. These names may change in the future.
-  These are created from the ``get_name()`` method of the :ref:``OS <class_OS>`` singleton,
-  but not every possible OS the method returns is an OS that Godot with Mono runs on.
+* One of ``GODOT_X11``, ``GODOT_WINDOWS``, ``GODOT_OSX``,
+  ``GODOT_ANDROID``, ``GODOT_IOS``, ``GODOT_HTML5``, or ``GODOT_SERVER``
+  depending on the OS. These names may change in the future.
+  These are created from the ``get_name()`` method of the
+  :ref:`OS <class_OS>` singleton, but not every possible OS
+  the method returns is an OS that Godot with Mono runs on.
 
 When **exporting**, the following may also be defined depending on the export features:
 
@@ -233,6 +235,11 @@ When **exporting**, the following may also be defined depending on the export fe
 
 * One of ``GODOT_ARM64_V8A`` or ``GODOT_ARMEABI_V7A`` on Android only depending on the architecture.
 
-* One of ``GODOT_S3TC``, ``GODOT_ETC``, or ``GODOT_ETC2`` depending on the texture compression type.
+* One of ``GODOT_ARM64`` or ``GODOT_ARMV7`` on iOS only depending on the architecture.
+
+* Any of ``GODOT_S3TC``, ``GODOT_ETC``, and ``GODOT_ETC2`` depending on the texture compression type.
 
 * Any custom features added in the export menu will be capitalized and prefixed: ``foo`` -> ``GODOT_FOO``.
+
+For an example project, see the OS testing demo:
+https://github.com/godotengine/godot-demo-projects/tree/master/misc/os_test
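The defines above mirror what the engine reports at runtime. As a sketch, the same values can be inspected from GDScript (here ``foo`` stands for a hypothetical custom export feature, as in the rule above):

```gdscript
func _ready():
    # "Windows" here corresponds to the GODOT_WINDOWS define in C#.
    print(OS.get_name())
    # Custom export features follow the same rule: a feature "foo"
    # becomes GODOT_FOO in C#. has_feature() checks the tag at runtime.
    print(OS.has_feature("foo"))
```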

+ 1 - 1
getting_started/scripting/gdscript/gdscript_basics.rst

@@ -802,7 +802,7 @@ for
 
 To iterate through a range, such as an array or table, a *for* loop is
 used. When iterating over an array, the current array element is stored in
-the loop variable. When iterating over a dictionary, the *index* is stored
+the loop variable. When iterating over a dictionary, the *key* is stored
 in the loop variable.
 
 ::
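A minimal sketch of both cases (array element vs. dictionary key):

```gdscript
var arr = [10, 20, 30]
for element in arr:
    print(element)  # the loop variable holds the element itself

var dict = {"name": "Godot", "version": "3.2"}
for key in dict:
    print(key, ": ", dict[key])  # the loop variable holds the key
```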

+ 10 - 4
getting_started/step_by_step/your_first_game.rst

@@ -779,10 +779,16 @@ You can assign this property's value in two ways:
 - Click the down arrow next to "[empty]" and choose "Load". Select
   ``Mob.tscn``.
 
-Next, click on the Player and connect the ``hit`` signal. We want to make a
-new function named ``game_over``, which will handle what needs to happen when a
-game ends. Type "game_over" in the "Receiver Method" box at the bottom of the
-"Connect a Signal" window and click "Connect". Add the following code to the
+Next, select the ``Player`` node in the Scene dock, and access the Node dock on
+the sidebar. Make sure to have the Signals tab selected in the Node dock.
+
+You should see a list of the signals for the ``Player`` node. Find and
+double-click the ``hit`` signal in the list (or right-click it and select
+"Connect..."). This will open the signal connection dialog. We want to make
+a new function named ``game_over``, which will handle what needs to happen when
+a game ends.
+Type "game_over" in the "Receiver Method" box at the bottom of the
+signal connection dialog and click "Connect". Add the following code to the
 new function, as well as a ``new_game`` function that will set everything up
 for a new game:
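As an aside, the connection made through the editor dialog above is equivalent to connecting from a script. A sketch, assuming it runs in the Main scene's script with ``Player`` as a direct child:

```gdscript
func _ready():
    # Same effect as using the signal connection dialog in the editor.
    $Player.connect("hit", self, "game_over")
```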
 

+ 10 - 8
getting_started/workflow/best_practices/data_preferences.rst

@@ -103,11 +103,12 @@ Contiguous memory stores imply the following operation performance:
       ordered-aware search algorithm.
 
 Godot implements Dictionary as an ``OrderedHashMap<Variant, Variant>``. The engine
-stores a giant array (initialized to 1000 records) of key-value pairs. When
+stores a small array (initialized to 2^3 or 8 records) of key-value pairs. When
 one attempts to access a value, they provide it a key. It then *hashes* the
-key, i.e. converts it into a number. The "hash" becomes the index into the
-array, giving the OHM a quick lookup for the value within the conceptual
-"table" of keys mapped to values.
+key, i.e. converts it into a number. The "hash" is used to calculate the index
+into the array, giving the OHM a quick lookup for the value within the "table"
+of keys mapped to values. When the HashMap becomes too full, it grows to the
+next power of 2 (so 16 records, then 32, etc.) and rebuilds the structure.
 
 Hashes are to reduce the chance of a key collision. If one occurs, the table
 must recalculate another index for the value that takes the previous position
@@ -121,11 +122,12 @@ the expense of memory and some minor operational efficiency.
       too dependent on the density of the table, things will stay fast.
       Which leads to...
 
-2. Maintaining a huge size for the table.
+2. Maintaining an ever-growing size for the table.
 
-    - The reason it starts with 1000 records, and the reason it forces
-      large gaps of unused memory interspersed in the table is to
-      minimize hash collisions and maintain the speed of the accesses.
+    - HashMaps maintain gaps of unused memory interspersed in the table
+      on purpose to reduce hash collisions and maintain the speed of
+      accesses. This is why the table repeatedly doubles in size, growing
+      by powers of 2.
 
 As one might be able to tell, Dictionaries specialize in tasks that Arrays
 do not. An overview of their operational details is as follows:
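The "ordered" part of the OrderedHashMap is observable from script: iteration follows insertion order, not hash order. A quick sketch:

```gdscript
var d = {}
d["banana"] = 2
d["apple"] = 1
for key in d:
    print(key)  # prints "banana" then "apple": insertion order is preserved
```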

+ 3 - 2
tutorials/audio/audio_streams.rst

@@ -29,8 +29,9 @@ your use case best:
   This format works well for music, long sound effect sequences, and voice
   at relatively low bitrates.
 
-Keep in mind Ogg Vorbis files don't contain looping information. If looping an
-Ogg Vorbis file is desired, it must be set up using the import options:
+Keep in mind that while WAV files may contain looping information in their
+metadata, Ogg Vorbis files do not. If looping an Ogg Vorbis file is desired,
+it must be set up using the import options:
 
 .. image:: img/audio_stream_import.png
 

+ 11 - 7
tutorials/plugins/android/android_plugin.rst

@@ -32,7 +32,9 @@ and capabilities that don't belong to the core feature set of a game engine:
 Android plugin
 --------------
 
-While introduced in Godot 3.2.0, the Android plugin system got a significant architecture update starting with Godot 3.2.2. The new plugin system is backward-incompatible with the previous one, but both systems are kept functional in future releases of the 3.2.x branch. Since we previously did not version the Android plugin systems, the new one is now labelled ``v1`` and is the starting point for the modern Godot Android ecosystem.
+While introduced in Godot 3.2, the Android plugin system got a significant architecture update starting with Godot 3.2.2.
+The new plugin system is backward-incompatible with the previous one, but both systems are kept functional in future releases of the 3.2.x branch.
+Since we previously did not version the Android plugin systems, the new one is now labelled ``v1`` and is the starting point for the modern Godot Android ecosystem.
 
 **Note:** In Godot 4.0, the previous system will be fully deprecated and removed.
 
@@ -45,8 +47,8 @@ with the following caveats:
 
 -  The library must include a specifically configured ``<meta-data>`` tag in its manifest file.
 
-Building a Android plugin
-^^^^^^^^^^^^^^^^^^^^^^^^^
+Building an Android plugin
+^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 **Prerequisite:** `Android Studio <https://developer.android.com/studio>`_ is strongly recommended as the IDE to use to create Android plugins.
 The instructions below assumes that you're using Android Studio.
@@ -127,8 +129,8 @@ The instructions below assumes that you're using Android Studio.
 
             -   **custom_maven_repos**: contains a list of URLs specifying the custom maven repositories required for the plugin's dependencies
 
-Loading and using a Android plugin
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Loading and using an Android plugin
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Move the plugin configuration file (e.g.: ``MyPlugin.gdap``) and, if any, its local binary (e.g.: ``MyPlugin.aar``) and dependencies to the Godot project's ``res://android/plugins`` directory.
 
@@ -147,7 +149,8 @@ From your script:
 
 Bundling GDNative resources
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
-A Android plugin can define and provide C/C++ GDNative resources, either to provide and/or access functionality from the game logic.
+
+An Android plugin can define and provide C/C++ GDNative resources, either to provide and/or access functionality from the game logic.
 The GDNative resources can be bundled within the plugin ``aar`` file which simplifies the distribution and deployment process:
 
     -   The shared libraries (``.so``) for the defined GDNative libraries will be automatically bundled by the ``aar`` build system.
@@ -161,11 +164,12 @@ At runtime, the plugin will provide these paths to Godot core which will use the
 
 Reference implementations
 ^^^^^^^^^^^^^^^^^^^^^^^^^
+
 -   `Godot Oculus Mobile plugin <https://github.com/GodotVR/godot_oculus_mobile>`_
 
     -   `Bundled gdnative resources <https://github.com/GodotVR/godot_oculus_mobile/tree/master/plugin/src/main/assets/addons/godot_ovrmobile>`_
 
--   `Godot Payment plugin <https://github.com/godotengine/godot/tree/master/platform/android/java/plugins/godotpayment>`_
+-   `Godot Google Play Billing plugin <https://github.com/godotengine/godot-google-play-billing>`_
 
 
 Troubleshooting

+ 15 - 13
tutorials/shading/shading_reference/shading_language.rst

@@ -213,33 +213,33 @@ They can be initialized at the beginning like:
 
 .. code-block:: glsl
 
-      float float_arr[3] = float[3] (1.0, 0.5, 0.0); // first constructor
+    float float_arr[3] = float[3] (1.0, 0.5, 0.0); // first constructor
 
-      int int_arr[3] = int[] (2, 1, 0); // second constructor
+    int int_arr[3] = int[] (2, 1, 0); // second constructor
 
-      vec2 vec2_arr[3] = { vec2(1.0, 1.0), vec2(0.5, 0.5), vec2(0.0, 0.0) }; // third constructor
+    vec2 vec2_arr[3] = { vec2(1.0, 1.0), vec2(0.5, 0.5), vec2(0.0, 0.0) }; // third constructor
 
-      bool bool_arr[] = { true, true, false }; // fourth constructor - size is defined automatically from the element count
+    bool bool_arr[] = { true, true, false }; // fourth constructor - size is defined automatically from the element count
 
 You can declare multiple arrays (even with different sizes) in one expression:
 
 .. code-block:: glsl
 
-      float a[3] = float[3] (1.0, 0.5, 0.0),
-       b[2] = { 1.0, 0.5 },
-       c[] = { 0.7 },
-       d = 0.0,
-       e[5];
+    float a[3] = float[3] (1.0, 0.5, 0.0),
+        b[2] = { 1.0, 0.5 },
+        c[] = { 0.7 },
+        d = 0.0,
+        e[5];
 
 To access an array element, use the indexing syntax:
 
 .. code-block:: glsl
 
-      float arr[3];
+    float arr[3];
 
-      arr[0] = 1.0; // setter
+    arr[0] = 1.0; // setter
 
-      COLOR.r = arr[0]; // getter
+    COLOR.r = arr[0]; // getter
 
 Arrays also have a built-in function ``.length()`` (not to be confused with the built-in ``length()`` function). It doesn't accept any parameters and will return the array's size.
 
@@ -250,7 +250,9 @@ Arrays also have a built-in function ``.length()`` (not to be confused with the
         // ...
     }
 
-Note: If you use an index below 0 or greater than array size - the shader will crash and break rendering. To prevent this, use ``length()``, ``if``, or ``clamp()`` functions to ensure the index is between 0 and the array's length. Always carefully test and check your code. If you pass a constant expression or a simple number, the editor will check its bounds to prevent this crash.
+.. note::
+
+    If you use an index below 0 or greater than or equal to the array size, the shader will crash and break rendering. To prevent this, use ``length()``, ``if``, or ``clamp()`` functions to ensure the index is between 0 and the array's length. Always carefully test and check your code. If you pass a constant expression or a simple number, the editor will check its bounds to prevent this crash.
 
 Constants
 ---------

BIN
tutorials/vr/img/minimum_setup.png


+ 124 - 30
tutorials/vr/vr_primer.rst

@@ -5,15 +5,28 @@ AR/VR primer
 
 This tutorial gives you a springboard into the world of AR and VR in the Godot game engine.
 
-A new architecture was introduced in Godot 3 called the AR/VR Server. On top of this architecture, specific implementations are available as interfaces, most of which are plugins based on GDNative.
-This tutorial focuses purely on the core elements abstracted by the core architecture. This architecture has enough features for you to create an entire VR experience that can then be deployed for various interfaces. However, each platform often has some unique features that are impossible to abstract. Such features will be documented on the relevant interfaces and fall outside of the scope of this primer.
+A new architecture was introduced in Godot 3 called the AR/VR Server. On top of this
+architecture, specific implementations are available as interfaces, most of which are plugins
+based on GDNative. This tutorial focuses purely on the core elements abstracted by the core
+architecture. This architecture has enough features for you to create an entire VR experience
+that can then be deployed for various interfaces. However, each platform often has some unique
+features that are impossible to abstract. Such features will be documented on the relevant
+interfaces and fall outside of the scope of this primer.
 
 AR/VR server
 ------------
 
-When Godot starts, each available interface will make itself known to the AR/VR server. GDNative interfaces are setup as singletons; as long as they are added to the list of GDNative singletons in your project, they will make themselves known to the server.
+When Godot starts, each available interface will make itself known to the AR/VR server.
+GDNative interfaces are set up as singletons; as long as they are added to the list of
+GDNative singletons in your project, they will make themselves known to the server.
 
-You can use the function :ref:`get_interfaces() <class_ARVRServer_method_get_interfaces>` to return a list of available interfaces, but for this tutorial, we're going to use the :ref:`native mobile VR interface <class_MobileVRInterface>` in our examples. This interface is a straightforward implementation that uses the 3DOF sensors on your phone for orientation and outputs a stereoscopic image to the screen. It is also available in the Godot core and outputs to screen on desktop, which makes it ideal for prototyping or a tutorial such as this one.
+You can use the function :ref:`get_interfaces() <class_ARVRServer_method_get_interfaces>`
+to return a list of available interfaces, but for this tutorial, we're going to use the
+:ref:`native mobile VR interface <class_MobileVRInterface>` in our examples. This interface
+is a straightforward implementation that uses the 3DOF sensors on your phone for orientation
+and outputs a stereoscopic image to the screen. It is also available in the Godot core and
+outputs to screen on desktop, which makes it ideal for prototyping or a tutorial such as
+this one.
 
 To enable an interface, you execute the following code:
 
@@ -32,52 +45,133 @@ To enable an interface, you execute the following code:
         GetViewport().Arvr = true;
     }
 
-This code finds the interface we wish to use, initializes it and, if that is successful, binds the main viewport to the interface. This last step gives some control over the viewport to the interface, which automatically enables things like stereoscopic rendering on the viewport.
+This code finds the interface we wish to use, initializes it and, if that is successful, binds
+the main viewport to the interface. This last step gives some control over the viewport to the
+interface, which automatically enables things like stereoscopic rendering on the viewport.
 
-For our mobile VR interface, and any interface where the main input is directly displayed on screen, the main viewport needs to be the viewport where :ref:`arvr<class_Viewport_property_arvr>` is set to ``true``. But for interfaces that render on an externally attached device, you can use a secondary viewport. In the latter case, a viewport that shows its output on screen will show an undistorted version of the left eye, while showing the fully processed stereoscopic output on the device.
+For our mobile VR interface, and any interface where the main input is directly displayed on
+screen, the main viewport needs to be the viewport where :ref:`arvr<class_Viewport_property_arvr>`
+is set to ``true``. But for interfaces that render on an externally attached device, you can use
+a secondary viewport. In the latter case, a viewport that shows its output on screen will show an
+undistorted version of the left eye, while showing the fully processed stereoscopic output on the
+device.
 
-Finally, you should only initialize an interface once; switching scenes and reinitializing interfaces will just introduce a lot of overhead. If you want to turn the headset off temporarily, just disable the viewport or set :ref:`arvr<class_Viewport_property_arvr>` to ``false`` on the viewport. In most scenarios though, you wouldn't disable the headset once you're in VR, this can be disconcerting to the gamer.
+Finally, you should only initialize an interface once; switching scenes and reinitializing interfaces
+will just introduce a lot of overhead. If you want to turn the headset off temporarily, just disable
+the viewport or set :ref:`arvr<class_Viewport_property_arvr>` to ``false`` on the viewport. In most
+scenarios though, you wouldn't disable the headset once you're in VR, as this can be
+disconcerting to the player.
 
 New AR/VR nodes
 ---------------
 
-Three new node types have been added for supporting AR and VR in Godot and one additional node type especially for AR. These are:
+Three new node types have been added for supporting AR and VR in Godot and one additional
+node type especially for AR. These are:
 
 * :ref:`ARVROrigin <class_ARVROrigin>` - our origin point in the world
 * :ref:`ARVRCamera <class_ARVRCamera>` - a special subclass of the camera, which is positionally tracked
 * :ref:`ARVRController <class_ARVRController>` - a new spatial class, which tracks the location of a controller
 * :ref:`ARVRAnchor <class_ARVRAnchor>` - an anchor point for an AR implementation mapping a real world location into your virtual world
 
-The first two must exist in your scene for AR/VR to work and this tutorial focuses purely on them.
-
-:ref:`ARVROrigin <class_ARVROrigin>` is an important node, you must have one and only one of these somewhere in your scene. This node maps the center of your real world tracking space to a location in your virtual world. Everything else is positionally tracked in relation to this point. Where this point lies exactly differs from one implementation to another, but the best example to understand how this node works is to take a look at a room scale location. While we have functions to adjust the point to center it on the player by default, the origin point will be the center location of the room you are in. As you physically walk around the room, the location of the HMD is tracked in relation to this center position and the tracking is mirror in the virtual world.
-
-To keep things simple, when you physically move around your room, the ARVR Origin point stays where it is, the position of the camera and controllers will be adjusted according to your movements.
-When you move through the virtual world, either through controller input or when you implement a teleport system, it is the position of the origin point which you will have to adjust.
-
-:ref:`ARVRCamera <class_ARVRCamera>` is the second node that must always be a part of your scene and it must always be a child node of your origin node. It is a subclass of Godot's normal camera. However, its position is automatically updated each frame based on the physical orientation and position of the HMD. Also due to the precision required for rendering to an HMD or rendering an AR overlay over a real world camera, most of the standard camera properties are ignored. The only properties of the camera that are used are the near and far plane settings. The FOV, aspect ratio and projection mode are all ignored.
-
-Note that, for our native mobile VR implementation, there is no positional tracking, only the orientation of the phone and by extension, the HMD is tracked. This implementation artificially places the camera at a height (Y) of 1.85.
+The first two must exist in your scene for AR/VR to work and this tutorial focuses purely
+on them.
+
+:ref:`ARVROrigin <class_ARVROrigin>` is an important node; you must have one and only one
+of these somewhere in your scene. This node maps the center of your real world tracking
+space to a location in your virtual world. Everything else is positionally tracked in
+relation to this point. Where this point lies exactly differs from one implementation to
+another, but the best example to understand how this node works is to take a look at a room
+scale location. While we have functions to adjust the point to center it on the player by
+default, the origin point will be the center location of the room you are in. As you
+physically walk around the room, the location of the HMD is tracked in relation to this
+center position and the tracking is mirrored in the virtual world.
+
+To keep things simple, when you physically move around your room, the ARVR Origin point stays
+where it is, the position of the camera and controllers will be adjusted according to your
+movements. When you move through the virtual world, either through controller input or when
+you implement a teleport system, it is the position of the origin point which you will
+have to adjust.
+
+:ref:`ARVRCamera <class_ARVRCamera>` is the second node that must always be a part of your
+scene and it must always be a child node of your origin node. It is a subclass of Godot's
+normal camera. However, its position is automatically updated each frame based on the physical
+orientation and position of the HMD. Also due to the precision required for rendering to an
+HMD or rendering an AR overlay over a real world camera, most of the standard camera properties
+are ignored. The only properties of the camera that are used are the near and far plane
+settings. The FOV, aspect ratio and projection mode are all ignored.
+
+Note that, for our native mobile VR implementation, there is no positional tracking; only
+the orientation of the phone and, by extension, the HMD is tracked. This implementation
+artificially places the camera at a height (Y) of 1.85.
 
 Conclusion: your minimum setup in your scene to make AR or VR work should look like this:
 
 .. image:: img/minimum_setup.png
 
-And that's all you need to get started. Obviously, you need to add something more into your scene, so there is something to see, but after that, you can export the game to your phone of choice, pop it into a viewer and away you go.
+And that's all you need to get started with the native mobile interface. Obviously, you need
+to add something more into your scene, so there is something to see, but after that, you can
+export the game to your phone of choice, pop it into a viewer and away you go.
 
-Other things to consider
-------------------------
+Official plugins and resources
+------------------------------
+
+As mentioned earlier, Godot does not support the various VR and AR SDKs out of the box; you
+need a plugin for the specific SDK you want to use. There are several official plugins available
+in the `GodotVR Repository <https://github.com/GodotVR>`_.
+
+* `Godot Oculus Mobile <https://github.com/GodotVR/godot_oculus_mobile>`_ provides support for
+  the Oculus Go and Oculus Quest. The Quest will require additional setup documented
+  :ref:`here <doc_developing_for_oculus_quest>`.
+* `Godot OpenVR <https://github.com/GodotVR/godot_openvr>`_ (not to be confused with OpenXR)
+  supports the OpenVR SDK used by Steam.
+* `Godot Oculus <https://github.com/GodotVR/godot_oculus>`_ supports the Oculus SDK
+  (desktop headsets only).
+* `Godot OpenHMD <https://github.com/GodotVR/godot_openhmd>`_ supports OpenHMD, an open source
+  API and drivers for headsets.
 
-There are a few other subjects that we need to briefly touch upon in this primer that are important to know.
+These plugins can be downloaded from GitHub or the Godot Asset Library.
 
-The first are our units. In normal 3D games, you don't have to think a lot about units. As long as everything is at the same scale, a box sized 1 unit by 1 unit by 1 unit can be any size from a cube you can hold in your hand to something the size of a building.
-In AR and VR, this changes because things in your virtual world are mapped to things in the real world. If you step 1 meter forward in the real world, but you only move 1 cm forward in your virtual world, you have a problem. The same with the position of your controllers; if they don't appear in the right relative space, it breaks the immersion for the player.
-Most VR platforms, including our AR/VR Server, assume that 1 unit = 1 meter. The AR/VR server, however, has a property that, for convenience, is also exposed on the ARVROrigin node called world scale. For instance, setting this to a value of 10 changes our coordinate system so 10 units = 1 meter.
+In addition to the plugins, there are several official demos.
 
-Performance is another thing that needs to be carefully considered. Especially VR taxes your game a lot more than most people realize. For mobile VR, you have to be extra careful here, but even for desktop games, there are three factors that make life extra difficult:
+* `Godot Oculus Demo <https://github.com/GodotVR/godot-oculus-demo>`_.
+* `Godot OpenVR FPS <https://github.com/GodotVR/godot_openvr_fps>`_ (the tutorial for this project
+  is :ref:`here <doc_vr_starter_tutorial_part_one>`)
+* `Godot XR tools <https://github.com/GodotVR/godot-xr-tools>`_, which shows implementations for VR
+  features such as movement and picking up objects.
 
-* You are rendering stereoscopic, two for the price of one. While not exactly doubling the work load and with things in the pipeline such as supporting the new MultiView OpenGL extension in mind, there still is an extra workload in rendering images for both eyes
-* A normal game will run acceptably on 30fps and ideally manages 60fps. That gives you a big range to play with between lower end and higher end hardware. For any HMD application of AR or VR, however, 60fps is the absolute minimum and you should target your games to run at a stable 90fps to ensure your users don't get motion sickness right off the bat.
-* The high FOV and related lens distortion effect require many VR experiences to render at double the resolution. Yes a VIVE may only have a resolution of 1080x1200 per eye, we're rendering each eye at 2160x2400 as a result. This is less of an issue for most AR applications.
+Other things to consider
+------------------------
 
-All in all, the workload your GPU has in comparison with a normal 3D game is a fair amount higher. While things are in the pipeline to improve this, such as MultiView and foveated rendering, these aren't supported on all devices. This is why you see many VR games using a more art style and if you pay close attention to those VR games that go for realism, you'll probably notice they're a bit more conservative on the effects or use some good old optical trickery.
+There are a few other subjects that we need to briefly touch upon in this primer that are important
+to know.
+
+The first are our units. In normal 3D games, you don't have to think a lot about units. As long as
+everything is at the same scale, a box sized 1 unit by 1 unit by 1 unit can be any size from a cube
+you can hold in your hand to something the size of a building. In AR and VR, this changes because
+things in your virtual world are mapped to things in the real world. If you step 1 meter forward in
+the real world, but you only move 1 cm forward in your virtual world, you have a problem. The same
+with the position of your controllers; if they don't appear in the right relative space, it breaks
+the immersion for the player. Most VR platforms, including our AR/VR Server, assume that 1 unit = 1
+meter. The AR/VR server, however, has a property that, for convenience, is also exposed on the
+ARVROrigin node called world scale. For instance, setting this to a value of 10 changes our coordinate
+system so 10 units = 1 meter.
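Setting that scale from a script is a one-liner; a minimal sketch using the server property (the same value is exposed on the ARVROrigin node):

```gdscript
func _ready():
    # 10 units in the scene now correspond to 1 meter in the real world.
    ARVRServer.world_scale = 10
```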
+
+Performance is another thing that needs to be carefully considered. Especially VR taxes your game
+a lot more than most people realize. For mobile VR, you have to be extra careful here, but even for
+desktop games, there are three factors that make life extra difficult:
+
+* You are rendering stereoscopically, two images for the price of one. While not exactly doubling
+  the workload, and with things in the pipeline such as supporting the new MultiView OpenGL
+  extension in mind, there still is an extra workload in rendering images for both eyes.
+* A normal game will run acceptably on 30fps and ideally manages 60fps. That gives you a big range to
+  play with between lower end and higher end hardware. For any HMD application of AR or VR, however,
+  60fps is the absolute minimum and you should target your games to run at a stable 90fps to ensure your
+  users don't get motion sickness right off the bat.
+* The high FOV and related lens distortion effect require many VR experiences to render at double
+  the resolution. Yes, a VIVE may only have a resolution of 1080x1200 per eye, but we're rendering
+  each eye at 2160x2400 as a result. This is less of an issue for most AR applications.
+
+All in all, the workload your GPU has in comparison with a normal 3D game is a fair amount
+higher. While things are in the pipeline to improve this, such as MultiView and foveated rendering,
+these aren't supported on all devices. This is why you see many VR games using a more simplistic
+art style, and if you pay close attention to those VR games that go for realism, you'll probably
+notice they're a bit more conservative on the effects or use some good old optical trickery.