
Various grammar and spelling fixes

A Thousand Ships, 1 month ago
Commit 92cd36b50e
49 files changed, with 84 additions and 84 deletions
  1. contributing/development/best_practices_for_engine_contributors.rst (+1 -1)
  2. contributing/development/compiling/compiling_for_macos.rst (+5 -5)
  3. contributing/development/compiling/index.rst (+1 -1)
  4. contributing/development/core_and_modules/2d_coordinate_systems.rst (+2 -2)
  5. contributing/development/core_and_modules/binding_to_external_libraries.rst (+1 -1)
  6. contributing/development/handling_compatibility_breakages.rst (+1 -1)
  7. contributing/documentation/docs_image_guidelines.rst (+1 -1)
  8. contributing/documentation/updating_the_class_reference.rst (+1 -1)
  9. getting_started/step_by_step/scripting_player_input.rst (+1 -1)
  10. tutorials/3d/lights_and_shadows.rst (+1 -1)
  11. tutorials/animation/animation_tree.rst (+1 -1)
  12. tutorials/animation/creating_movies.rst (+1 -1)
  13. tutorials/animation/playing_videos.rst (+3 -3)
  14. tutorials/best_practices/data_preferences.rst (+1 -1)
  15. tutorials/best_practices/scenes_versus_scripts.rst (+1 -1)
  16. tutorials/editor/command_line_tutorial.rst (+1 -1)
  17. tutorials/editor/customizing_editor.rst (+1 -1)
  18. tutorials/editor/index.rst (+1 -1)
  19. tutorials/i18n/localization_using_gettext.rst (+1 -1)
  20. tutorials/io/data_paths.rst (+1 -1)
  21. tutorials/io/runtime_file_loading_and_saving.rst (+3 -3)
  22. tutorials/performance/gpu_optimization.rst (+1 -1)
  23. tutorials/performance/pipeline_compilations.rst (+1 -1)
  24. tutorials/physics/interpolation/2d_and_3d_physics_interpolation.rst (+1 -1)
  25. tutorials/physics/interpolation/physics_interpolation_introduction.rst (+2 -2)
  26. tutorials/physics/using_jolt_physics.rst (+1 -1)
  27. tutorials/platform/android/android_plugin.rst (+1 -1)
  28. tutorials/plugins/editor/import_plugins.rst (+1 -1)
  29. tutorials/rendering/jitter_stutter.rst (+1 -1)
  30. tutorials/scripting/c_sharp/c_sharp_differences.rst (+1 -1)
  31. tutorials/scripting/c_sharp/c_sharp_style_guide.rst (+1 -1)
  32. tutorials/scripting/cpp/gdextension_cpp_example.rst (+1 -1)
  33. tutorials/scripting/creating_script_templates.rst (+3 -3)
  34. tutorials/scripting/debug/debugger_panel.rst (+1 -1)
  35. tutorials/scripting/gdextension/gdextension_c_example.rst (+1 -1)
  36. tutorials/scripting/gdscript/gdscript_basics.rst (+1 -1)
  37. tutorials/scripting/gdscript/gdscript_exports.rst (+1 -1)
  38. tutorials/scripting/resources.rst (+1 -1)
  39. tutorials/shaders/shader_reference/shader_preprocessor.rst (+3 -3)
  40. tutorials/shaders/visual_shaders.rst (+2 -2)
  41. tutorials/ui/bbcode_in_richtextlabel.rst (+2 -2)
  42. tutorials/xr/a_better_xr_start_script.rst (+11 -11)
  43. tutorials/xr/ar_passthrough.rst (+2 -2)
  44. tutorials/xr/basic_xr_locomotion.rst (+2 -2)
  45. tutorials/xr/openxr_composition_layers.rst (+5 -5)
  46. tutorials/xr/openxr_hand_tracking.rst (+1 -1)
  47. tutorials/xr/openxr_settings.rst (+1 -1)
  48. tutorials/xr/xr_action_map.rst (+2 -2)
  49. tutorials/xr/xr_room_scale.rst (+3 -3)

+ 1 - 1
contributing/development/best_practices_for_engine_contributors.rst

@@ -60,7 +60,7 @@ order to achieve greater productivity. In this case, *a solution is needed*.
 
 Believing that problems may arise in the future and that the software needs to
 be ready to solve them by the time they appear is called *"Future proofing"* and
-its characterized by lines of thought such as:
+it's characterized by lines of thought such as:
 
 - I think it would be useful for users to...
 - I think users will eventually need to...

+ 5 - 5
contributing/development/compiling/compiling_for_macos.rst

@@ -72,7 +72,7 @@ If all goes well, the resulting binary executable will be placed in the
 runs without any dependencies. Executing it will bring up the Project
 Manager.
 
-.. note:: Using a standalone editor executable is not recommended, it should be always packaged into an
+.. note:: Using a standalone editor executable is not recommended, it should be always packaged into a
           ``.app`` bundle to avoid UI activation issues.
 
 .. note:: If you want to use separate editor settings for your own Godot builds
@@ -83,7 +83,7 @@ Manager.
 Automatic ``.app`` bundle creation
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-To automatically create an ``.app`` bundle like in the official builds, use the ``generate_bundle=yes`` option on the *last*
+To automatically create a ``.app`` bundle like in the official builds, use the ``generate_bundle=yes`` option on the *last*
 SCons command used to build editor:
 
 ::
@@ -101,7 +101,7 @@ run the above two commands and then use ``lipo`` to bundle them together:
 
     lipo -create bin/godot.macos.editor.x86_64 bin/godot.macos.editor.arm64 -output bin/godot.macos.editor.universal
 
-To create an ``.app`` bundle, you need to use the template located in ``misc/dist/macos_tools.app``. Typically, for an optimized
+To create a ``.app`` bundle, you need to use the template located in ``misc/dist/macos_tools.app``. Typically, for an optimized
 editor binary built with ``dev_build=yes``::
 
     cp -r misc/dist/macos_tools.app ./bin/Godot.app
@@ -169,11 +169,11 @@ x86_64 architectures.
     scons platform=macos target=template_debug arch=arm64
     scons platform=macos target=template_release arch=arm64 generate_bundle=yes
 
-To create an ``.app`` bundle like in the official builds, you need to use the
+To create a ``.app`` bundle like in the official builds, you need to use the
 template located in ``misc/dist/macos_template.app``. This process can be automated by using
 the ``generate_bundle=yes`` option on the *last* SCons command used to build export templates
 (so that all binaries can be included). This option also takes care of calling ``lipo`` to create
-an *Universal 2* binary from two separate ARM64 and x86_64 binaries (if both were compiled beforehand).
+a *Universal 2* binary from two separate ARM64 and x86_64 binaries (if both were compiled beforehand).
 
 .. note::
 

+ 1 - 1
contributing/development/compiling/index.rst

@@ -7,7 +7,7 @@ Building from source
 
 .. highlight:: shell
 
-Godot prides itself on being very easy to build, by C++ projects' standards.
+Godot prides itself on being very easy to build, by C++ project standards.
 :ref:`Godot uses the SCons build system <doc_faq_why_scons>`, and after the initial
 setup compiling the engine for your current platform should be as easy as running:
 

+ 2 - 2
contributing/development/core_and_modules/2d_coordinate_systems.rst

@@ -10,7 +10,7 @@ This is a detailed overview of the available 2D coordinate systems and 2D transf
 built in. The basic concepts are covered in :ref:`doc_viewport_and_canvas_transforms`.
 
 :ref:`Transform2D <class_Transform2D>` are matrices that convert coordinates from one coordinate
-system to an other. In order to use them, it is beneficial to know which coordinate systems are
+system to another. In order to use them, it is beneficial to know which coordinate systems are
 available in Godot. For a deeper understanding, the :ref:`doc_matrices_and_transforms` tutorial
 offers insights to the underlying functionality.
 
@@ -122,7 +122,7 @@ effects of each of them.
     viewport. This transform is used for :ref:`Windows <class_Window>` as described in
     :ref:`doc_multiple_resolutions`, but can also be manually set on *SubViewports* by means of
     :ref:`size <class_SubViewport_property_size>` and
-    :ref:`size_2d_override <class_SubViewport_property_size_2d_override>`. It's
+    :ref:`size_2d_override <class_SubViewport_property_size_2d_override>`. Its
     :ref:`translation <class_Transform2D_method_get_origin>`,
     :ref:`rotation <class_Transform2D_method_get_rotation>` and
     :ref:`skew <class_Transform2D_method_get_skew>` are the default values and it can only have

+ 1 - 1
contributing/development/core_and_modules/binding_to_external_libraries.rst

@@ -167,7 +167,7 @@ environment's paths:
 
     # This is an absolute path where your .a libraries reside.
     # If using a relative path, you must convert it to a
-    # full path using an utility function, such as `Dir('...').abspath`.
+    # full path using a utility function, such as `Dir('...').abspath`.
     env.Append(LIBPATH=[Dir('libpath').abspath])
 
     # Check with the documentation of the external library to see which library

+ 1 - 1
contributing/development/handling_compatibility_breakages.rst

@@ -44,7 +44,7 @@ the code, usually placed next to ``_bind_methods()``:
         static void _bind_compatibility_methods();
     #endif
 
-They should start with a ``_`` to indicate that they are internal, and end with ``_bind_compat_`` followed by the PR number
+They should start with an ``_`` to indicate that they are internal, and end with ``_bind_compat_`` followed by the PR number
 that introduced the change (``88047`` in this example). These compatibility methods need to be implemented in a dedicated file,
 like ``core/math/a_star_grid_2d.compat.inc`` in this case:
 

+ 1 - 1
contributing/documentation/docs_image_guidelines.rst

@@ -25,7 +25,7 @@ On macOS, pressing :kbd:`Shift + Command + 3` does the same.
 To take a picture of the entire screen press :kbd:`Shift + Command + 4`.
 All screenshots taken will be saved to the desktop.
 
-Each Linux desktop environment has it's own screenshot tool. For example,
+Each Linux desktop environment has its own screenshot tool. For example,
 on KDE Plasma the program Spectacle is used for taking screenshots. If your
 distribution doesn't come with one by default try searching its package
 repository, or Flathub if that's supported.

+ 1 - 1
contributing/documentation/updating_the_class_reference.rst

@@ -72,7 +72,7 @@ If you decide to document a class, but don't know what a particular method does,
 worry. Leave it for now, and list the methods you skipped when you open a pull request
 with your changes. Another writer will take care of it.
 
-You can still look at the methods' implementation in Godot's source code on GitHub.
+You can still look at the method's implementation in Godot's source code on GitHub.
 If you have doubts, feel free to ask on the `Godot Forum <https://forum.godotengine.org/>`_
 and `Godot Contributors Chat <https://chat.godotengine.org/>`_.
 

+ 1 - 1
getting_started/step_by_step/scripting_player_input.rst

@@ -212,7 +212,7 @@ with the engine. These include ``_process()``, to apply changes to the node
 every frame, and ``_unhandled_input()``, to receive input events like key and
 button presses from the users. There are quite a few more.
 
-The ``Input`` singleton allows you to react to the players' input anywhere in
+The ``Input`` singleton allows you to react to the player's input anywhere in
 your code. In particular, you'll get to use it in the ``_process()`` loop.
 
 In the next lesson, :ref:`doc_signals`, we'll build upon the relationship between

+ 1 - 1
tutorials/3d/lights_and_shadows.rst

@@ -189,7 +189,7 @@ Every face whose front-side is hit by the light rays is lit, while the others
 stay dark. Unlike most other light types, directional lights don't have specific
 parameters.
 
-The directional light also offers a **Angular Distance** property, which
+The directional light also offers an **Angular Distance** property, which
 determines the light's angular size in degrees. Increasing this above ``0.0``
 will make shadows softer at greater distances from the caster, while also
 affecting the sun's appearance in procedural sky materials. This is called a

+ 1 - 1
tutorials/animation/animation_tree.rst

@@ -24,7 +24,7 @@ and then use an ``AnimationTree`` to control the playback.
 
 ``AnimationPlayer`` and ``AnimationTree`` can be used in both 2D and 3D scenes. When importing 3D scenes and their animations, you can use
 `name suffixes <https://docs.godotengine.org/en/stable/tutorials/assets_pipeline/importing_3d_scenes/node_type_customization.html#animation-loop-loop-cycle>`_
-to simplify the process and import with the correct properties. At the end, the imported Godot scene will contain the animations in a ``AnimationPlayer`` node.
+to simplify the process and import with the correct properties. At the end, the imported Godot scene will contain the animations in an ``AnimationPlayer`` node.
 Since you rarely use imported scenes directly in Godot (they are either instantiated or inherited from), you can place the ``AnimationTree`` node in your
 new scene which contains the imported one. Afterwards, point the ``AnimationTree`` node to the ``AnimationPlayer`` that was created in the imported scene.
 

+ 1 - 1
tutorials/animation/creating_movies.rst

@@ -169,7 +169,7 @@ to another format for viewing on the web or by Godot with the VideoStreamPlayer
 node. MJPEG does not support transparency. AVI output is currently limited to a
 file of 4 GB in size at most.
 
-To use AVI, specify a path to an ``.avi`` file to be created in the
+To use AVI, specify a path to a ``.avi`` file to be created in the
 **Editor > Movie Writer > Movie File** project setting.
 
 PNG

+ 3 - 3
tutorials/animation/playing_videos.rst

@@ -22,7 +22,7 @@ as it was too buggy and difficult to maintain.
 
 .. note::
 
-    You may find videos with an ``.ogg`` or ``.ogx`` extensions, which are generic
+    You may find videos with a ``.ogg`` or ``.ogx`` extensions, which are generic
     extensions for data within an Ogg container.
 
     Renaming these file extensions to ``.ogv`` *may* allow the videos to be
@@ -34,7 +34,7 @@ Setting up VideoStreamPlayer
 
 1. Create a VideoStreamPlayer node using the Create New Node dialog.
 2. Select the VideoStreamPlayer node in the scene tree dock, go to the inspector
-   and load an ``.ogv`` file in the Stream property.
+   and load a ``.ogv`` file in the Stream property.
 
    - If you don't have your video in Ogg Theora format yet, jump to
      :ref:`doc_playing_videos_recommended_theora_encoding_settings`.
@@ -183,7 +183,7 @@ maximize the quality of the output Ogg Theora video, but this can require a lot
 of disk space.
 
 `FFmpeg <https://ffmpeg.org/>`__ (CLI) is a popular open source tool
-for this purpose. FFmpeg has a steep learning curve, but it's powerful tool.
+for this purpose. FFmpeg has a steep learning curve, but it's a powerful tool.
 
 Here are example FFmpeg commands to convert an MP4 video to Ogg Theora. Since
 FFmpeg supports a lot of input formats, you should be able to use the commands

+ 1 - 1
tutorials/best_practices/data_preferences.rst

@@ -105,7 +105,7 @@ Contiguous memory stores imply the following operation performance:
       though. Done by re-sorting the Array after every edit and writing an
       ordered-aware search algorithm.
 
-Godot implements Dictionary as an ``HashMap<Variant, Variant, VariantHasher, StringLikeVariantComparator>``. The engine
+Godot implements Dictionary as a ``HashMap<Variant, Variant, VariantHasher, StringLikeVariantComparator>``. The engine
 stores a small array (initialized to 2^3 or 8 records) of key-value pairs. When
 one attempts to access a value, they provide it a key. It then *hashes* the
 key, i.e. converts it into a number. The "hash" is used to calculate the index

+ 1 - 1
tutorials/best_practices/scenes_versus_scripts.rst

@@ -10,7 +10,7 @@ declarative code.
 Each system's capabilities are different as a result.
 Scenes can define how an extended class initializes, but not what its
 behavior actually is. Scenes are often used in conjunction with a script,
-the scene declaring a composition of nodes, and the script adding behaviour with imperative code.
+the scene declaring a composition of nodes, and the script adding behavior with imperative code.
 
 Anonymous types
 ---------------

+ 1 - 1
tutorials/editor/command_line_tutorial.rst

@@ -16,7 +16,7 @@ suitable for this workflow.
     On Windows and Linux, you can run a Godot binary in a terminal by specifying
     its relative or absolute path.
 
-    On macOS, the process is different due to Godot being contained within an
+    On macOS, the process is different due to Godot being contained within a
     ``.app`` bundle (which is a *folder*, not a file). To run a Godot binary
     from a terminal on macOS, you have to ``cd`` to the folder where the Godot
     application bundle is located, then run ``Godot.app/Contents/MacOS/Godot``

+ 1 - 1
tutorials/editor/customizing_editor.rst

@@ -110,7 +110,7 @@ After making changes, open the **Editor** menu at the top of the editor then
 choose **Editor Layouts**. In the dropdown list, you will see a list of saved
 editor layouts, plus **Default** which is a hardcoded editor layout that can't
 be removed. The default layout matches a fresh Godot installation with no
-changes made to the docks' position and size, and no floating docks.
+changes made to the docks' positions and sizes, and no floating docks.
 
 You can remove a layout using the **Delete** option in the **Editor Layouts**
 dropdown.

+ 1 - 1
tutorials/editor/index.rst

@@ -13,7 +13,7 @@ Editor's interface
 ------------------
 
 The following pages explain how to use the various windows, workspaces, and
-docks that make up the Godot editor. We cover some specific editors' interface
+docks that make up the Godot editor. We cover some specific editors' interfaces
 in other sections where appropriate. For example, the :ref:`animation editor
 <doc_introduction_animation>`.
 

+ 1 - 1
tutorials/i18n/localization_using_gettext.rst

@@ -216,7 +216,7 @@ You can generate an MO file with the command below:
 
     msgfmt fr.po --no-hash -o fr.mo
 
-If the PO file is valid, this command will create a ``fr.mo`` file besides
+If the PO file is valid, this command will create an ``fr.mo`` file besides
 the PO file. This MO file can then be loaded in Godot as described above.
 
 The original PO file should be kept in version control so you can update

+ 1 - 1
tutorials/io/data_paths.rst

@@ -126,7 +126,7 @@ File logging can also be disabled completely using the
 ``debug/file_logging/enable_file_logging`` project setting.
 
 When the project crashes, crash logs are written to the same file as the log
-file. The crash log will only contain an usable backtrace if the binary that was
+file. The crash log will only contain a usable backtrace if the binary that was
 run contains debugging symbols, or if it can find a debug symbols file that
 matches the binary. Official binaries don't provide debugging symbols, so this
 requires a custom build to work. See

+ 3 - 3
tutorials/io/runtime_file_loading_and_saving.rst

@@ -169,7 +169,7 @@ Audio/video files
 -----------------
 
 Godot supports loading Ogg Vorbis, MP3, and WAV audio at runtime. Note that not *all*
-files with an ``.ogg`` extension are Ogg Vorbis files. Some may be Ogg Theora
+files with a ``.ogg`` extension are Ogg Vorbis files. Some may be Ogg Theora
 videos, or contain Opus audio within an Ogg container. These files will **not**
 load correctly as audio files in Godot.
 
@@ -191,7 +191,7 @@ Example of loading an Ogg Theora video file in a :ref:`class_VideoStreamPlayer`
 
     var video_stream_theora = VideoStreamTheora.new()
     # File extension is ignored, so it is possible to load Ogg Theora videos
-    # that have an `.ogg` extension this way.
+    # that have a `.ogg` extension this way.
     video_stream_theora.file = "/path/to/file.ogv"
     $VideoStreamPlayer.stream = video_stream_theora
 
@@ -203,7 +203,7 @@ Example of loading an Ogg Theora video file in a :ref:`class_VideoStreamPlayer`
 
     var videoStreamTheora = new VideoStreamTheora();
     // File extension is ignored, so it is possible to load Ogg Theora videos
-    // that have an `.ogg` extension this way.
+    // that have a `.ogg` extension this way.
     videoStreamTheora.File = "/Path/To/File.ogv";
     GetNode<VideoStreamPlayer>("VideoStreamPlayer").Stream = videoStreamTheora;
 

+ 1 - 1
tutorials/performance/gpu_optimization.rst

@@ -65,7 +65,7 @@ other). This can be done by artists, or programmatically within Godot using an a
 There is also a cost to batching together objects in 3D. Several objects
 rendered as one cannot be individually culled. An entire city that is off-screen
 will still be rendered if it is joined to a single blade of grass that is on
-screen. Thus, you should always take objects' location and culling into account
+screen. Thus, you should always take objects' locations and culling into account
 when attempting to batch 3D objects together. Despite this, the benefits of
 joining static objects often outweigh other considerations, especially for large
 numbers of distant or low-poly objects.

+ 1 - 1
tutorials/performance/pipeline_compilations.rst

@@ -134,7 +134,7 @@ the project or the environment. The pipeline precompilation system will keep
 track of these features as they're encountered for the first time and enable
 precompilation of them for any meshes or surfaces that are created afterwards.
 
-If your game makes use of these features, **make sure to have an scene that uses
+If your game makes use of these features, **make sure to have a scene that uses
 them as early as possible** before loading the majority of the assets. This
 scene can be very simple and will do the job as long as it uses the features the
 game plans to use. It can even be rendered off-screen for at least one frame if

+ 1 - 1
tutorials/physics/interpolation/2d_and_3d_physics_interpolation.rst

@@ -34,7 +34,7 @@ This has some implications:
   Controlling the on / off behavior of 2D nodes therefore requires a little more
   thought and planning.
 - On the positive side, pivot behavior in the scene tree is perfectly preserved
-  during interpolation in 2D, which gives super smooth behaviour.
+  during interpolation in 2D, which gives super smooth behavior.
 
 Resetting physics interpolation
 -------------------------------

+ 2 - 2
tutorials/physics/interpolation/physics_interpolation_introduction.rst

@@ -60,7 +60,7 @@ Adapt the tick rate?
 ~~~~~~~~~~~~~~~~~~~~
 
 Instead of designing the game at a fixed physics tick rate, we could allow the tick
-rate to scale according to the end users hardware. We could for example use a fixed
+rate to scale according to the end user's hardware. We could for example use a fixed
 tick rate that works for that hardware, or even vary the duration of each physics
 tick to match a particular frame duration.
 
@@ -70,7 +70,7 @@ run in the ``_physics_process``) work best and most consistently when run at a
 that has been designed for 60 TPS (ticks per second) at e.g. 10 TPS, the physics
 will behave completely differently. Controls may be less responsive, collisions /
 trajectories can be completely different. You may test your game thoroughly at 60
-TPS, then find it breaks on end users machines when it runs at a different tick
+TPS, then find it breaks on end users' machines when it runs at a different tick
 rate.
 
 This can make quality assurance difficult with hard to reproduce bugs, especially

+ 1 - 1
tutorials/physics/using_jolt_physics.rst

@@ -8,7 +8,7 @@ Introduction
 
 The Jolt physics engine was added as an alternative to the existing Godot Physics
 physics engine in 4.4. Jolt is developed by Jorrit Rouwe with a focus on games and
-VR applications. Previously it was available as a extension but is now built into
+VR applications. Previously it was available as an extension but is now built into
 Godot.
 
 It is important to note that the built-in Jolt Physics module is considered

+ 1 - 1
tutorials/platform/android/android_plugin.rst

@@ -138,7 +138,7 @@ Use the following steps if you have a v1 Android plugin you want to migrate to v
 
 3. After updating the Godot Android library dependency, sync or build the plugin and resolve any compile errors:
 
-    - The ``Godot`` instance provided by ``GodotPlugin::getGodot()`` no longer has access to a ``android.content.Context`` reference. Use ``GodotPlugin::getActivity()`` instead.
+    - The ``Godot`` instance provided by ``GodotPlugin::getGodot()`` no longer has access to an ``android.content.Context`` reference. Use ``GodotPlugin::getActivity()`` instead.
 
 4. Delete the ``gdap`` configuration file(s) and follow the instructions in the `Packaging a v2 Android plugin`_ section to set up the plugin configuration.
 

+ 1 - 1
tutorials/plugins/editor/import_plugins.rst

@@ -241,7 +241,7 @@ you do this you have to be careful when you add more presets.
 This is the method which defines the available options.
 :ref:`_get_import_options() <class_EditorImportPlugin_private_method__get_import_options>` returns
 an array of dictionaries, and each dictionary contains a few keys that are
-checked to customize the option as its shown to the user. The following table
+checked to customize the option as it's shown to the user. The following table
 shows the possible keys:
 
 +-------------------+------------+----------------------------------------------------------------------------------------------------------+

+ 1 - 1
tutorials/rendering/jitter_stutter.rst

@@ -243,7 +243,7 @@ done with caution.
 
     On any Godot project, you can use the ``--disable-vsync``
     :ref:`command line argument <doc_command_line_tutorial>` to forcibly disable V-Sync.
-    Since Godot 4.2, ``--max-fps <fps>`` can also be used to set a FPS limit
+    Since Godot 4.2, ``--max-fps <fps>`` can also be used to set an FPS limit
     (``0`` is unlimited). These arguments can be used at the same time.
 
 Hardware/OS-specific

+ 1 - 1
tutorials/scripting/c_sharp/c_sharp_differences.rst

@@ -3,7 +3,7 @@
 C# API differences to GDScript
 ==============================
 
-This is a (incomplete) list of API differences between C# and GDScript.
+This is an (incomplete) list of API differences between C# and GDScript.
 
 General differences
 -------------------

+ 1 - 1
tutorials/scripting/c_sharp/c_sharp_style_guide.rst

@@ -223,7 +223,7 @@ an underscore (``_``) as a prefix for private fields (but not for methods or pro
 
 .. code-block:: csharp
 
-    private Vector3 _aimingAt; // Use a `_` prefix for private fields.
+    private Vector3 _aimingAt; // Use an `_` prefix for private fields.
 
     private void Attack(float attackStrength)
     {

+ 1 - 1
tutorials/scripting/cpp/gdextension_cpp_example.rst

@@ -330,7 +330,7 @@ Compiling the plugin
 --------------------
 
 To compile the project we need to define how SCons using should compile it
-using a ``SConstruct`` file which references the one in ``godot-cpp``.
+using an ``SConstruct`` file which references the one in ``godot-cpp``.
 Writing it from scratch is outside the scope of this tutorial, but you can
 :download:`the SConstruct file we prepared <files/cpp_example/SConstruct>`.
 We'll cover a more customizable, detailed example on how to use these

+ 3 - 3
tutorials/scripting/creating_script_templates.rst

@@ -75,8 +75,8 @@ For example:
 -  ``template_scripts/Node/smooth_camera.gd``
 -  ``template_scripts/CharacterBody3D/platformer_movement.gd``
 
-Default behaviour and overriding it
------------------------------------
+Default behavior and overriding it
+----------------------------------
 
 By default:
 
@@ -89,7 +89,7 @@ By default:
 * the template will not be set as the default for the given node
 
 
-It is possible to customize this behaviour by adding meta headers at the start
+It is possible to customize this behavior by adding meta headers at the start
 of your file, like this:
 
 .. tabs::

+ 1 - 1
tutorials/scripting/debug/debugger_panel.rst

@@ -227,7 +227,7 @@ the total bandwidth usage at any given moment.
 Monitors
 --------
 
-The monitors are graphs of several aspects of the game while its running such as
+The monitors are graphs of several aspects of the game while it's running such as
 FPS, memory usage, how many nodes are in a scene and more. All monitors keep
 track of stats automatically, so even if one monitor isn't open while the game
 is running, you can open it later and see how the values changed.

+ 1 - 1
tutorials/scripting/gdextension/gdextension_c_example.rst

@@ -1802,7 +1802,7 @@ To complete this tutorial, let's see how you can register a custom signal and
 emit it when appropriate. As you might have guessed, we'll need a few more
 function pointers from the API and more helper functions.
 
-In the ``api.h`` file we're adding two things. One is a an API function to
+In the ``api.h`` file we're adding two things. One is an API function to
 register a signal, the other is a helper function to wrap the signal binding.
 
 .. code-block:: c

+ 1 - 1
tutorials/scripting/gdscript/gdscript_basics.rst

@@ -2255,7 +2255,7 @@ This is better explained through examples. Consider this scenario:
 
 There are a few things to keep in mind here:
 
-1. If the inherited class (``state.gd``) defines a ``_init`` constructor that takes
+1. If the inherited class (``state.gd``) defines an ``_init`` constructor that takes
    arguments (``e`` in this case), then the inheriting class (``idle.gd``) *must*
    define ``_init`` as well and pass appropriate parameters to ``_init`` from ``state.gd``.
 2. ``idle.gd`` can have a different number of arguments than the base class ``state.gd``.

+ 1 - 1
tutorials/scripting/gdscript/gdscript_exports.rst

@@ -483,7 +483,7 @@ annotations, you can use ``@export_custom`` instead. This allows defining any
 property hint, hint string and usage flags, with a syntax similar to the one
 used by the editor for built-in nodes.
 
-For example, this exposes the ``altitude`` property with no range limits but a
+For example, this exposes the ``altitude`` property with no range limits but an
 ``m`` (meter) suffix defined:
 
 ::

+ 1 - 1
tutorials/scripting/resources.rst

@@ -17,7 +17,7 @@ arrange user interfaces, etc. **Resources** are **data containers**. They don't
 do anything on their own: instead, nodes use the data contained in resources.
 
 Anything Godot saves or loads from disk is a resource. Be it a scene (a ``.tscn``
-or an ``.scn`` file), an image, a script... Here are some :ref:`Resource <class_Resource>` examples:
+or a ``.scn`` file), an image, a script... Here are some :ref:`Resource <class_Resource>` examples:
 
 - :ref:`Texture <class_Texture>`
 - :ref:`Script <class_Script>`

+ 3 - 3
tutorials/shaders/shader_reference/shader_preprocessor.rst

@@ -148,7 +148,7 @@ to a non-zero value, the code block is included, otherwise it is skipped.
 To evaluate correctly, the condition must be an expression giving a simple
 floating-point, integer or boolean result. There may be multiple condition
 blocks connected by ``&&`` (AND) or ``||`` (OR) operators. It may be continued
-by a ``#else`` block, but **must** be ended with the ``#endif`` directive.
+by an ``#else`` block, but **must** be ended with the ``#endif`` directive.
 
 .. code-block:: glsl
 
@@ -163,7 +163,7 @@ by a ``#else`` block, but **must** be ended with the ``#endif`` directive.
 Using the ``defined()`` *preprocessor function*, you can check whether the
 passed identifier is defined a by ``#define`` placed above that directive. This
 is useful for creating multiple shader versions in the same file. It may be
-continued by a ``#else`` block, but must be ended with the ``#endif`` directive.
+continued by an ``#else`` block, but must be ended with the ``#endif`` directive.
 
 The ``defined()`` function's result can be negated by using the ``!`` (boolean NOT)
 symbol in front of it. This can be used to check whether a define is *not* set.
@@ -273,7 +273,7 @@ Like with ``#if``, the ``defined()`` preprocessor function can be used:
 
 This is a shorthand for ``#if defined(...)``. Checks whether the passed
 identifier is defined by ``#define`` placed above that directive. This is useful
-for creating multiple shader versions in the same file. It may be continued by a
+for creating multiple shader versions in the same file. It may be continued by an
 ``#else`` block, but must be ended with the ``#endif`` directive.
 
 .. code-block:: glsl

+ 2 - 2
tutorials/shaders/visual_shaders.rst

@@ -239,9 +239,9 @@ only available for shaders that are in ``Particles`` mode.
 
 Keep in mind that not all 3D objects are mesh files. a glTF file can't be dragged
 and dropped into the graph. However, you can create an inherited scene from it,
-save the mesh in that scene as it's own file, and use that.
+save the mesh in that scene as its own file, and use that.
 
 .. image:: img/vs_meshemitter.webp
 
 You can also drag and drop obj files into the graph editor to add the node
-for that specific mesh, other mesh files will not work for this.
+for that specific mesh, other mesh files will not work for this.

+ 2 - 2
tutorials/ui/bbcode_in_richtextlabel.rst

@@ -599,7 +599,7 @@ clicked URLs using the user's default web browser:
         # to avoid script errors at runtime.
         OS.shell_open(str(meta))
 
-For more advanced use cases, it's also possible to store JSON in an ``[url]``
+For more advanced use cases, it's also possible to store JSON in a ``[url]``
 tag's option and parse it in the function that handles the ``meta_clicked`` signal.
 For example:
 
@@ -1001,7 +1001,7 @@ All examples below mention the default values for options in the listed tag form
 
 .. note::
 
-    Text effects that move characters' position may result in characters being
+    Text effects that move characters' positions may result in characters being
     clipped by the RichTextLabel node bounds.
 
     You can resolve this by disabling **Control > Layout > Clip Contents** in

+ 11 - 11
tutorials/xr/a_better_xr_start_script.rst

@@ -21,7 +21,7 @@ We are introducing 3 signals to our script so that our game can add further logi
 
 - ``focus_lost`` is emitted when the player takes off their headset or when the player enters the menu system of the headset.
 - ``focus_gained`` is emitted when the player puts their headset back on or exits the menu system and returns to the game.
-- ``pose_recentered`` is emitted when the headset requests the players position to be reset.
+- ``pose_recentered`` is emitted when the headset requests the player's position to be reset.
 
 Our game should react accordingly to these signals.
 
@@ -93,7 +93,7 @@ Our updated ready function
 
 We add a few things to the ready function.
 
-If we're using the mobile or forward+ renderer we set the viewports ``vrs_mode`` to ``VRS_XR``.
+If we're using the mobile or forward+ renderer we set the viewport's ``vrs_mode`` to ``VRS_XR``.
 On platforms that support this, this will enable foveated rendering.
 
 If we're using the compatibility renderer, we check if the OpenXR foveated rendering settings
@@ -199,7 +199,7 @@ This signal is emitted by OpenXR when our session is setup.
 This means the headset has run through setting everything up and is ready to begin receiving content from us.
 Only at this time various information is properly available.
 
-The main thing we do here is to check our headsets refresh rate.
+The main thing we do here is to check our headset's refresh rate.
 We also check the available refresh rates reported by the XR runtime to determine if we want to set our headset to a higher refresh rate.
 
 Finally we match our physics update rate to our headset update rate.
@@ -297,13 +297,13 @@ Not matching the physics update rate will cause stuttering as frames are rendere
 On visible state
 ----------------
 
-This signal is emitted by OpenXR when our game becomes visible but is not focussed.
+This signal is emitted by OpenXR when our game becomes visible but is not focused.
 This is a bit of a weird description in OpenXR but it basically means that our game has just started
-and we're about to switch to the focussed state next,
-that the user has opened a system menu or the users has just took their headset off.
+and we're about to switch to the focused state next,
+that the user has opened a system menu or the user has just took their headset off.
 
-On receiving this signal we'll update our focussed state,
-we'll change the process mode of our node to disabled which will pause processing on this node and it's children,
+On receiving this signal we'll update our focused state,
+we'll change the process mode of our node to disabled which will pause processing on this node and its children,
 and emit our ``focus_lost`` signal.
 
 If you've added this script to your root node,
@@ -377,12 +377,12 @@ the game stays in 'visible' state until the user puts their headset on.
 
   It is thus important to keep your game paused while in visible mode.
   If you don't the game will keep on running while your user isn't interacting with your game.
-  Also when the game returns to focussed mode,
+  Also when the game returns to the focused mode,
   suddenly all controller and hand tracking is re-enabled and could have game breaking consequences
   if you do not react to this accordingly.
-  Be sure to test this behaviour in your game!
+  Be sure to test this behavior in your game!
 
-While handling our signal we will update the focusses state, unpause our node and emit our ``focus_gained`` signal.
+While handling our signal we will update the focuses state, unpause our node and emit our ``focus_gained`` signal.
 
 .. tabs::
   .. code-tab:: gdscript GDScript

+ 2 - 2
tutorials/xr/ar_passthrough.rst

@@ -162,9 +162,9 @@ This has two consequences:
   objects cast shadows on real world objects [#]_.
 
 .. figure:: img/xr_passthrough_example.webp
-    :alt: Image showing shadow to opacity being used to show the users desk.
+    :alt: Image showing shadow to opacity being used to show the user's desk.
 
-    Image showing shadow to opacity being used to show the users desk.
+    Image showing shadow to opacity being used to show the user's desk.
 
 This enabled the following use cases:
 

+ 2 - 2
tutorials/xr/basic_xr_locomotion.rst

@@ -24,7 +24,7 @@ So to prevent our player from infinitely falling down we'll quickly add a floor
 
 We start by adding a :ref:`StaticBody3D <class_staticbody3d>` node to our root node and we rename this to ``Floor``.
 We add a :ref:`MeshInstance3D <class_meshinstance3d>` node as a child node for our ``Floor``. 
-Then create a new :ref:`PlaneMesh <class_planemesh>` as it's mesh.
+Then create a new :ref:`PlaneMesh <class_planemesh>` as its mesh.
 For now we set the size of the mesh to 100 x 100 meters.
 Next we add a :ref:`CollisionShape3D <class_collisionshape3d>` node as a child node for our ``Floor``.
 Then create a ``BoxShape`` as our shape.
@@ -76,7 +76,7 @@ Godot XR Tools supports this through the teleport function and we will be adding
 Add a new child scene to your left hand :ref:`XRController3D <class_xrcontroller3d>` node by selecting the ``addons/godot-xr-tools/functions/function_teleport.tscn`` scene.
 
 With this scene added the player will be able to teleport around the world by pressing the trigger on the left hand controller, pointing where they want to go, and then releasing the trigger.
-The player can also adjust the orientation by using the left hand controllers joystick.
+The player can also adjust the orientation by using the left hand controller's joystick.
 
 If you've followed all instructions correctly your scene should now look something like this:
 

+ 5 - 5
tutorials/xr/openxr_composition_layers.rst

@@ -142,7 +142,7 @@ This implementation only works for our ``OpenXRCompositionLayerQuad`` node.
     ...
 
 We also define a helper function that takes our ``intersect`` value and
-returns our location in the viewports local coordinate system:
+returns our location in the viewport's local coordinate system:
 
 .. code:: gdscript
 
@@ -195,7 +195,7 @@ If so, we check if our button is pressed and place our pointer at our intersecti
     ...
 
 If we were intersecting in our previous process call and our pointer has moved,
-we prepare a :ref:`InputEventMouseMotion <class_InputEventMouseMotion>` object
+we prepare an :ref:`InputEventMouseMotion <class_InputEventMouseMotion>` object
 to simulate our mouse moving and send that to our viewport for further processing.
 
 .. code:: gdscript
@@ -216,7 +216,7 @@ to simulate our mouse moving and send that to our viewport for further processin
     ...
 
 If we've just released our button we also prepare
-a :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
+an :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
 to simulate a button release and send that to our viewport for further processing.
 
 .. code:: gdscript
@@ -234,7 +234,7 @@ to simulate a button release and send that to our viewport for further processin
     ...
 
 Or if we've just pressed our button we prepare
-a :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
+an :ref:`InputEventMouseButton <class_InputEventMouseButton>` object
 to simulate a button press and send that to our viewport for further processing.
 
 .. code:: gdscript
@@ -292,5 +292,5 @@ the XR compositor will now draw the viewport first, and then overlay our renderi
 .. figure:: img/openxr_composition_layer_hole_punch.webp
    :align: center
 
-   Use case showing how the users hand is incorrectly obscured
+   Use case showing how the user's hand is incorrectly obscured
    by a composition layer when hole punching is not used.

+ 1 - 1
tutorials/xr/openxr_hand_tracking.rst

@@ -135,7 +135,7 @@ We also need to enable editable children to gain access to our :ref:`Skeleton3D
 The hand skeleton modifier
 ~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-Finally we need to add a :ref:`XRHandModifier3D <class_xrhandmodifier3d>` node as a child to our ``Skeleton3D`` node.
+Finally we need to add an :ref:`XRHandModifier3D <class_xrhandmodifier3d>` node as a child to our ``Skeleton3D`` node.
 This node will obtain the finger tracking data from OpenXR and apply it the hand model.
 
 You need to set the ``Hand Tracker`` property to either ``/user/hand_tracker/left`` or ``/user/hand_tracker/right``

+ 1 - 1
tutorials/xr/openxr_settings.rst

@@ -175,7 +175,7 @@ Sets the foveation level used when rendering provided this feature is supported
 Foveation is a technique where the further away from the center of the viewport we render content, the lower resolution we render at.
 Most XR runtimes only support fixed foveation, but some will take eye tracking into account and use the focal point for this effect.
 
-The higher the level, the better the performance gains, but also the more reduction in quality there is in the users peripheral vision.
+The higher the level, the better the performance gains, but also the more reduction in quality there is in the user's peripheral vision.
 
 .. Note::
   **Compatibility renderer only**,

+ 2 - 2
tutorials/xr/xr_action_map.rst

@@ -150,7 +150,7 @@ The columns in our table are as follows:
   * - 3
     - 0
     - This is the priority of the action set.
-      If multiple active action sets have actions bound to the same controllers inputs or
+      If multiple active action sets have actions bound to the same controller's inputs or
       outputs, the action set with the highest priority value will determine the action
       that is updated.
 
@@ -572,7 +572,7 @@ These settings are used as follows:
   * ``On Haptic`` lets us define a haptic output that is automatically activated
     when an action becomes pressed.
   * ``Off Haptic`` lets us define a haptic output that is automatically activated
-    when a action is released.
+    when an action is released.
 
 
 Binding modifiers on individual bindings

+ 3 - 3
tutorials/xr/xr_room_scale.rst

@@ -28,7 +28,7 @@ The movement through controller input, and the physical movement of the player i
 
 As a result, the origin node does not represent the position of the player.
 It represents the center, or start of, the tracking space in which the player can physically move.
-As the player moves around their room this movement is represented through the tracking of the players headset.
+As the player moves around their room this movement is represented through the tracking of the player's headset.
 In game this translates to the camera node's position being updated accordingly.
 For all intents and purposes, we are tracking a disembodied head.
 Unless body tracking is available, we have no knowledge of the position or orientation of the player's body.
@@ -229,7 +229,7 @@ In this approach step 1 is where all the magic happens.
 Just like with our previous approach we will be applying our physical movement to the character body,
 but we will counter that movement on the origin node.
 
-This will ensure that the players location stays in sync with the character body's location.
+This will ensure that the player's location stays in sync with the character body's location.
 
 .. code-block:: gdscript
 
@@ -377,7 +377,7 @@ The problem with this approach is that physical movement is now not replicated i
 This will cause nausea for the player.
 
 What many XR games do instead, is to measure the distance between where the player physically is,
-and where the players virtual body has been left behind.
+and where the player's virtual body has been left behind.
 As this distance increases, usually to a distance of a few centimeters, the screen slowly blacks out.
 
 Our solutions up above would allow us to add this logic into the code at the end of step 1.