
fix mesa install

ell 1 year ago
commit add23a4ab0

+ 8 - 7
.github/workflows/main.yml

@@ -52,7 +52,7 @@ jobs:
     - name: Make Runnable
       run: chmod a+x love-${{ github.sha }}.AppImage
     - name: Run All Tests
-      run: xvfb-run love-${{ github.sha }}.AppImage testing
+      run: xvfb-run ./love-${{ github.sha }}.AppImage testing
     - name: Love Test Report
       uses: ellraiser/love-test-report@main
       with:
@@ -215,17 +215,18 @@ jobs:
       with:
         name: love-windows-${{ steps.vars.outputs.arch }}${{ steps.vars.outputs.compatname }}-dbg
         path: pdb/Release/*.pdb
+    - name: Install Mesa 
+      if: steps.vars.outputs.arch != 'ARM64'
+      run: |
+        curl -L --output mesa.7z --url https://github.com/pal1000/mesa-dist-win/releases/download/23.2.1/mesa3d-23.2.1-release-msvc.7z
+        7z x mesa.7z -o*
+        powershell.exe mesa\systemwidedeploy.cmd 1
     - name: Build Test Exe
       if: steps.vars.outputs.arch != 'ARM64'
       run: cmake --build build --config Release --target install
-    - name: Install Mesa 
-      if: steps.vars.outputs.arch != 'ARM64'
-      uses: ssciwr/setup-mesa-dist-win@v1
-      with:
-        version: '23.2.1'
     - name: Run All Tests
       if: steps.vars.outputs.arch != 'ARM64'
-      run: install\lovec.exe testing/main.lua
+      run: powershell.exe install/lovec.exe testing
     - name: Love Test Report
       if: steps.vars.outputs.arch != 'ARM64'
       uses: ellraiser/love-test-report@main

+ 1 - 1
testing/classes/TestModule.lua

@@ -12,7 +12,7 @@ TestModule = {
     local testmodule = {
       timer = 0,
       time = 0,
-      delay = 0.1,
+      delay = 0.01,
       spacer = '                                        ',
       colors = {
         PASS = 'green', FAIL = 'red', SKIP = 'grey'

+ 1 - 1
testing/classes/TestSuite.lua

@@ -58,7 +58,7 @@ TestSuite = {
   -- @return {nil}
   runSuite = function(self, delta)
 
-      -- stagger 0.1s between tests
+      -- stagger between tests
     if self.module ~= nil then
       self.module.timer = self.module.timer + delta
       if self.module.timer >= self.module.delay then
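
The two Lua hunks above work together: `TestModule` now uses a `delay` of 0.01s instead of 0.1s, and `TestSuite.runSuite` accumulates the frame `delta` into the module's `timer`, only advancing once the delay has elapsed. A minimal sketch of that frame-based stagger, assuming a LÖVE `love.update` loop (`runner` and `runNextTest` are illustrative stand-ins, not the suite's real API):

```lua
-- Minimal illustration of the stagger used by the suite: accumulate
-- frame time and only advance to the next test after `delay` seconds.
local runner = {
  delay = 0.01, -- was 0.1s before this commit
  timer = 0,
}

-- stand-in for whatever actually advances the suite
local function runNextTest()
  print('running next test')
end

function love.update(delta)
  runner.timer = runner.timer + delta
  if runner.timer >= runner.delay then
    runner.timer = 0
    runNextTest()
  end
end
```

Shortening the delay still leaves at least a frame between tests but adds far less idle time across a few hundred tests (roughly 3s of deliberate waiting instead of 30s at 0.1s per test).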

+ 26 - 0
testing/examples/lovetest_runAllTests.md

@@ -0,0 +1,26 @@
+<!-- PASSED 244 || FAILED 0 || SKIPPED 61 || TIME 37.853 -->
+
+**305** tests were completed in **37.853s** with **244** passed, **0** failed, and **61** skipped
+
+### Report
+| Module                | Passed | Failed | Skipped | Time   |
+| --------------------- | ------ | ------ | ------- | ------ |
+| 🟢 love.audio | 26 | 0 | 0 | 2.605s |
+| 🟢 love.data | 7 | 0 | 3 | 1.003s |
+| 🟢 love.event | 4 | 0 | 2 | 0.600s |
+| 🟢 love.filesystem | 27 | 0 | 2 | 3.030s |
+| 🟢 love.font | 4 | 0 | 1 | 0.511s |
+| 🟢 love.graphics | 81 | 0 | 15 | 10.599s |
+| 🟢 love.image | 3 | 0 | 0 | 0.299s |
+| 🟢 love.math | 17 | 0 | 0 | 1.821s |
+| 🟢 love.objects | 1 | 0 | 34 | 3.603s |
+| 🟢 love.physics | 22 | 0 | 0 | 2.222s |
+| 🟢 love.sound | 2 | 0 | 0 | 0.199s |
+| 🟢 love.system | 6 | 0 | 2 | 0.844s |
+| 🟢 love.thread | 3 | 0 | 0 | 0.318s |
+| 🟢 love.timer | 6 | 0 | 0 | 2.309s |
+| 🟢 love.video | 1 | 0 | 0 | 0.114s |
+| 🟢 love.window | 34 | 0 | 2 | 7.778s |
+
+
+### Failures

+ 7 - 9
testing/readme.md

@@ -35,9 +35,10 @@ If you want to specify only 1 specific method only you can use:
 
 All results will be printed in the console per method as PASS, FAIL, or SKIP with total assertions met on a module level and overall level.  
 
-An `XML` file in the style of [JUnit XML](https://www.ibm.com/docs/en/developer-for-zos/14.1?topic=formats-junit-xml-format) will be generated in the `/output` directory, along with a `HTML` file with a summary of all tests (including visuals for love.graphics tests) - you will need to make sure the command is run with read/write permissions for the source directory.
-> Note that this can only be viewed properly locally as the generated images are written to the save directory.   
-> An example of both types of output can be found in the `/examples` folder
+An `XML` file in the style of [JUnit XML](https://www.ibm.com/docs/en/developer-for-zos/14.1?topic=formats-junit-xml-format) will be generated in the `/output` directory, along with an `HTML` and a `Markdown` file with a summary of all tests (including visuals for love.graphics tests).  
+> Examples of each type of output can be found in the `/examples` folder  
+
+The Markdown file can be used with [this github action](https://github.com/ellraiser/love-test-report) if you want to output the report results to your CI.
 
 ---
 
@@ -81,7 +82,6 @@ For sanity-checking, if it's currently not covered or we're not sure how to test
 
 ## Coverage
 This is the status of all module tests currently.  
-"objects" is a special module to cover any object specific tests, i.e. testing a File object functions as expected
 | Module                | Passed | Failed | Skipped | Time   |
 | --------------------- | ------ | ------ | ------- | ------ |
 | 🟢 love.audio | 26 | 0 | 0 | 2.602s |
@@ -92,22 +92,20 @@ This is the status of all module tests currently.
 | 🟢 love.graphics | 81 | 0 | 15 | 10.678s |
 | 🟢 love.image | 3 | 0 | 0 | 0.300s |
 | 🟢 love.math | 17 | 0 | 0 | 1.678s |
-| 🟢 love.objects | 1 | 0 | 0 | 0.121s |
 | 🟢 love.physics | 22 | 0 | 0 | 2.197s |
 | 🟢 love.sound | 2 | 0 | 0 | 0.200s |
 | 🟢 love.system | 6 | 0 | 2 | 0.802s |
 | 🟢 love.thread | 3 | 0 | 0 | 0.300s |
 | 🟢 love.timer | 6 | 0 | 0 | 2.358s |
 | 🟢 love.video | 1 | 0 | 0 | 0.100s |
-| 🟢 love.window | 34 | 0 | 2 | 8.050s |
-**271** tests were completed in **34.387s** with **244** passed, **0** failed, and **27** skipped
+| 🟢 love.window | 34 | 0 | 2 | 8.050s |  
 
 The following modules are not covered as we can't really emulate input nicely:  
 `joystick`, `keyboard`, `mouse`, and `touch`
 
 ---
 
-## Todo / Skipped
+## Todo 
 Modules with some small bits needed or needing sense checking:
 - **love.data** - packing methods need writing cos i dont really get what they are
 - **love.event** - love.event.wait or love.event.pump need writing if possible I dunno how to check
@@ -115,7 +113,7 @@ Modules with some small bits needed or needing sense checking:
 - **love.graphics** - still need to do tests for the main drawing methods
 - **love.image** - ideally isCompressed should have an example of all compressed files love can take
 - **love.math** - linearToGamma + gammaToLinear using direct formulas don't get same value back
-- **love.objects** - not started properly yet
+- **love.*.objects** - all objects tests still to be done
 - **love.graphics.setStencilTest** - deprecated, replaced by setStencilMode()
 
 ---

+ 4 - 8
testing/todo.md

@@ -3,32 +3,28 @@
 ## TESTSUITE
 - [ ] setStencilMode to replace setStencilTest
 - [ ] start graphics drawing methods
+- [ ] move object methods to respective modules
 - [ ] start object methods
 
 ## GRAPHICS
-Methods that need a better actual graphics check if possible:
+Methods that need actual graphics pixel checks if possible:
 - [ ] setDepthMode
 - [ ] setFrontFaceWinding
 - [ ] setMeshCullMode
 
 ## FUTURE
 - [ ] need a platform: format table somewhere for compressed formats (i.e. DXT not supported)
-      could add platform as global to command and then use in tests?
 - [ ] use coroutines for the delay action? i.e. wrap each test call in coroutine 
-      and then every test can use coroutine.yield() if needed
 - [ ] could nil check some joystick and keyboard methods?
 
 ## GITHUB ACTION CI
 - [ ] linux needs to run xvfb-run with the appimage
-- [ ] windows can try installing mesa for opengl replacement
+- [ ] try vulkan on windows/linux
 - [ ] ios test run?
 
+## NOTES
 Can't run --renderers metal on github action images:
 Run love-macos/love.app/Contents/MacOS/love testing --renderers metal
 Cannot create Metal renderer: Metal is not supported on this system.
 Cannot create graphics: no supported renderer on this system.
 Error: Cannot create graphics: no supported renderer on this system.
-
-Can't run test suite on windows as it stands:
-Unable to create renderer
-This program requires a graphics card and video drivers which support OpenGL 2.1 or OpenGL ES 2.
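
The FUTURE item in the todo.md hunk above ("use coroutines for the delay action? i.e. wrap each test call in coroutine") would replace the fixed stagger with explicit yield points. A hedged sketch of that idea, assuming each test becomes a function resumed from `love.update` (`startTest` and the example test are illustrative, not the suite's API):

```lua
-- Sketch of the coroutine-based alternative from the FUTURE list:
-- a test runs inside a coroutine and calls coroutine.yield() whenever
-- it needs to wait for the next frame, instead of a fixed per-test delay.
local current

local function startTest(testFn)
  current = coroutine.create(testFn)
end

function love.update(delta)
  if current and coroutine.status(current) ~= 'dead' then
    -- resume the test; coroutine.yield() inside testFn hands control
    -- back here until the next frame
    local ok, err = coroutine.resume(current, delta)
    if not ok then
      print('test errored: ' .. tostring(err))
    end
  end
end

-- example: a test that checks something, waits a frame, then checks again
startTest(function()
  assert(1 + 1 == 2)
  coroutine.yield() -- resumed on the next love.update
  assert(love.timer ~= nil)
end)
```

This would let individual tests wait exactly as many frames as they need rather than paying the same delay for every test.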