- TODO:
- - Core thread gets stuck on shutdown when OpenGL is used... somewhere in the kernel
- CONCRETE TASK:
- - SceneView editor window: Add a way to detect the exact mouse position on the render texture (rough sketch at the end of this list)
- - Similar to the onRenderViewport callback I have in Renderer, add another one that gets triggered from the core thread
- - Hook up gizmo rendering there
- - Hook up gizmo manager to ScenePicking so gizmos are considered when picking
- - I'll likely need to update GizmoManager so I can query a gizmo's SceneObject based on gizmo index
- - Selection/ScenePicking/GizmoManager need to be started
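- - A minimal sketch of the mouse-to-render-texture mapping, assuming the engine's Vector2I/Rect2I/INT32 types and a hypothetical mRenderTexGUI element displaying the scene view texture (none of these accessors are verified against the real API):

      // Hypothetical sketch: convert a pointer position (relative to the editor
      // window) into pixel coordinates on the scene view render texture.
      // mRenderTexGUI and getBounds() are assumptions, not confirmed engine API.
      Vector2I windowToRenderTexturePos(const Vector2I& windowPos) const
      {
          Rect2I bounds = mRenderTexGUI->getBounds(); // GUI element showing the texture

          INT32 x = windowPos.x - bounds.x;
          INT32 y = windowPos.y - bounds.y;

          // Clamp so ScenePicking/HandleManager never get coordinates outside the texture
          if (x < 0) x = 0;
          if (y < 0) y = 0;
          if (x >= (INT32)bounds.width)  x = (INT32)bounds.width - 1;
          if (y >= (INT32)bounds.height) y = (INT32)bounds.height - 1;

          return Vector2I(x, y);
      }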
- IMMEDIATE:
- - SceneGrid is very ugly. Consider using default lines for now and come back with a better approach later.
- - Potentially enable line AA?
- - Picking code is completely untested and will likely need major fixing
- - Disable DX9 for the editor as I will likely want to use geometry shaders for icon rendering, and possibly a new AA line shader
- - Or just use MeshHeap and update the icon/lines every frame?
- - Test all the new DrawHelper3D methods
- GIZMO TODO:
- - IMPORTANT: Gizmo rendering happens in update() but it should happen whenever the scene view is being rendered, as the render target isn't set anywhere
- - Figure out how to deal with builtin components like Camera and Renderable (e.g. how will they have gizmos since they're not managed components?)
- LATER:
- - Need a way to render text for gizmos and handles, and in the scene in general
- ----------------------------------------------------------------------
- Handles
- - Make a few different base handle types (rough interface sketch after this section):
- - Slider 1D (e.g. for movement using an arrow cap)
- - Slider 2D (e.g. for movement in 2D using a plane render)
- - Similar for scale/rotation handles (see Unity for its implementations of those)
- Handles should have colliders which will be queried whenever user input is detected in the scene view.
- If any handle is hit, the input will not proceed further (e.g. no picking will be done) and that handle control
- will become active.
- Base handle types should just be positioned in space and then return a value every frame as the user moves them.
- - This way they can also easily be used from C# for custom user-made stuff
- TODO - Think about this
- See for inspiration: http://docs.unity3d.com/ScriptReference/Handles.html
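- A possible shape for the 1D slider case, just to pin down the "position it, then read a value every frame" idea; every name below is hypothetical, not the real handle API:

      // Hypothetical interface sketch for a base 1D slider handle.
      // Vector3/Ray/Vector2I are assumed engine math types; nothing here is
      // confirmed engine API.
      class HandleSlider1D
      {
      public:
          // Place the slider in world space along a direction, with a fixed length.
          HandleSlider1D(const Vector3& position, const Vector3& direction, float length);

          // Queried by HandleManager whenever input arrives in the scene view.
          // Returning true means the pointer ray hit this slider's collider; the
          // slider then becomes active and input stops propagating (no picking).
          bool intersects(const Ray& pointerRay) const;

          // While active, called every frame with the latest pointer position so
          // the slider can recompute how far it was dragged along its direction.
          void update(const Vector2I& pointerPos);

          // Value read back by whoever placed the handle (move/scale/rotate tools,
          // or custom C# user code) and applied however they see fit.
          float getDelta() const { return mDelta; }

      private:
          Vector3 mPosition;
          Vector3 mDirection;
          float mLength = 1.0f;
          float mDelta = 0.0f;
      };

- The 2D slider and the scale/rotation variants would follow the same pattern: a collider for hit testing plus a per-frame value the active tool (or C# script) reads back.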
- ----------------------------------------------------------------------
- SelectionRenderer
- Retrieve a list of selected objects from SelectionManager
- Find ones with Renderable components
- Retrieve Meshes, and world transforms from them
- Draw that same mesh with either a wireframe or a grayed-out shader, with a slight depth bias
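- Roughly, as code (the handle types, SelectionManager/Renderable accessors and the highlight material setup are assumptions, not verified API):

      // Hypothetical sketch of the SelectionRenderer flow described above.
      // HSceneObject/HRenderable/HMesh and the accessor names are assumptions
      // based on engine conventions, not confirmed API.
      void SelectionRenderer::render()
      {
          Vector<HSceneObject> selection = SelectionManager::instance().getSceneObjects();

          for (const HSceneObject& so : selection)
          {
              HRenderable renderable = so->getComponent<Renderable>();
              if (renderable == nullptr)
                  continue; // Only objects with Renderable components get highlighted

              HMesh mesh = renderable->getMesh();
              Matrix4 worldTfrm = so->getWorldTfrm();

              // Re-draw the same mesh with a highlight material: wireframe or a
              // grayed-out color, with a slight depth bias so it doesn't z-fight
              // with the regularly rendered geometry.
              mHighlightMaterial->setMat4("gWorldTfrm", worldTfrm);
              drawMesh(mesh, mHighlightMaterial);
          }
      }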
- ----------------------------------------------------------------------
- SceneView editor flow:
- Hook up gizmo, handle and selection rendering methods to be executed after the scene is rendered
- Calculate mouse coords manually relative to the window and to the render texture GUI element
- - Don't use GUI events as we require more precise control (do we?)
- Detect mouse clicks on the scene render target
- Forward those mouse coordinates to HandleManager
- It checks if the screen ray intersects any handles and returns the handle if it does
- If a handle is found it is activated and the method returns
- Otherwise we mark the coordinates as selection start
- Detect mouse drag on the scene render target
- - If we have an active handle
- Forward mouse coordinates to the active handle so it can do its thing
- return
- - Otherwise it's assumed we are dragging a selection
- Update selection endpoint and send it to ScenePicking
- Use Selection to select picked objects if any
- return
- Detect mouse release on scene render target
- If we have an active handle
- Clear active handle
- return
- Otherwise it's assumed we are dragging a selection
- Do nothing
- return
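- The whole click/drag/release flow, as a rough sketch (method names on HandleManager/ScenePicking/Selection are placeholders for whatever the real interfaces end up being):

      // Hypothetical sketch of the SceneView input flow described above; rtPos is
      // the pointer position already mapped onto the render texture. All manager
      // calls are assumptions, not confirmed API.
      void SceneViewWindow::onPointerPressed(const Vector2I& rtPos)
      {
          // Handles get first shot at the input; if one is hit, no picking happens.
          mActiveHandle = HandleManager::instance().findUnderPointer(rtPos);
          if (mActiveHandle != nullptr)
              return;

          // No handle hit: these coordinates become the selection start.
          mSelectionStart = rtPos;
          mDraggingSelection = true;
      }

      void SceneViewWindow::onPointerDragged(const Vector2I& rtPos)
      {
          if (mActiveHandle != nullptr)
          {
              mActiveHandle->update(rtPos); // Let the active handle do its thing
              return;
          }

          if (mDraggingSelection)
          {
              // Update the selection endpoint, let ScenePicking resolve the area,
              // then feed whatever it found into Selection.
              Vector<HSceneObject> picked =
                  ScenePicking::instance().pickObjects(mSelectionStart, rtPos);
              Selection::instance().setSceneObjects(picked);
          }
      }

      void SceneViewWindow::onPointerReleased(const Vector2I& rtPos)
      {
          if (mActiveHandle != nullptr)
          {
              mActiveHandle = nullptr; // Clear the active handle
              return;
          }

          mDraggingSelection = false; // The selection drag just ends; nothing else to do
      }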