Hi! I'm a software engineer, interested in how software can be used to improve thought processes.
lewisbowes.com
A simulation of the launch and landing of the Falcon 9 rocket.
License: MIT License
Depends on #51

The final app will contain multiple simulated rigid bodies. For milestone 1, one more independent cube mesh should be added to the sim and visualisation.
- Add another cube `Mesh` instance to the visualisation
- Add another cube to the `Simulation`
The ImGui style should be updated.
- Create a `setImGuiStyle` function in `Visualisation` that completely initialises ImGui with a custom style.
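A minimal sketch of what such a function might look like (the specific rounding and colour values here are placeholders, not the project's actual style):

```cpp
#include "imgui.h"

// Hypothetical setImGuiStyle: start from a stock theme, then override
// whichever parameters the custom style needs.
void setImGuiStyle() {
    ImGui::StyleColorsDark();                       // stock base theme
    ImGuiStyle& style = ImGui::GetStyle();
    style.WindowRounding = 4.0f;                    // example overrides
    style.FrameRounding  = 2.0f;
    style.Colors[ImGuiCol_WindowBg] = ImVec4(0.10f, 0.10f, 0.12f, 1.0f);
}
```

Calling this once after `ImGui::CreateContext()` keeps all style setup in one place.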
The `FreeCamera` moves faster diagonally than it does along a single direction.
- Normalise the movement vector before using it to update the camera's position.
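A minimal sketch of the fix (the helper name and signature are assumptions, not the project's API): normalising the input direction before scaling by speed means diagonal movement covers the same distance per frame as axis-aligned movement.

```cpp
#include <array>
#include <cmath>

// Hypothetical helper: scale a WASD movement direction so that diagonal
// movement is no faster than movement along a single axis.
// `dir` is the raw input direction (e.g. {1, 0, 1} for forward+right).
std::array<double, 3> movementStep(std::array<double, 3> dir, double speed, double dt) {
    double len = std::sqrt(dir[0] * dir[0] + dir[1] * dir[1] + dir[2] * dir[2]);
    if (len > 0.0) {                 // avoid dividing by zero when there is no input
        for (double& c : dir) c = c / len * speed * dt;
    }
    return dir;
}
```

In the codebase this would presumably use `glm::length`/`glm::normalize` on the existing glm movement vector instead of a hand-rolled length.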
Apply lens distortion effects to the main camera. Distortion parameters will change between cameras.
Render:
Write requirements tests for the `Cameras` module.
- `registerCam` should return true on success
- `registerCam` should return false (and do nothing) on failure
- When asked to `bind` a camera that hasn't been `register`ed, `bind` should return false (and do nothing)
- `bind`ing a camera correctly updates the active camera's position with that of the bound camera

There are a few issues with the way user input is being handled around the `FPVCamera`.
- Input should only be routed to the `FPVCamera` while it is already bound (for now)
- The `FPVCamera` should not receive any input when it does not have focus
- When the `FPVCamera` is active and the mouse cursor is hidden, `ImGui` input should be disabled (at the moment it is still possible to interact with `ImGui` components with an invisible cursor)
- Fix the bug above
The visualisation should have a static camera in some fixed position/orientation in the scene (in addition to the `FPVCamera`). Eventually it will have more, but there should be at least two cameras to begin with to allow development of #6.
- Add a static camera to the `Cameras` module

Procedurally generate an engine exhaust plume in GLSL (see master branch shaders).
It should be possible to load a basic obj model file using tinyobjloader. This model represents an instance of static geometry.
- Create an `OBJModel` class with the constructor `OBJModel(const char* objFilepath);`
- This class should have the following public functions:
  - `void draw(glm::dvec3 camPos) const;`
  - `void setTransform(glm::dvec3 position, glm::dquat orientation);`
The visualisation should contain cameras that are attached to coordinate frames and move with them (future drone camera, interstage camera etc)
- Create a `MountedCamera` and register it with the `CameraSystem`
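The frame-attachment maths can be sketched as follows. `Vec3`/`Quat` are simplified stand-ins for `glm::dvec3`/`glm::dquat`, and `mountedCameraPosition` is a hypothetical helper, not project code:

```cpp
#include <cmath>

// Minimal stand-ins for glm::dvec3/glm::dquat to sketch how a MountedCamera
// could derive its world position from its parent coordinate frame.
struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; };            // unit quaternion assumed

// Rotate v by unit quaternion q: v' = v + w*t + cross(q.xyz, t), t = 2*cross(q.xyz, v)
Vec3 rotate(const Quat& q, const Vec3& v) {
    Vec3 u{q.x, q.y, q.z};
    Vec3 t{2.0 * (u.y * v.z - u.z * v.y),
           2.0 * (u.z * v.x - u.x * v.z),
           2.0 * (u.x * v.y - u.y * v.x)};
    return {v.x + q.w * t.x + (u.y * t.z - u.z * t.y),
            v.y + q.w * t.y + (u.z * t.x - u.x * t.z),
            v.z + q.w * t.z + (u.x * t.y - u.y * t.x)};
}

// World position of a camera mounted at `localOffset` on a frame located at
// `framePos` with orientation `frameOri`.
Vec3 mountedCameraPosition(const Vec3& framePos, const Quat& frameOri, const Vec3& localOffset) {
    Vec3 r = rotate(frameOri, localOffset);
    return {framePos.x + r.x, framePos.y + r.y, framePos.z + r.z};
}
```

With glm this reduces to `framePos + frameOri * localOffset`; the camera's world orientation is similarly the frame orientation composed with the mount's local orientation.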
The visualisation should feature an animation playback panel to allow the user to adjust the time playback configuration of the animation. This system should only be concerned with modifying playback config data (see below).
- Create a `PlaybackControlPanel` class that is given access to a `PlaybackConfig` instance (not owned)
- Create a `PlaybackConfig` class to hold the playback config data
- Add the following components to the panel:
Depends on #9
It should be possible to select different named cameras in the visualisation.
- Create a `CameraSettings` panel
- List all cameras registered with the `Cameras` module (selecting any will bind that camera to the current view)

When the visualisation is initialised, the camera jumps to the mouse position.
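A common cause of this bug is treating the very first cursor callback as a movement delta. A sketch of the usual fix (`CursorTracker` is a hypothetical name, not project code): the first sample only seeds the last known position and produces zero movement.

```cpp
#include <utility>

// Hypothetical cursor-delta tracker: the first sample only seeds the last
// known position, so initialising the visualisation produces no camera jump.
struct CursorTracker {
    bool   first = true;
    double lastX = 0.0, lastY = 0.0;

    // Returns the (dx, dy) to apply to the camera for this cursor sample.
    std::pair<double, double> delta(double x, double y) {
        if (first) {                 // seed on the first callback, no movement
            first = false;
            lastX = x; lastY = y;
            return {0.0, 0.0};
        }
        std::pair<double, double> d{x - lastX, y - lastY};
        lastX = x; lastY = y;
        return d;
    }
};
```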
Allow the user to toggle visibility of a debug layer overlay:
A simulation output file can be correctly parsed and displayed as an animated visualisation. But the physics simulation creating this output is not accurate. Collision hitboxes are wrong and the two cubes come to rest floating above the floor.
- Fix the `Simulation` such that the output is realistic

The state of the visualisation should be saved and loaded.
- `FPVCamera` state should be saved and loaded between runs
- `PlaybackConfig` should be saved and loaded between runs

A `StateSnapshot` should contain all information needed to render the system. The `Scene` should be responsible for loading/destroying resources it uses (e.g. `OBJModel`s) and should manage model transform updates using information from the `CameraSystem`.
Input:
- A `StateSnapshot`

Output:
- Fully rendered scene after a call to `draw()`

- Add an `updateState` function to the `Scene`'s public interface to set the transforms of meshes it owns based on information in the `StateSnapshot`
- Add a `drawState` function to the `Scene`, called by the public `draw` function to render everything

The `Scene` class needs to be updated.
- Add a `draw()` function, responsible for drawing everything in the scene (floor plane grid, skybox, models)

Update the project README.
Singletons should be used for classes where there should logically only be one instance.
- Convert the free functions + globals in `Visualisation`, `Cameras`, and `Input` into singleton classes.
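The conversion could follow the standard Meyers-singleton pattern, sketched here on a stand-in class rather than the actual project code:

```cpp
// Sketch of the Meyers-singleton pattern proposed for Visualisation,
// Cameras and Input (shown here on a stand-in class, not project code).
class Input {
public:
    static Input& get() {
        static Input instance;   // constructed once, on first use; thread-safe in C++11+
        return instance;
    }
    Input(const Input&) = delete;
    Input& operator=(const Input&) = delete;

    int pollCount = 0;           // stand-in for real state
private:
    Input() = default;
};
```

Note the trade-off: a later issue in this tracker reverses this decision for the `CameraSystem` precisely because singletons are hard to test.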
Depends on #14
The `Scene` should have a skybox.
- Load the skybox in the `Scene`'s constructor
- Add a `drawSkybox()` function to the `Scene` class, called by the main `draw()` function. This should be drawn before anything else.

Show telemetry output over the course of the simulation with ImPlot.
Write requirements tests for the `FPVCamera` class.
Test:
- the `FPVCamera`'s handling of input

Separate the `FPVCamera`'s handling of input from its receiving of input. `Sensitivity` and `Movement` config information can be encapsulated in a better way.
- Refactor the `handleInput()` function to `process(Input& input)` and have it take in a blob of relevant `Input` to handle (this a) makes it easier to test this class in #10 and b) separates this class from GLFW). The `FPVCamera` should not care where its input comes from.
- Move `Sensitivity` and `Movement` to unnamed structs that are static members of the `FPVCamera` class.

When the `FPVCamera` has focus, the mouse cursor is hidden and the user has no way to recapture it.
- The `FPVCamera` should return the mouse cursor to the user

Depends on #22
`CameraBaseState` currently contains an `aspectRatio` member, but this is never used because the aspect ratio used in the view matrix for rendering is always taken from the screen. In future, in combination with #22, the camera should be decoupled from the screen.
todo: operationalise this
A StateSnapshot
represents the complete state of the simulation at a single instant in time, containing all information the user is interested in. For milestone 1, this is the positions and orientations of both cubes.
Create a `StateSnapshot` class containing:
- cube 1 position (`glm::dvec3`)
- cube 1 orientation (`glm::dquat`)
- cube 2 position (`glm::dvec3`)
- cube 2 orientation (`glm::dquat`)

with:
- a constructor that takes a `json` object (one time point in the simulation output) and parses it to correctly initialise all members
- a `lerp` that allows a user to take in two states and linearly interpolate between them, returning the result

The `History`
class needs to be constructed with output data from the simulation in order to reconstruct a model that can be queried at any point in time. This serves as the bridge between the simulation and visualisation components of the app.
Provide implementations for:
- the `History` constructor
- the `History::stateAt` function
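A sketch of how `stateAt` could bracket the requested time and interpolate, using a simplified one-field `StateSnapshot` as a stand-in for the real class (which holds the `glm::dvec3`/`glm::dquat` cube state):

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Simplified stand-ins for StateSnapshot/History to sketch the time lookup
// and interpolation, not the project's actual classes.
struct StateSnapshot {
    double cubeX;                             // stand-in for full cube state

    static StateSnapshot lerp(const StateSnapshot& a, const StateSnapshot& b, double t) {
        return {a.cubeX + (b.cubeX - a.cubeX) * t};
    }
};

class History {
public:
    // Frames need not be regularly spaced, but must be sorted by time and
    // non-empty.
    History(std::vector<std::pair<double, StateSnapshot>> frames)
        : mFrames(std::move(frames)) {}

    StateSnapshot stateAt(double time) const {
        if (time <= mFrames.front().first) return mFrames.front().second;  // clamp
        if (time >= mFrames.back().first)  return mFrames.back().second;
        // First frame strictly after `time`
        auto hi = std::upper_bound(
            mFrames.begin(), mFrames.end(), time,
            [](double t, const auto& f) { return t < f.first; });
        auto lo = hi - 1;
        double t = (time - lo->first) / (hi->first - lo->first);
        return StateSnapshot::lerp(lo->second, hi->second, t);
    }
private:
    std::vector<std::pair<double, StateSnapshot>> mFrames;
};
```

In the real class, orientation members would be interpolated with `glm::slerp` rather than componentwise lerp.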
The `CameraSystem` should not be a singleton. This class should not be difficult to test.
The `CameraSystem` needs small changes.
- Rename `setViewTransform` to `getViewTransform` and have it return the transform matrix
- Have `Visualisation` call `bgfx::setViewTransform()` with the matrix returned by the `CameraSystem`
- `registerCam` and `bind` should return `bool`s depending on whether they succeed or not

Depends on #24
A navigable state history needs to be generated from simulation output (output file). Once loaded, the user should be free to move to any time point simulated to view the state of the system.
Serialised state blocks in the output file do not need to be regularly spaced in time and/or have a 1:1 relationship with `StateSnapshot`s in the `History`. It should be the responsibility of the `History` class to take in this data and process it to produce frames that can be interpolated.
Input:
- One `double` representing a requested snapshot time (seconds)

Output:
- A `StateSnapshot` containing all relevant information about the system for that instant in time (see #24)
Make a `History` class with the following public interface:
- `load`: `nlohmann::json&` -> `void`
- `getSnapshotAt`: `double` -> `StateSnapshot`
The visualisation application should be separate from ProjectChrono
. The simulation output file is the interface between the two applications (the user should not have to install all of ProjectChrono
just to get the visualisation running, the two are not logically connected).
Replace all references to the ProjectChrono
library with equivalent objects in glm
.
TODO: This issue should be broken down into sub-issues when this feature is implemented. This issue is closer to noting down an idea than specifying work that should be done.
The rocket physics simulation should be a separate program, running as a loopback server for control code to connect to.
Depends on #14
The visualisation should have a solid floor plane shown as a grid to provide a constant reference for camera/model positioning/movement.
- Add a `drawGrid()` function to the `Scene` class
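A sketch of how the grid geometry could be generated as a line list (function and type names are assumptions, not project code):

```cpp
#include <vector>

struct GridVertex { float x, y, z; };

// Sketch: generate line-list vertices for a floor grid of the given spacing,
// centred on the origin in the XZ plane (y = 0). Each grid line contributes
// two vertices, ready for e.g. a bgfx line-list vertex buffer.
std::vector<GridVertex> makeGridVertices(int halfLines, float spacing) {
    std::vector<GridVertex> verts;
    const float extent = halfLines * spacing;
    for (int i = -halfLines; i <= halfLines; ++i) {
        const float p = i * spacing;
        verts.push_back({p, 0.0f, -extent});   // line parallel to Z
        verts.push_back({p, 0.0f,  extent});
        verts.push_back({-extent, 0.0f, p});   // line parallel to X
        verts.push_back({ extent, 0.0f, p});
    }
    return verts;
}
```

Regenerating this buffer is only needed when the grid extent or spacing changes; otherwise it can be built once in the `Scene` constructor.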