Raytracer for CSCI 4471 Computer Graphics class project
Home Page: http://glavin001.github.io/Raytracer
The projection window can be off-center relative to the z-axis of the camera.
Derive a subclass Transform from Object3D. Similar to a Group, a Transform will store a pointer to an Object3D (but only one, not an array). The constructor of a Transform takes a 4x4 matrix as input and a pointer to the Object3D modified by the transformation:
Transform(Matrix &m, Object3D *o);
The intersect routine first transforms the ray into object space, then delegates to the intersect routine of the contained object. Make sure to transform the resulting normal back to world space correctly, using the inverse transpose of the transformation matrix. You may choose to normalize the direction of the transformed ray or leave it un-normalized. If you decide not to normalize the direction, you may need to update some of your intersection code.
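As a sketch of this delegation, here is the special case of a diagonal (nonuniform scale) matrix M = diag(sx, sy, sz); the full Transform would invert a general 4x4 matrix instead. All names are illustrative, not the starter code's API:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };         // stand-in for the class's Vec3f
struct Hit { double t; Vec3 normal; bool ok; };

// Ray vs. unit sphere at the origin. The direction is deliberately left
// un-normalized, so the returned t is also valid in world space.
Hit intersectUnitSphere(Vec3 o, Vec3 d) {
    double a = d.x*d.x + d.y*d.y + d.z*d.z;
    double b = 2.0 * (o.x*d.x + o.y*d.y + o.z*d.z);
    double c = o.x*o.x + o.y*o.y + o.z*o.z - 1.0;
    double disc = b*b - 4.0*a*c;
    if (disc < 0.0) return {0.0, {0, 0, 0}, false};
    double t = (-b - std::sqrt(disc)) / (2.0*a);
    Vec3 p = {o.x + t*d.x, o.y + t*d.y, o.z + t*d.z};
    return {t, p, true};                 // unit-sphere normal at p is p itself
}

// Transform::intersect for M = diag(sx, sy, sz): move the ray into object
// space with M^-1, intersect, then map the normal back with the inverse
// transpose (again M^-1, since M is diagonal) and renormalize.
Hit intersectScaledSphere(Vec3 s, Vec3 o, Vec3 d) {
    Vec3 oo = {o.x/s.x, o.y/s.y, o.z/s.z};   // transformed origin
    Vec3 od = {d.x/s.x, d.y/s.y, d.z/s.z};   // transformed, un-normalized dir
    Hit h = intersectUnitSphere(oo, od);
    if (!h.ok) return h;
    Vec3 n = {h.normal.x/s.x, h.normal.y/s.y, h.normal.z/s.z};
    double len = std::sqrt(n.x*n.x + n.y*n.y + n.z*n.z);
    h.normal = {n.x/len, n.y/len, n.z/len};
    return h;
}
```

For example, a ray from (-5,0,0) along +x against the unit sphere scaled by (2,1,1) hits at t = 3, i.e. the world point (-2,0,0) on the ellipsoid, with world normal (-1,0,0).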
See http://www.cs.smu.ca/~sageev/protected/graphics/assign/assignment-raytracer/RayTracer.html#depthVis
Instead of rendering the colours of objects, this is an alternate rendering to visualize the depth t of objects in the scene (i.e. distance to the nearest intersection). Two input depth values specify the range of depth values which should be mapped to shades of gray in the visualization. Depth values outside this range are simply clamped.
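A minimal sketch of the mapping, assuming (per the linked spec) that the near end of the range maps to white and the far end to black; the function name and the orientation of the gradient are illustrative:

```cpp
#include <algorithm>

// Map a hit distance t to a gray level in [0, 1]:
// depth_min -> 1 (white), depth_max -> 0 (black).
// Values outside the range are clamped before mapping.
double depthToGray(double t, double depth_min, double depth_max) {
    t = std::min(std::max(t, depth_min), depth_max);
    return (depth_max - t) / (depth_max - depth_min);
}
```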
Allow the fog to reflect some light.
More sophisticated procedural textures, e.g. 3D marble, where the user can supply some parameters in the script file.
class Object3D
    virtual bool hit(...)
    virtual Box boundingBox()

Box Sphere::boundingBox() {
    Vector3f min = center - Vector3f(radius, radius, radius);
    Vector3f max = center + Vector3f(radius, radius, radius);
    return Box(min, max);
}
Implement different ray-object intersection methods (e.g. 2 different ray-triangle intersection calculations; 2 different ray-sphere calculations) and compare. Compare with analytical estimate for additional credit.
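As an illustration of the idea, here are two common ray-sphere formulations (names and types are my own, not the starter code's). Both should agree to floating-point precision, which itself makes a convenient unit test for the comparison:

```cpp
#include <cmath>

struct V3 { double x, y, z; };
static double dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static V3 sub(V3 a, V3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Method 1: algebraic. Substitute the ray into the implicit sphere
// equation and solve the quadratic t^2 + b*t + c = 0 (a == 1 because
// the direction d is assumed normalized). Returns the nearest positive
// t, or -1 on a miss.
double sphereAlgebraic(V3 o, V3 d, V3 center, double r) {
    V3 oc = sub(o, center);
    double b = 2.0 * dot(oc, d);
    double c = dot(oc, oc) - r*r;
    double disc = b*b - 4.0*c;
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return (t > 0.0) ? t : -1.0;
}

// Method 2: geometric. Project the center onto the ray, then use the
// right triangle formed by the center, that projection, and the hit point.
double sphereGeometric(V3 o, V3 d, V3 center, double r) {
    V3 L = sub(center, o);
    double tca = dot(L, d);              // distance to closest approach
    double d2 = dot(L, L) - tca*tca;     // squared ray-to-center distance
    if (d2 > r*r) return -1.0;
    double thc = std::sqrt(r*r - d2);
    double t = tca - thc;
    return (t > 0.0) ? t : -1.0;
}
```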
Suggest ideas for other debugging tools as you progress on your project and discuss with me.
Implement a simple procedural texture, e.g. a simple polynomial or trigonometric function of the input coordinates.
When computing direction rays, if you interpolate over the camera angle, your scene will look distorted, especially for wide camera angles, which will give the appearance of a fisheye lens.
Similar to point lights, but brightness is attenuated as a function of distance from their direction.
Rays attenuate according to their length. Allow colour of fog to be specified in the scene file.
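One standard choice, shown here as a sketch: exponential attenuation of the surface colour toward the fog colour as a function of ray length. `applyFog`, `density`, and `fogColor` are illustrative names; in the real renderer both parameters would come from the scene file:

```cpp
#include <cmath>

struct Color { double r, g, b; };

// Exponential fog: the fraction of the surface colour that survives a
// ray of length t is exp(-density * t); the rest is replaced by fog.
Color applyFog(Color surface, Color fogColor, double t, double density) {
    double trans = std::exp(-density * t);
    return { trans*surface.r + (1.0-trans)*fogColor.r,
             trans*surface.g + (1.0-trans)*fogColor.g,
             trans*surface.b + (1.0-trans)*fogColor.b };
}
```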
Visualization: abstract (pure virtual) Camera class, with subclass OrthographicCamera
For constructive solid geometry (CSG), you need to implement a new intersectAll method for your Object3D classes. This function returns all of the intersections of the ray with the object, not just the closest one.
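For a sphere, a minimal intersectAll sketch returns the sorted roots of the usual quadratic (zero, one, or two of them); CSG union, intersection, and difference then operate on the inside/outside intervals these bound. Names and types are illustrative:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct V3 { double x, y, z; };
static double dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Return every t at which the ray o + t*d meets the sphere, sorted
// ascending; not just the closest. A tangent ray yields two equal roots.
std::vector<double> sphereIntersectAll(V3 o, V3 d, V3 center, double r) {
    V3 L = {o.x-center.x, o.y-center.y, o.z-center.z};
    double a = dot(d, d);
    double b = 2.0 * dot(L, d);
    double c = dot(L, L) - r*r;
    double disc = b*b - 4.0*a*c;
    std::vector<double> ts;
    if (disc < 0.0) return ts;           // no intersection
    double s = std::sqrt(disc);
    ts.push_back((-b - s) / (2.0*a));
    ts.push_back((-b + s) / (2.0*a));
    std::sort(ts.begin(), ts.end());
    return ts;
}
```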
Implement one or more additional primitive(s) of your choice, e.g. torus, cone, cylinder, or other. Note that if you implement transformations, then you only need to implement the simple axis-aligned case for any additional primitives. Note that some primitives, e.g. a torus or higher order implicit surfaces, can be implemented by solving for t with a numerical root finder. If you do this, make sure to indicate this in your final report, so that you get appropriate credit for doing so.
Discuss with me for details.
This could be sphere maps or cube maps.
Render multiple passes and blur appropriately.
We provide code to load the texture and texture coordinates for you, along with a texture class to facilitate look-up. You must interpolate texture coordinates in Triangle's intersect function, and look up texture coordinates in Material's shading function. If the material has a valid texture, indicated by t.Valid(), then simply use the texture colour instead of k_d. The texture colour can be retrieved by
Vector3f color = t(u, v);
where (u, v) is the interpolated texture coordinate.
Object3D, with subclasses Sphere, Group
Write a parser that will read obj files, consisting of a list of triangle vertices, into your triangle structure. You may use on-line source code for parsing the actual file format, so this should be a fairly simple task that will then let you import very high-resolution models into your raytracer.
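A minimal sketch of such a parser, handling only plain "v x y z" and "f i j k" lines; real .obj files also allow "f v/vt/vn" triples, negative indices, polygonal faces, and more, which is where the on-line code helps:

```cpp
#include <sstream>
#include <string>
#include <vector>

struct Vertex { double x, y, z; };
struct Tri { int a, b, c; };             // 0-based vertex indices

// Read "v" and "f" records from an .obj stream; everything else
// (comments, normals, texture coordinates) is ignored in this sketch.
void loadObj(std::istream &in,
             std::vector<Vertex> &verts, std::vector<Tri> &tris) {
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ls(line);
        std::string tag;
        ls >> tag;
        if (tag == "v") {
            Vertex v; ls >> v.x >> v.y >> v.z;
            verts.push_back(v);
        } else if (tag == "f") {
            int a, b, c; ls >> a >> b >> c;
            tris.push_back({a - 1, b - 1, c - 1});   // .obj is 1-indexed
        }
    }
}
```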
Add a PerspectiveCamera class that derives from Camera. Choose your favorite internal camera representation. Similar to an orthographic camera, the scene parser provides you with the center, direction, and up vectors. But for a perspective camera, the field of view is specified with an angle (as shown in the diagram).
PerspectiveCamera(Vec3f &center, Vec3f &direction, Vec3f &up, float angle);
We might talk about a "virtual screen" in space. You can calculate the location and extents of this virtual screen using some simple trigonometry. You can then interpolate over points on the virtual screen in the same way you interpolated over points on the screen for the orthographic camera. Direction vectors can then be calculated by subtracting the camera center point from the screen point. Don't forget to normalize!
Note: the distance to the image plane and the size of the image plane are unnecessary. Why?
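A sketch of the resulting ray-direction computation for the fixed frame direction = -z, up = +y, horizontal = +x; the real camera should build this frame from the parsed center, direction, and up vectors. Placing the virtual screen one unit along the view direction is an arbitrary but convenient choice:

```cpp
#include <cmath>

struct V3 { double x, y, z; };
static V3 normalize(V3 v) {
    double l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x/l, v.y/l, v.z/l};
}

// (px, py) are screen coordinates in [0,1]^2; angle is the field of view
// in radians. The virtual screen sits one unit along the view direction
// with half-extent tan(angle/2); any other plane distance would scale the
// extents proportionally and give the same normalized direction.
V3 generateRayDirection(double px, double py, double angle) {
    double half = std::tan(angle / 2.0);
    double sx = (2.0*px - 1.0) * half;
    double sy = (2.0*py - 1.0) * half;
    // screen point minus camera center = direction + sx*horizontal + sy*up
    return normalize({sx, sy, -1.0});
}
```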
Display the absolute value of the coordinates of the normal vector as an (r, g, b) color. For example, a normal pointing in the positive or negative z direction will be displayed as pure blue (0, 0, 1). You should use black as the color for the background (undefined normal).
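The mapping itself is tiny; a sketch with illustrative names:

```cpp
#include <cmath>

struct Color { double r, g, b; };

// Map each component of the (unit) normal to a colour channel via the
// absolute value; pixels with no hit (undefined normal) are black.
Color normalToColor(bool hit, double nx, double ny, double nz) {
    if (!hit) return {0.0, 0.0, 0.0};
    return {std::fabs(nx), std::fabs(ny), std::fabs(nz)};
}
```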
If surfaces are only partially reflective, not as many bounces are needed.
Automatically generate texture coordinates for primitives in addition to triangles (e.g. torus, sphere, etc).
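For a unit sphere, one common parameterization uses the spherical angles: u from the azimuth around the y axis, v from the latitude. The exact seam and pole conventions are a design choice, so treat this as one option rather than the required mapping:

```cpp
#include <cmath>

struct UV { double u, v; };

// Texture coordinates for a point (x, y, z) on the unit sphere:
// u wraps around the y axis, v runs from the south pole (0) to the
// north pole (1).
UV sphereUV(double x, double y, double z) {
    const double pi = std::acos(-1.0);
    double u = 0.5 + std::atan2(z, x) / (2.0 * pi);
    double v = 0.5 + std::asin(y) / pi;      // y in [-1, 1]
    return {u, v};
}
```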
Note that the starter code only loads a limited set of image file types, and the provided Mesh code only loads .obj files with a single texture map.
Add a -shade_back option to your raytracer. When this option is specified, treat both sides (i.e. front and back) of your object surfaces in the same manner. This means you'll need to flip the normal when the eye is on the "wrong" side of the surface, i.e. when the dot product of the ray direction and the normal is positive. Do this normal flip just before you shade a pixel, not within the object intersection code. If the -shade_back flag is not specified, shade back-facing surfaces differently, to aid in debugging. Back-facing surfaces must be detected to implement refraction through translucent objects, and are often not rendered at all for efficiency in real-time applications. Demonstrate the flag both on and off with a triangle and a sphere primitive.
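The flip itself is a one-liner applied at shading time; a sketch with illustrative names:

```cpp
struct V3 { double x, y, z; };

// Just before shading: if -shade_back is on and the eye is on the back
// side (dot(rayDir, n) > 0), flip the normal toward the eye.
V3 shadingNormal(V3 rayDir, V3 n, bool shadeBack) {
    double d = rayDir.x*n.x + rayDir.y*n.y + rayDir.z*n.z;
    if (shadeBack && d > 0.0)
        return {-n.x, -n.y, -n.z};
    return n;
}
```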
Parsing command line arguments & input files
See the section on parsing
If using Linux, use the provided code as a starter for command line arguments to specify input file, output image size and output file.
If using Windows, read in a secondary file that achieves the same thing (i.e. specifying the input file etc).
Testing examples are provided below.
Use the input file parsing code provided to load the camera, background color and objects of the scene.
A simple scene file parser for this assignment is provided. The OrthographicCamera, Group and Sphere constructors and the Group::addObject method you will write are called from the parser. Look in the scene_parser.cpp file for details.