
cg's People

Contributors

arvigj, danielepanozzo, jcdai


cg's Issues

Assignment 1 - Cross product?

I am working on assignment 1, and the Wikipedia page for Graham scan says one can determine whether a point makes a left turn or a right turn from the current line by taking the cross product of the lines formed and looking at the sign. First of all, is this correct? If so, what is the intuition behind it, and is it related to using the determinant? I would like to use the determinant, but I am not sure how to take a determinant with respect to a line rather than between two points.

Wrong object position observed

When doing ray tracing with perspective projection on the parallelogram, I set the center of my parallelogram at (0, 0, 0), the ray origin at (0, 0, 1.5), and the central pixel at (0, 0, 1). Under this setting I assume the resulting parallelogram should also be in the center of the picture. However, it is near the bottom-left side. Could anyone give me some ideas about what might cause this problem?

Partial Credit for Assignment 3

If we do only some parts of assignment 3, but not all of it, could we get partial credit?
For example if we only have time to implement shadows and field of view, can we get 5% (or whatever the percentages work out to be) of extra credit?

Assignment3 result image color

In the document the reference image is grayscale, but my result has color. Could you please provide a colored reference image under the default settings so that I can check whether my implementation is correct?

Sensor, Camera, and Pixel Grid

Hi Professor,

I'm confused by these terms: sensor, camera, pixel grid.

in Assignment 3 Exercise 1
image
According to the README,

The field of view of a perspective camera represents the angle formed between the center of the camera and the sensor (aka the pixel grid through which rays are shot)

so I'm assuming 1 is the camera, 2 is sensor == pixel grid

but in the code annotation, you are saying

The pixel grid through which we shoot rays is at a distance 'focal_length' from the sensor

That makes the sensor and the pixel grid sound like different things, and in the above image, 1 would be camera == sensor and 2 would be the pixel grid.

I understand there is a sensor inside the camera, so when we are talking about camera, are we seeing it as a point, or a w*h sensor pixel grid?

Furthermore, what do the terms sensor, camera, and pixel grid usually mean in computer graphics, and how are they related?

Would you mind clarifying a little bit more?

Really appreciate that!

x_displacement and y_displacement

Hi,

I am having difficulty understanding what x_displacement(2.0/C.cols(),0,0) and y_displacement(0,-2.0/C.rows(),0) do in the code, specifically where the 2.0 and -2.0 come from.

I also saw this formula in the textbook. What are l, r, t, and b in our project, and why is there a 0.5?
image

Can anyone please help me? thank you!

What exactly have to be written?

Do we need to describe what each function is doing and why?
For example: why a determinant is computed and how to compute it, why the points are sorted, what comparison function is used, etc.

Or should it be like a pseudo code of the implementation describing in a detailed manner what is done by each of the lines of a block of code?

How much deeper do we have to delve when writing the description about the implementation of an algorithm?

Is there an existing template that can be used as reference?

Assignment 2 Inquiry

For Assignment 2 I have the following questions.

  1. For Ex.1: Basic Ray Tracing, are we only adding code to raytrace_parallelogram() (for orthographic view) and raytrace_perspective() (for perspective view) for this implementation?

  2. For comparing the difference in the result for a sphere and a parallelogram, can we implement a scene with both of them and compare their results both in shading as well as their difference in orthographic and perspective view?

  3. For Ex.2 Shading, are we only playing with the coefficients of the ambient, specular, and diffuse shading? Additionally, should we clamp each shading term to zero individually, rather than clamping after adding them together?

  4. What does it mean by adding RGB components instead of the current grey-scale one? Do we basically manipulate the color output by multiplying it with a scalar when we write the matrix to png?

Thank you.

best regards,
Chen Song Zhang

Unsigned int as the loop variant in the raster.cpp

Hi Professor,

I was doing assignment 6 and have been playing around with as many corner cases as possible. I spotted that using the unsigned int type for the loop variables in the rasterize_triangle() and rasterize_line() methods of raster.cpp may result in an extremely long loop that seems as if it will never end.
image
This issue happens when ux or uy is a negative number, i.e. when the triangle to be drawn is below or to the left of the view space. Since i and j are unsigned integers while ux and uy are signed, the comparison between i and ux or j and uy makes the compiler convert ux and uy to unsigned as well. Therefore, if ux or uy is negative, it is converted to a very large unsigned integer, which causes a problem like this.
image

Possible fix: declare the loop variables i and j as plain int.

Some confusion for Assignment 5

Dear Professor:

I have some confusion for Assignment 5:

  1. Why does a smaller z value mean the object is closer to us?
  2. Should we load the entire scene with materials, lights, and objects as in Assignment 4, or just load the xxx.off object?
  3. I also see the xxx.obj files in the data folder; is there any starter code to load them?

Thanks

grid position problem

In line 227 the code initializes the grid at [x, y, -5], and the camera is at [0, 0, 5]. But the z value of every sphere is 1 or -1. Does that mean all the spheres are between the camera and the grid?

Intersection of ray with parallelogram

Will we need to use the even/odd test from Assignment 1 to determine whether a ray intersects the parallelogram (if we obtain intersection with plane), or is there a simpler method we can use?

How to compute scale_x and scale_y in Assignment 3?

I assume the theta in the problem description is the "field_of_view". How can I get scale_x and scale_y from "field_of_view" and "aspect_ratio"? I tried tan(theta/2) * f to compute h, which I think is scale_x, but this quantity is apparently larger than 1. I also tried to search in the textbook, but nothing was found. In addition, I notice there is a quantity called "LensRadius" in the scene.json. Is this quantity also needed to compute scale_x and scale_y?

Position of camera in assignment 2

The starter code seems to put the origin at (-1, 1, 1), and the comment is "The camera is perspective, pointing in the direction -z and covering the unit square (-1,1) in x and y".

To me, it seems like the camera is on the plane of the pixels, if it is on the unit square (-1, 1). However, in class, it looked like the drawings showed the camera centered on the pixels, and was a bit removed from the plane of the pixels. Where exactly is the camera?

Segmentation fault with solver

My program compiles, but I get a segmentation fault when I call the solver for a system of linear equations:

Vector3d x = A.colPivHouseholderQr().solve(bmat);

I have defined A with type Matrix3d and bmat with type Vector3d. What might be going wrong here? I did not have any problems when I used this in my last assignment.

How to skip a line in C++?

I am working on the function load_xyz. Since we have a dummy line in the input file, how do I skip that line?

Sphere perspective projection

May I ask why the sphere looks like an ellipsoid when I move it along the x-axis under perspective projection? Did I implement something wrong?

Getting some empty patches in the Ray Traced Dragon

Hi

I implemented the bounding-box intersection check and the AABB tree for rendering the dragon and the bunny. It seems to work fine for the bunny, but the final image I get for the dragon has some empty patches. Since both use precisely the same algorithm, I am not able to understand why there is a difference.

So, is this some error?

If yes, what can be the probable causes for this?

raytrace_bunny_bvh

raytrace

How to open .obj file with meshlab in command tool?

In assignment 1, Ex.1, the instructions say:

Once you complete the assignment, you should be able to open the resulting file with Meshlab:
meshlab output.obj

May I ask how to install the Meshlab CLI? I can't find much related information on Google.

Possible Typo on Notes 5?

On Keynote 05 - Ray Tracing
Slide 26 - Ideal Reflections

For what the image shows
image

Should the formula to calculate r be r=d-2(d•n)n, instead of r=d+2(d•n)n?

Since the projection of d onto n points in the opposite direction from n, the scalar will be negative. Won't the result of 2(d•n)n be a vector pointing down?

Assignment 3 Inquiry Focal Length

Good Evening,

In the JSON file, the camera is located at +5 on the z-axis, but the grid is located at -5 on the z-axis as defined in the program. However, the focal length is also 5.0 in the JSON file. Is this an error, or am I missing something about the definition of the focal length?

Diffuse light in rasterization

I am trying to transfer my specular and diffuse lighting equations from assignment 2 to the flat shading part of assignment 4. I believe the light direction should now be the light position minus the intersection point, normalized. However, in rasterization there is no ray. Should we still compute the intersection of the light ray and the triangle in order to calculate the lighting? Are the position of the light and the view direction part of the uniform?

AABBTree bottom-up construction

Hello Professor,

May I ask how long it would normally take to build the AABB tree for dragon.off bottom-up? I tried my implementation on bunny.off and it worked, although it took much longer than the top-down approach. But when I use it for the dragon it seems to take forever to build the tree, so I am not sure whether there is something wrong with my implementation.

I tried the increase in volume and the distance between centroids as the criteria for choosing two nodes to pair. Both attempts failed to build the tree for the dragon in acceptable time. It seems that at every step I have to scan through all remaining nodes to find the pair with the lowest cost; won't this be too expensive (especially for the dragon, with over 800k triangles)?

Thanks.

Question about Ex.2 Shading

For homework 2, Ex.2 (shading section), are we supposed to simulate both orthographic and perspective views with shading for the sphere and the parallelogram (thus, 4 png files with shading for Ex.2)?

Problem with prepare_assignments.sh

I got the following error while trying to run prepare_assignments.sh (sh prepare_assignment.sh Assignment_2 Assignment_2/writeup.pdf)

prepare_assignments.sh: 9: prepare_assignments.sh: Syntax error: end of file unexpected (expecting "then")

Is it ok if we just zip the folder with the writeup included and submit that?

Assignment 4

If we only implement the 5 tasks from issue #42, then we can't render the bunny and the dragon. Is that correct?

Question on load obj

Does the input obj file of Assignment 1 contain all possible line types, or just v and f lines like the example input?

AABB Tree Construct

The assignment document indicates two implementations of AABB tree construction: top-down and bottom-up. Do we need to implement both construction methods, or is either one fine?

Submission of the animation in Assignment 3

Dear professor or assistants,

I'm sorry to ask this question so late. Since it is a little difficult to add a gif animation to LaTeX, can I put the gif for the result of Ex.5 into the zip file as a part of my report? I don't know whether you will grade the animation by running my code or mainly by viewing the report, so I think I need to explain the missing part of my report.

-Zhiliang

Assignment 3 - Result validation

Hi Professor,

Should our result for assignment 3 look exactly like the image you posted if we use the default settings in the scene.json file? In other words, if my output image looks slightly different, does that mean something went wrong in my code?

result

Thanks!

Do we need reflection and refraction in Assignment 4?

Hi professor, I guess your sample pictures were generated without the reflection and refraction parts, right? I found that if I add reflection and refraction, the recursion significantly slows down my program. So do I need them for this assignment?

How long should the dragon take

I finished implementing the AABBtree and when I tried it with the bunny it went very fast (not instantaneous, but a matter of seconds).
I then went on to try the dragon, and 20 minutes later it is still building the tree and hasn't even started rendering the scene. Is this normal / to be expected? What is a reasonable time for the dragon with our accelerated data structure?

basic raytracing question

When raytracing a parallelogram, we are instructed to "Compute the normal of that intersection point." in the 4th step. But in the code, there is already a way to compute the norm by calling ray_intersection.normalized(). Should we modify this part? If so, can you give some hints?

obj file format

Do we have to be robust to all types of lines in the obj file? The explanation details v and l lines, but the polygon given to us has v and f lines. Also, can we assume there is only one f line?

Extra line in points.xyz file

When I downloaded points.xyz, it seems I got an extra line (line 3002) at the end of the file. Can I assume that when my code is being run by the grader, the points.xyz file will be as it is on the Github with no extra line at the end?

Definition of salient angle

What is the definition of "salient angle"? I was not able to find a consistent definition online. Thank you!

Distortion in perspective raytracing

Hello,

I am trying to generalize the code for raytracing a sphere so that it works with spheres at any position, but the result seems distorted when the sphere is not in the center of the image: it looks somewhat oval (as shown below).
I am wondering whether this is expected, or whether it should never happen, which would mean there are bugs in my code.
shading
Thanks in advance.

Mysteriously resolved - fatal error: 'wchar.h' file not found

My project was running normally yesterday, but today I get an error when I run make: fatal error: 'wchar.h' file not found
I have not installed anything or done anything to change header files or anything funky on my laptop. I am running MacOS Big Sur 11.6. What can I do to fix this? I cannot make any progress with this error.

Normals produce static error

Hello,

I am trying to get the normals of the triangles in the mesh for the flat shading portion of the assignment. To do so I have the following snippet of code.

Eigen::Vector3f a = V.row(F(i,0)).cast<float>();
Eigen::Vector3f b = V.row(F(i,1)).cast<float>();
Eigen::Vector3f c = V.row(F(i,2)).cast<float>();

Eigen::Vector3d normal_3d = ((b-c).cross(c-a)).normalized();

The V and F matrices are exactly the same as they were in the starter code of the last assignment. However, I am getting the following error:

Undefined symbols for architecture x86_64:
  "Eigen::MatrixBase<Eigen::Matrix<float, 3, 1, 0, 3, 1> >::cross_product_return_type<Eigen::Matrix<float, 3, 1, 0, 3, 1> >::type Eigen::MatrixBase<Eigen::Matrix<float, 3, 1, 0, 3, 1> >::cross<Eigen::Matrix<float, 3, 1, 0, 3, 1> >(Eigen::MatrixBase<Eigen::Matrix<float, 3, 1, 0, 3, 1> > const&) const", referenced from:
      load_off_as_triangles_flat(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, Eigen::Matrix<double, -1, -1, 0, -1, -1>&, Eigen::Matrix<int, -1, -1, 0, -1, -1>&, std::__1::vector<VertexAttributes, std::__1::allocator<VertexAttributes> >&) in main.cpp.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [assignment5] Error 1
make[1]: *** [CMakeFiles/assignment5.dir/all] Error 2
make: *** [all] Error 2

I am not sure how to parse this error, and I haven't been able to turn up anything on Google. What does this error mean? Why am I getting it?

Assignment 2 grade?

When will assignment 2's grades be out? I would like to know whether I messed anything up, so I do not build on faulty code for assignment 3.

Assignment 3 Refraction Inquiry

Good evening,

I am just wondering whether there are papers or instructions on how to calculate the color of the refracted ray. For example, what is total internal reflection, and what is considered a "small" incident direction (near 0 degrees to -norm)? Should we account for how many bounces are left, i.e. will refraction also be recursive, since refracted rays may also hit other objects? Additionally, it seems that the refraction index is 1 in the .json file. Does this mean all spheres have refraction index 1, or is the refraction index 0 for all materials since their "Material" value is 0? I am quite confused about the refraction part of this assignment.

Assignment 1 Meshlab

I downloaded Meshlab, and I am trying to use it to check the output of assignment 1. Unfortunately, it looks like it doesn't work as a terminal command, but only works through the GUI for me. I have a Mac computer. Is there an extra step necessary to make it work through the terminal command? Alternatively, how do you open up a .obj file properly through the GUI?

Clarification for the tasks required for assignment 4

I wanted to clarify that for assignment 4 (mandatory one), we only need to implement the following five tasks:

  1. Intersection between triangles and mesh
  2. Ray tracing of triangle and mesh
  3. Intersection of a ray and bounding box
  4. Implementation of AABB
  5. Intersection of ray and mesh using AABB

There are many TODOs in the code for assignment 4, like depth of field, the intersection of sphere and parallelogram, shadow rays, the specular contribution, reflected and refracted rays, and the find_nearest and is_light_visible functions.

We don't need to add all this stuff (shadows, reflection, etc.) for assignment 4? It is only supposed to be for assignment 3, right?

Centroid of bounding box

I have found this method in Eigen that says it produces the centroid of the bounding box: EIGEN_EXPR_BINARYOP_SCALAR_RETURN_TYPE().

The documentation is here.

However, I'm unsure how to actually use it. I think the centroid of the box should be a Vector3d, since it's just a point, but when I try to call it I get
error: no member named 'EIGEN_EXPR_BINARYOP_SCALAR_RETURN_TYPE' in 'Eigen::AlignedBox<double, 3>'

How do I get this value?
