The test is as follows:
Scenario: Intersecting a scaled sphere with a ray
Given r ← ray(point(0, 0, -5), vector(0, 0, 1))
And s ← sphere()
When set_transform(s, scaling(2, 2, 2))
And xs ← intersect(s, r)
Then xs.count = 2
And xs[0].t = 3
And xs[1].t = 7
However, I get 1.5 and 3.5 respectively.
It looks like I have some misunderstanding about how to do the calculations, because I arrive at the same (wrong) result both on paper and in code. So, I will write out my intermediate calculations.
We start by creating a scaling matrix, which is the transformation attached to the sphere:
2 0 0 0
0 2 0 0
0 0 2 0
0 0 0 1
Then we invert it (since we need to transform the ray by the inverse of the sphere's transformation); call the inverse F:
F =
0.5  0    0    0
0    0.5  0    0
0    0    0.5  0
0    0    0    1
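As a sanity check on the inversion step, numpy agrees that the inverse of scaling(2, 2, 2) is scaling(0.5, 0.5, 0.5) (a throwaway check, not part of my renderer):

```python
import numpy as np

# scaling(2, 2, 2) as a 4x4 homogeneous matrix
S = np.diag([2.0, 2.0, 2.0, 1.0])

# its inverse should be scaling(0.5, 0.5, 0.5)
F = np.linalg.inv(S)
```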
Then we multiply the original ray's point (origin) component by it and get:

F * point(0, 0, -5) = point(0, 0, -2.5)
Then we multiply the original ray's vector (direction) component by it and get:

F * vector(0, 0, 1) = vector(0, 0, 0.5)
So our new ray is { point(0, 0, -2.5), vector(0, 0, 0.5) }: its origin lies on the Z axis, looking towards the sphere.
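The two multiplications above can be reproduced with homogeneous coordinates (w = 1 for points, w = 0 for vectors), again just as a check:

```python
import numpy as np

F = np.diag([0.5, 0.5, 0.5, 1.0])  # inverse of scaling(2, 2, 2)

origin = F @ np.array([0.0, 0.0, -5.0, 1.0])    # point: w = 1, expect (0, 0, -2.5)
direction = F @ np.array([0.0, 0.0, 1.0, 0.0])  # vector: w = 0, expect (0, 0, 0.5)
```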
To demonstrate that my algorithm for finding intersections between a ray and a sphere is correct (without writing it out here), I check it backwards: the sphere has radius 1 and is centered at (0, 0, 0). My intersections are 1.5 and 3.5, which are distances from the ray origin (z = -2.5) to the points on the sphere: -2.5 + 1.5 = -1 and -2.5 + 3.5 = 1, and 1 - (-1) = 2, which is the sphere's diameter, so it makes sense? Or does it?
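To make the back-check concrete, here is the standard unit-sphere quadratic |o + t·d|² = 1 (a = d·d, b = 2·o·d, c = o·o − 1) evaluated on the transformed ray. This is a quick numpy sketch, not my renderer code; note that the direction is deliberately left unnormalized:

```python
import numpy as np

o = np.array([0.0, 0.0, -2.5])  # transformed ray origin
d = np.array([0.0, 0.0, 0.5])   # transformed ray direction (NOT unit length)

a = d.dot(d)
b = 2.0 * o.dot(d)
c = o.dot(o) - 1.0

t1, t2 = sorted(np.roots([a, b, c]))  # parametric t values along d

# Euclidean distances from the ray origin to the hit points
p1, p2 = o + t1 * d, o + t2 * d
dist1, dist2 = np.linalg.norm(p1 - o), np.linalg.norm(p2 - o)
```

The t values here are parameters along the unnormalized direction d, while the Euclidean distances to the same hit points differ from them by the factor |d| = 0.5.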