What do you mean by "rotate objects"? Do you mean rotate the texture on the polygon you are drawing, or rotate the entire polygon? Both are done in the draw_sprite() function according to the tutorial. In the example, a polygon is defined by two vertices: float x0, float y0, float x1, float y1. These four values represent two opposite corners of the polygon: xy0 is the top-left corner, and xy1 is the bottom-right corner. These are the on-screen positions where those corners lie.
To rotate the polygon, you need to do some trig on these vertices to rotate them around an origin. The proper way to do this is to pack a matrix with the correct values, then use the Dreamcast's built-in SH4 math functions to transform the vertices quickly, but that's way beyond this quick post. A simpler solution is to compute the resulting locations of those two vertices with simple one-line formulas, which are derived from that matrix math in the first place.
To rotate a point in space, you calculate the x' and y' positions for each vertex from its original x and y values. By "original x and y values" I mean the values the vertices had when they formed an axis-aligned square polygon, before any rotation, so their original values are as follows:
Code: Select all
float x0 = x;                  /* top-left corner */
float y0 = y;
float x1 = x + sprite->width;  /* bottom-right corner */
float y1 = y + sprite->height;
where "x" and "y" are the sprite's on-screen offset, and x + width and y + height give the bottom-right corner of the sprite at that offset. The formula to rotate is as follows:
Code: Select all
x0' = cos(angle)*x0 - sin(angle)*y0
y0' = sin(angle)*x0 + cos(angle)*y0
x1' = cos(angle)*x1 - sin(angle)*y1
y1' = sin(angle)*x1 + cos(angle)*y1
Note that the angle here is in radians, not degrees. To convert, multiply your angle in degrees by pi/180. The four formulas above transform your xy0 into xy0' and xy1 into xy1', the final screen positions that represent the polygon rotated by your angle. One caveat: these formulas rotate the points about the screen origin (0, 0), so the sprite will swing around the top-left of the screen. If you want the sprite to spin in place, subtract the sprite's center from each vertex first, apply the rotation, then add the center back.
This rotates the polygon physically on screen. If you instead want the polygon to stay put and merely rotate its texture mapping, the process is much the same, except that instead of manipulating the vertices xy0 and xy1, you manipulate uv0 and uv1, the texture mapping coordinates. These coordinates map to points on the texture that the PVR will sample from when filling the polygon.

Unlike xy0 and xy1, these are not absolute screen coordinates; they are expressed as fractions of the texture's size. I.e. instead of saying that the top-left corner of your polygon maps to an arbitrary position 10 texels right and 10 texels down in your 100x100 texture, you say it maps to 10% right and 10% down, meaning you divide the texel position by the texture width/height to get the fraction. UV mapping uses fractions of the texture for its coordinate system rather than absolute texel positions because of things like mipmaps, where the hardware switches between pre-scaled copies of the texture at different sizes as a performance optimization, and you want to be sure you're sampling the same part of the texture regardless of the mipmap size.
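The texel-to-UV conversion described above is just a division. A tiny sketch (texel_to_uv is my own helper name, not part of any API):

Code: Select all

```c
/* Convert an absolute texel position into a UV fraction of the
   texture size. E.g. texel 10 in a 100-texel-wide texture maps
   to u = 0.1 (10% across). */
static float texel_to_uv(int texel, int texture_size)
{
    return (float)texel / (float)texture_size;
}
```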
To rotate the uv coordinates, just use the same rotation formulas from above.