Quake 3 lightmaps - PVR Multi-Texture
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
So, I had started working on some code to parse and render Quake 3 bsp's.
viewtopic.php?f=29&t=102059&start=20#p1034000
I put a pause on that when I realized that, for more reasons than one, KGLX was not really well suited for such a task.
As I have since decided to build my own GL api, I have enough finished to start looking back at getting Quake 3 bsps running.
One thing that bothered me was getting lightmaps to render on the DC.
Essentially, Q3 lightmaps are just textures stored in the .bsp file, that are rendered on top of the base texture.
More modern builds of GL support multitexture, so the solution is quite simple in that case.
However, using the DC's PVR, the only way I can imagine is by sending each vertex twice to the PVR, once with the base texture data, and then again with a translucent vertex with the lightmap texture data.
From what I can tell, that is the solution this guy arrived at:
http://yam.20to4.net/dreamcast/index_old.html
Any thoughts welcome
- Ayla
- DC Developer
- Posts: 142
- Joined: Thu Apr 03, 2008 7:01 am
- Has thanked: 0
- Been thanked: 4 times
- Contact:
Re: Quake 3 lightmaps - PVR Multi-Texture
Can't you aggregate the two textures in vram, and render only the final texture?
- Bouz
- DCEmu Junior
- Posts: 46
- Joined: Mon May 10, 2010 3:42 pm
- Location: St. Bauzille de Putois (France)
- Has thanked: 0
- Been thanked: 0
Re: Quake 3 lightmaps - PVR Multi-Texture
You would probably get some strange artifacts when trying to display 2 triangles with the same coordinates. My feeling is that it would produce flickering or unexpected results.
Are there many possible combinations of textures? I imagine it isn't possible to pre-aggregate them?
- BlackAura
- DC Developer
- Posts: 9951
- Joined: Sun Dec 30, 2001 9:02 am
- Has thanked: 0
- Been thanked: 1 time
Re: Quake 3 lightmaps - PVR Multi-Texture
There's no multitexture on the PVR, so you'd have to do multiple rendering passes.
That's what Quake 3 did on cards that didn't support multitexture, and for surfaces that needed more than two passes (some of the surfaces can have two or three layers, plus the lightmap). It's also what Quake 1 and 2 did on cards that didn't support multitexture.
Ayla / Bouz - Yes, it's possible to combine the textures with the lightmaps. That's actually how the software renderer in Quake 1 and Quake 2 works. It combines the textures and lightmaps, stores them in a surface cache, and then just draws from the cache using normal texture mapping. GLQuake and Quake 2 in OpenGL mode don't do this, because if you have hardware with fast alpha blending (3Dfx Voodoo cards, for instance) it's actually slower than just doing two rendering passes.
Quake 3's surfaces are too complicated to combine like that, because they can be animated, and have way more than just texture + lightmap.
Bouz - The flickering is called Z fighting. Usually, if you use exactly the same geometry (same vertices, sent in the same order) and the right depth compare mode, it won't be a problem. If it is, you can offset the lightmaps slightly from the surface, but that's kind of tricky to get right. I've never needed to do that on a PC though, and I don't think Quake / Quake 2 / Quake 3 did either.
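The pre-combine approach described above boils down to a per-texel modulate of the base texture by the lightmap. A minimal sketch in C (function names are illustrative, not from any Quake source):

```c
#include <stdint.h>

/* Modulate one 8-bit channel of a base texel by the lightmap.
   Quake's software surface cache does essentially this once per
   surface, then draws the result with plain texture mapping. */
static uint8_t modulate(uint8_t texel, uint8_t light)
{
    return (uint8_t)((texel * light + 127) / 255);  /* (t*l)/255, rounded */
}

/* Combine an RGB base texture with an RGB lightmap of the same size. */
void combine_surface(const uint8_t *base, const uint8_t *lightmap,
                     uint8_t *out, int texels)
{
    for (int i = 0; i < texels * 3; i++)
        out[i] = modulate(base[i], lightmap[i]);
}
```

A fully-lit lightmap texel (255) leaves the base texture untouched, while darker texels darken it, which is the same result the two-pass blended render produces in hardware.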
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Thank you for the informative, and obviously experienced, response.
BlackAura wrote: There's no multitexture on the PVR, so you'd have to do multiple rendering passes.
That's what Quake 3 did on cards that didn't support multitexture, and for surfaces that needed more than two passes (some of the surfaces can have two or three layers, plus the lightmap). It's also what Quake 1 and 2 did on cards that didn't support multitexture.
That confirms my findings. Forgive me if I ask you more questions in the future.
As BlackAura mentioned, as long as you submit the vertices correctly, there should be no problems.
Bouz wrote: You would probably get some strange artifacts when trying to display 2 triangles with the same coordinates. My feeling is that it would produce flickering or unexpected results.
In fact, the results can look quite nice; notice how the blood splatter blends with the floor texture underneath, using a two-pass approach on the PVR.
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Bump on old thread
Thank you again BlackAura for your insight.
I never posted my results after working on getting Quake 3 BSPs to render with lightmaps using my build of OpenGL.
After I posted this, I had stopped working on the code and focused on other things.
However, I have recently started again from the beginning.
An interesting thing about the Quake 3 BSP format is that the lightmaps are stored in the BSP file itself, while the decal, or base, textures are stored externally.
The lightmaps are stored as 24-bit RGB textures, so all I needed to do was convert the 24-bit colors to 16-bit for use with the PVR's 16-bit RGB565 texture color format, and then bind to OpenGL.
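The 24-bit to RGB565 conversion is just a matter of dropping the low bits of each channel and packing. A sketch of the idea (function names are mine):

```c
#include <stdint.h>

/* Pack one 24-bit RGB texel into the PVR's RGB565 layout:
   5 bits of red, 6 bits of green, 5 bits of blue. */
uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Convert a whole 24-bit lightmap (texels RGB triplets) to RGB565. */
void lightmap_to_rgb565(const uint8_t *rgb, uint16_t *out, int texels)
{
    for (int i = 0; i < texels; i++)
        out[i] = rgb888_to_rgb565(rgb[i * 3], rgb[i * 3 + 1], rgb[i * 3 + 2]);
}
```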
For now, I am using a 3-pass render approach with hardware blending to achieve a multi-texture effect:
Pass 1: Submit Opaque Geometry
Pass 2: Submit Transparent Geometry ( determined by texture flags )
Pass 3: Submit Light Map Geometry
First Attempt: blending modes are set wrong!
Adjusted the blending mode to smooth things out...
CDi Demo of Posted Screens:
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Bigger maps.
Reduced the vertex buffer size of OpenGL to better match the PVR's limit, and to increase the PVR's available texture memory.
LightMaps Disabled:
LightMaps Enabled ( still need to work out a few details )
- Jae686
- Insane DCEmu
- Posts: 112
- Joined: Sat Sep 22, 2007 9:43 pm
- Location: Braga - Portugal
- Has thanked: 0
- Been thanked: 0
Re: Quake 3 lightmaps - PVR Multi-Texture
I just wished I had his source code [http://yam.20to4.net/dreamcast/index_old.html]. I wonder how much I could learn from it.
PH3NOM wrote: So, I had started working on some code to parse and render Quake 3 bsp's.
viewtopic.php?f=29&t=102059&start=20#p1034000
I put a pause on that when I realized that for more reason than one, KGLX was not really well suited for such a task.
As I have since decided to build my own GL api, I have enough finished to start looking back at getting Quake 3 bsps running.
One thing that bothered me, was getting lightmaps to render on the DC.
Essentially, Q3 lightmaps are just textures stored in the .bsp file, that are rendered on top of the base texture.
More modern builds of GL support multitexture, so the solution is quite simple in that case.
However, using the DC's PVR, the only way I can imagine is by sending each vertex twice to the PVR, once with the base texture data, and then again with a translucent vertex with the lightmap texture data.
From what I can tell, that is the solution this guy concluded
http://yam.20to4.net/dreamcast/index_old.html
Any thoughts welcome
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Forgot to mention the frame rate: those screens are without the BSP PVS system and without any sort of frustum culling.
My most recent build has implemented the BSP PVS system ( using the SH4's fast vector math instructions ), z-frustum culling ( which could be optimized to full frustum culling ), and finally near-Z clipping using the facilities of my OpenGL API.
In recent time, I have just decided to add software mip-map generation for my build of Open GL:
( the left image is the 512x512 texture scaled down to 320x320 by the PVR hardware; the right image is my software-generated mip-map at 256x256 being scaled up to 320x320 by the PVR hardware. )
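A software mip-map generator in its simplest form is just a 2x2 box filter per level. A hedged sketch of one downsampling step (not the actual code from the thread):

```c
#include <stdint.h>

/* One downsampling step of a software mip-map chain: average each
   2x2 block of RGB texels into one texel of the next level down.
   src is w*w texels, dst is (w/2)*(w/2); w must be even. */
void mip_halve_rgb(const uint8_t *src, uint8_t *dst, int w)
{
    int hw = w / 2;
    for (int y = 0; y < hw; y++)
        for (int x = 0; x < hw; x++)
            for (int c = 0; c < 3; c++) {
                int sum = src[((2 * y)     * w + 2 * x)     * 3 + c]
                        + src[((2 * y)     * w + 2 * x + 1) * 3 + c]
                        + src[((2 * y + 1) * w + 2 * x)     * 3 + c]
                        + src[((2 * y + 1) * w + 2 * x + 1) * 3 + c];
                dst[(y * hw + x) * 3 + c] = (uint8_t)((sum + 2) / 4);
            }
}
```

Calling this repeatedly (512 -> 256 -> 128 -> ...) produces the whole chain before conversion to RGB565 and upload.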
I have been thinking of a way to do multi-texture much faster than the way I am currently doing it using my build of Open GL.
Basically, right now, making 2 passes, I am submitting each vertex twice.
This means each vertex possibly gets clipped, lit, and transformed each time it is submitted.
My idea is that I can allow the submission of two separate textures ( opaque + alpha ) at almost no extra CPU cost, by computing the output vertex once ( lit, clipped, transformed ) and then copying it into each list ( opaque, alpha ).
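That "compute once, copy into both lists" idea can be sketched roughly like this (the vertex layout and list types are simplified stand-ins, not the real API):

```c
/* Simplified stand-ins for a finished PVR-ready vertex and a vertex
   list -- just enough to show the idea, not the actual structures. */
typedef struct { float x, y, z, u, v; unsigned int argb; } out_vert;
typedef struct { out_vert buf[1024]; int count; } vert_list;

/* Run the expensive per-vertex work (lighting, clipping, transform --
   stubbed out here) exactly once, then copy the finished vertex into
   both the opaque list and the blended (lightmap) list. */
void submit_multitexture(const out_vert *in, int n,
                         vert_list *opaque, vert_list *blended)
{
    for (int i = 0; i < n; i++) {
        out_vert v = in[i];              /* imagine T&L happening here */
        opaque->buf[opaque->count++] = v;
        /* same position, so no Z-fighting; only the u/v (and bound
           texture) would differ for the lightmap layer */
        blended->buf[blended->count++] = v;
    }
}
```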
I agree; it is a shame he has never posted on these forums, or released his sources.
Jae686 wrote: I just wished I had his source code [http://yam.20to4.net/dreamcast/index_old.html]. I wonder how much I could learn from it.
- Jae686
- Insane DCEmu
- Posts: 112
- Joined: Sat Sep 22, 2007 9:43 pm
- Location: Braga - Portugal
- Has thanked: 0
- Been thanked: 0
Re: Quake 3 lightmaps - PVR Multi-Texture
And how does he do the radial blur? Does he copy the whole framebuffer to main memory and perform the blur there? (And if so, how can I get a pointer to the frame buffer?)
- BlueCrab
- The Crabby Overlord
- Posts: 5666
- Joined: Mon May 27, 2002 11:31 am
- Location: Sailing the Skies of Arcadia
- Has thanked: 9 times
- Been thanked: 69 times
- Contact:
Re: Quake 3 lightmaps - PVR Multi-Texture
You can have the PVR render the screen to a texture, which you'd have a pointer to (by virtue of the fact that you have to supply the texture pointer to render to). That's the easiest way to do it, probably.
Otherwise, if you really wanted to hack away at low-level stuff, you could get the frame buffer pointer and go that route as well. Just remember that if you do that and want to use it as a texture later, you'll either have to set it up as a strided texture or you'll have to resize it (whereas the render-to-texture stuff in KOS requires a power-of-two-sized texture to start with, so you can easily use it later as a regular texture).
- Jae686
- Insane DCEmu
- Posts: 112
- Joined: Sat Sep 22, 2007 9:43 pm
- Location: Braga - Portugal
- Has thanked: 0
- Been thanked: 0
Re: Quake 3 lightmaps - PVR Multi-Texture
So if I understood correctly, I would basically allocate a texture in video memory, render to it -> copy it into main memory -> do any sort of post-processing, and then copy it again to VRAM? (I assume that messing directly with the VRAM would be slow.)
BlueCrab wrote: You can have the PVR render the screen to a texture, which you'd have a pointer to (by virtue of the fact that you have to supply the texture pointer to render to). That's the easiest way to do it, probably.
Otherwise, if you really wanted to hack away at low-level stuff, you could get the frame buffer pointer and go that route as well. Just remember that if you do that and want to use it as a texture later, you'll either have to set it up as a strided texture or you'll have to resize it (whereas the render-to-texture stuff in KOS requires a power-of-two-sized texture to start with, so you can easily use it later as a regular texture).
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Even on modern hardware, reading pixels from the GPU back to the CPU is a very slow process!
I have done so using Windows GDI, DirectX, as well as OpenGL.
Out of those, Open GL was the fastest mode, but it only copied its own frame buffer data, not that of Windows, so it did not do what I needed it to.
The CPU I am using is an AMD FX 8320 and the GPU an AMD HD 7870, and it still struggled to hit 30fps at 1920x1080.
This is an area where the new game consoles 'unified memory' architecture really has an advantage...
I have considered compressing the frame buffer on the GPU before sending it back to the CPU, but that is a different story altogether.
Anyhow, that is not how the 'cheap' radial blur effect is achieved.
First, this would be hard to do in KGLX due to lack of render-to-texture. In my build of Open GL this should be easy to implement.
Take a look here (Dave brought this to my attention):
http://nehe.gamedev.net/tutorial/radial ... ure/18004/
It is a pretty simple effect that looks interesting ( I have not tried it yet, but will soon )
Basic overview ( If I understand correctly from taking a very quick look at that page, Dave feel free to correct me):
1.) Set viewport to a smaller region of the screen ( the size of your render-to-texture )
2.) Render your entire scene at the smaller viewport to a texture
3.) Set the viewport to the full screen
4.) Render your entire scene again
5.) With blending enabled, loop over the rendered texture, drawing it as 2D quads overlaid on top of the 3D scene.
Each iteration gradually changes the u/v values to zoom in on the texture, while decreasing the alpha to fade it out.
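The u/v zoom and alpha fade per iteration can be sketched as pure bookkeeping, separate from any actual drawing (the starting alpha and fade factor are arbitrary tuning values, not taken from the NeHe tutorial):

```c
/* Per-pass u/v window and alpha for the blur overlay, following the
   NeHe-style recipe described above. */
typedef struct { float u0, v0, u1, v1, alpha; } blur_step;

void radial_blur_steps(blur_step *steps, int passes, float zoom_per_pass)
{
    float inset = 0.0f;
    float alpha = 0.4f;                  /* opacity of the first layer */
    for (int i = 0; i < passes; i++) {
        steps[i].u0 = inset;             /* shrink the sampled window.. */
        steps[i].v0 = inset;
        steps[i].u1 = 1.0f - inset;      /* ..so each layer zooms in   */
        steps[i].v1 = 1.0f - inset;
        steps[i].alpha = alpha;
        inset += zoom_per_pass;          /* zoom a little further      */
        alpha *= 0.9f;                   /* and fade a little more     */
    }
}
```

Each entry then becomes one blended 2D quad over the scene; with 25 passes and a small zoom step the layers smear outward from the center.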
I now plan to have a go at this effect, If successful I will add it as an example for my Open GL API.
- Jae686
- Insane DCEmu
- Posts: 112
- Joined: Sat Sep 22, 2007 9:43 pm
- Location: Braga - Portugal
- Has thanked: 0
- Been thanked: 0
Re: Quake 3 lightmaps - PVR Multi-Texture
Are you planning on adding support to vertex arrays on your API ?
When will we have a chance to try out your GL API? (I'm getting quite curious.)
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
Yeah, I have implemented Vertex Arrays, in the form of glDrawArrays(...).
However, glDrawElements(...) is not yet supported. If that is of interest to you, let me know and I will move it up my to-do list.
And things are working out nicely, as today's experiment turned out successfully and allowed me to spot and fix a few lingering bugs in the API.
Stay posted for an official release very soon.
Today in some short spare time I have successfully implemented the 'radial blur' effect on DC, using my Open GL API.
Radial Blur Disabled:
Radial Blur Enabled:
Looks pretty cool in motion, I have uploaded an elf here:
First off, we need to allocate some texture memory for the render-to-texture:
After that, rendering the scene is pretty much what I described earlier:
From that, the 'radial blur' is achieved by submitting the render-to-texture result as a 2D quad overlaid on top of the 3D scene:
- PH3NOM
- DC Developer
- Posts: 576
- Joined: Fri Jun 18, 2010 9:29 pm
- Has thanked: 0
- Been thanked: 5 times
Re: Quake 3 lightmaps - PVR Multi-Texture
As it turns out, for radial blur, it is not necessary to render at a smaller viewport.
And I would actually advise against it. Just render to your texture at your normal viewport.
Why? Because I have just implemented a way to copy your submitted vertex data to the PVR for render-to-texture without having to wipe the buffers in main ram.
This means that instead of computing and submitting the final vertex twice ( lit, clipped, transformed ), it only needs to be calculated once.
But this only works right if you do not change the viewport.
The only drawback is that the PVR memory requires a 1024x512 texture to fit the 640x480 screen, as opposed to rendering to a smaller texture.
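The 1024x512 figure falls out of the PVR's power-of-two texture size requirement; a quick sketch of the rounding:

```c
/* Non-strided PVR textures need power-of-two dimensions, so a full
   640x480 frame rounds up to the next powers of two: 1024x512. */
unsigned next_pow2(unsigned v)
{
    unsigned p = 1;
    while (p < v)
        p <<= 1;
    return p;
}
```

next_pow2(640) gives 1024 and next_pow2(480) gives 512, so the render target wastes roughly 40% of the texture area compared to the visible frame.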
Here is the updated code from above:
Code:
if(enable_radial) /* Render scene with radial blur */
{
    /* Draw the GL "scene" */
    draw_gl();

    /* Render the submitted vertex data to a texture */
    glutCopyBufferToTexture(RENDER_TEXTURE, &RENDER_TEXTURE_W, &RENDER_TEXTURE_H);

    /* Now, render the "Radial Blur" post-process effect */
    RenderBlurEffect(25, 0.02f);

    /* Submit vertex data to GPU for display */
    glutSwapBuffers();
}
else /* Render scene with no radial blur */
{
    /* Draw the GL "scene" */
    draw_gl();

    /* Submit vertex data to GPU for display */
    glutSwapBuffers();
}
- Christuserloeser
- Moderator
- Posts: 5948
- Joined: Thu Aug 28, 2003 12:16 am
- Location: DCEvolution.net
- Has thanked: 10 times
- Been thanked: 0
- Contact: