Using blender models in KGL (is jfm the best option?)

Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Using blender models in KGL (is jfm the best option?)

Post by Dev » Mon Dec 21, 2015 9:40 am

I'm very proficient with Blender. I also have some programming experience, but not in graphics. However, OpenGL looks straightforward to me.

So I installed the dev environment: I can compile *.c examples, get triangles on screen, and create my own *.cdi images to load into DCEmul. (I don't currently have my DC with me. I know emulation is a bad idea in the long run.)

I'm looking for a way to import 3D geometry with some walk-cycles into my projects. By googling I found:
http://www.jstookey.com/dreamcast/readme.html

Is this still the best option, or are there better ways? What is the least taxing format? (I want to avoid loading the CPU with work it doesn't have to do.) I see the example uses *.jpg, which probably costs more compute cycles than loading raw bitmaps?

My main concern is performance, as I'm not that familiar with how the hardware works and which programming practices slow things down. I need to read more, but most DC websites are now gone from the web.
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Sat Dec 26, 2015 7:39 am

I will answer my own question here in case anyone else finds this thread via Google.

His Python export script works; you just have to use Blender version 2.41. Current Blender has been rewritten from scratch, so it's totally different.
To test, you can edit the "dreamcast.blend" file and swap the mesh for your own model. On the KOS side the C program compiles and you can get a working *.cdi image to try on a Dreamcast or in an emulator.
The JPEG is converted to PCX at export; I was wrong to assume it processes JPEGs on the fly.

If you add more than two objects, the script crashes:

Code:

Traceback (most recent call last):
  File "<string>", line 487, in select_file
  File "<string>", line 404, in export
  File "<string>", line 291, in GetNT
IndexError: list index out of range
It's not the polygon count that causes this but, I think, the number of meshes. As a test I replaced the Dreamcast with a heavy 39,218-vertex model and it exported OK.

The same way the Dreamcast lid opens and closes in the demo, you could animate a character walk cycle by making each body part a separate Blender object and parenting the parts to each other so they don't float around while keying. (No skeletal animation or morph targets, just LocRot of objects.) I made and animated a simple mannequin using this keying and "rigging" method and it's serviceable; it doesn't look bad at all. It doesn't export to *.jfm though, as the script is partly broken. I looked for other export scripts written specifically with the DC in mind, and it doesn't look like there is anything, at least nothing findable via Google or the Wayback Machine.

This exporter itself is a modified VRML script. It doesn't look too complicated from the inside and could probably be rewritten for modern Blender based on the X3D exporter. (VRML has been obsoleted and replaced by X3D.)

But I'm pretty sure I'm not the only person here trying to export 3D animations and use them on the Dreamcast (other than the Quake and Half-Life ports).

IMHO, instead of everyone writing their own exporter per project, we should write a generic one, once and properly, making sure it's optimized for the DC hardware.
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Sat Dec 26, 2015 11:07 am

Dev wrote: The JPEG is converted to PCX at export; I was wrong to assume it processes JPEGs on the fly.
...
This exporter itself is a modified VRML script. It doesn't look too complicated from the inside and could probably be rewritten for modern Blender based on the X3D exporter. (VRML has been obsoleted and replaced by X3D.)
...
But I'm pretty sure I'm not the only person here trying to export 3D animations and use them on the Dreamcast (other than the Quake and Half-Life ports).

IMHO, instead of everyone writing their own exporter per project, we should write a generic one, once and properly, making sure it's optimized for the DC hardware.
Dev wrote: My main concern is performance, as I'm not that familiar with how the hardware works and which programming practices slow things down. I need to read more, but most DC websites are now gone from the web.
Hey Dev,

I strongly agree. Ph3nom and I have been working on simplifying the tools for development.
The OpenGL library has already improved a lot.
Texture conversion tools have improved (they now allow rectangular instead of square VQ-compressed textures).

We definitely need to work on an exporter plugin for current blender. I'm currently doing some reverse engineering work on a Dreamcast 3D model format.
After I'm done with that I'll have to get it into blender and export it for the DC, so this is something I also set my sights on and would be glad to cooperate on.

Regarding performance, in my personal game project I have written a script that converts textures to the most appropriate Dreamcast format. You really don't want to use PCX for this. Try VQ or paletted textures instead. You can use a paletted texture for a grayscale picture as well (just set the palette to 0-255). Currently I feel the DC's texturing capabilities aren't being used properly, resulting in few, low-resolution textures.
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
PH3NOM
DC Developer
Posts: 574
Joined: Fri Jun 18, 2010 9:29 pm

Re: Using blender models in KGL (is jfm the best option?)

Post by PH3NOM » Sun Dec 27, 2015 2:44 pm

That code is from 2006, so it is obviously using the old KGL API, which has been replaced in the latest KOS.
Make sure you are using latest KOS release:
http://sourceforge.net/p/cadcdev/kallis ... ster/tree/

From a quick look at the code, one thing that needs to change for the new KOS OpenGL API is draw_scene() in kglgame.cc:
Change this:

Code:

void draw_scene() {
#ifdef DREAMCAST
  glKosBeginFrame();    
  draw_gl();
  glKosFinishFrame();
#else
  draw_gl();
  glFlush();
  glutSwapBuffers();
#endif
}
to this:

Code:

void draw_scene() {
  draw_gl();
  glFlush();
  glutSwapBuffers();
}
The render code should work fine, but is not particularly efficient for the DC.
Does the geometry actually change between quads and tris every primitive?
It might be better to group the quads / tris indices separately per-frame, and then batch the primitives down to a single call to glDrawElements()...

Also, I notice that format does not include vertex normals, I guess that is because the old KGL did not handle vertex lighting...
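The batching suggestion can be sketched on the data side in Python (hypothetical helper name; the idea is that the exporter or loader would emit one merged index buffer so the whole mesh goes through a single glDrawElements(GL_TRIANGLES, ...) call):

```python
def batch_indices(tri_indices, quad_indices):
    """Merge separate triangle and quad index lists into one triangle
    index buffer. tri_indices: flat list, 3 entries per triangle.
    quad_indices: flat list, 4 entries per quad (a, b, c, d)."""
    out = list(tri_indices)
    for i in range(0, len(quad_indices), 4):
        a, b, c, d = quad_indices[i:i + 4]
        out += [a, b, c, a, c, d]  # split the quad along the a-c diagonal
    return out
```

With that, the per-primitive type switching disappears and the draw loop issues one call per mesh.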
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Sun Dec 27, 2015 3:10 pm

PH3NOM wrote: That code is from 2006, so it is obviously using the old KGL API, which has been replaced in the latest KOS.
The render code should work fine, but is not particularly efficient for the DC.
Does the geometry actually change between quads and tris every primitive?
It might be better to group the quads / tris indices separately per-frame, and then batch the primitives down to a single call to glDrawElements()...

Also, I notice that format does not include vertex normals, I guess that is because the old KGL did not handle vertex lighting...
I wrote something about model loading/drawing here: viewtopic.php?f=34&t=103358&p=1046135#p1046135
Note how I don't work with individual indices for coordinates, UVs and normals, but instead assemble the vertices into a struct Vertex so they can all be pushed to glDrawArrays/glDrawElements.

For the DC it would be worth considering including a library in the exporter script that creates triangle strips from the triangles, and ditching indices if we can't figure out how to make glDrawElements faster.
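The "assembled vertices" idea amounts to a de-indexing pass in the exporter; a minimal sketch (function and parameter names are mine, not from the linked code):

```python
def assemble_vertices(positions, uvs, corners):
    """De-index per-corner (position_index, uv_index) pairs into one flat
    interleaved array [x, y, z, u, v, ...] suitable for glDrawArrays."""
    out = []
    for pi, ti in corners:
        out.extend(positions[pi])  # x, y, z
        out.extend(uvs[ti])        # u, v
    return out
```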
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Mon Dec 28, 2015 8:49 pm

@PH3NOM
Thanks for the advice. I will use the code when I upgrade KOS.
It might be better to group the quads / tris indices separately per-frame, and then batch the primitives down to a single call to glDrawElements()...
AFAIK all quads are converted to tris in the end, as the graphics processor can only understand tris anyway. When exporting geometry I will not use quads at all, so I don't have to write extra code. We have an option called "convert quads to tris" in Blender, so it's not a problem. If we ever have a new export script for Blender, I think we should make it do this for us.

As 4-vertex faces are a different type of data, I understand how having both tris and quads would create a processing problem. (I guess you need an extra data type with an extra column in the table for the 4th xyz value.) I hadn't thought of this until now. Good point.
Also, I notice that format does not include vertex normals, I guess that is because the old KGL did not handle vertex lighting...
Aren't normals determined by the order in which you present the xyz values for the 3 vertices of a polygon, i.e. clockwise or anti-clockwise?


I see 2 possible ways of making characters animate:

METHOD 1
1. Export different key frames of a walk cycle as a series of 8 *.obj files.
2. Convert the OBJs into simpler plain-text files containing 3D arrays of vertex positions in space.
3. Keep a separate timing array saying which key frame should be loaded on which frame of the loop.
4. Use vertex tweening (LERP?) to blend the files mentioned above into an animation.
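The vertex tweening step is plain linear interpolation between two keyframes; a minimal sketch (hypothetical function name):

```python
def lerp_frames(frame_a, frame_b, t):
    """Blend two keyframe vertex arrays; t=0.0 gives frame_a, t=1.0 gives
    frame_b. Each array is a flat list of floats in the same vertex order."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]
```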

OR

METHOD 2
Would it be easier just to export body parts as meshes and implement bones?
I think a bone as a data type is just 2 vertices (root + tip). Each bone would have a mesh assigned to it, i.e. head, neck, chest, upper arm, lower arm, hand, etc. I think for a humanoid character 20 bones should be enough. This would also allow for possible expansion to stuff like rag-doll and IK if there's processing power left for it.
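A root+tip bone hierarchy boils down to accumulating each parent's transform down the chain. A 2D forward-kinematics sketch under simplifying assumptions (one angle per bone, rigid per-bone meshes, no weights):

```python
import math

def bone_tips(bones):
    """bones: list of (parent_index, length, local_angle) tuples, root-first,
    parent_index -1 for the root. Returns each bone's world-space tip (2D).
    A mesh chunk (head, arm, ...) would be transformed by its bone's result."""
    tips, angles = [], []
    for parent, length, angle in bones:
        base = (0.0, 0.0) if parent < 0 else tips[parent]
        world = angle if parent < 0 else angles[parent] + angle
        tips.append((base[0] + length * math.cos(world),
                     base[1] + length * math.sin(world)))
        angles.append(world)
    return tips
```

Animating then means keying only the per-bone angles, which is far less data than storing whole vertex arrays per frame.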

I'm guessing the benefits and drawbacks of both methods are:
* METHOD 1 uses lots of pre-baked data, so it will consume a lot of memory but fewer compute cycles?
* METHOD 2 will be harder on the processor but will require less RAM?

What's the likelihood of the second option being efficient? Are there any retail games using it?
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Mon Dec 28, 2015 9:04 pm

Dev wrote:@PH3NOM
It might be better to group the quads / tris indices separately per-frame, and then batch the primitives down to a single call to glDrawElements()...
AFAIK all quads are converted to tris in the end, as the graphics processor can only understand tris anyway.
The DC only accepts tristrips. If you supply anything else, libGL will have to convert the data for you.
If you supply 10 quads, you will get 10 triangle strips, each of length 4.
If you supply 10 triangles, you will get 10 triangle strips, each length 3.
That's why I mentioned it would be advantageous to create the triangle strips once, in the exporter.
Dev wrote:
Also, I notice that format does not include vertex normals, I guess that is because the old KGL did not handle vertex lighting...
Aren't normals determined by the order in which you present the xyz values for the 3 vertices of a polygon, i.e. clockwise or anti-clockwise?
You're confusing normals with backface culling.
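To spell out the distinction: winding order determines which side of a triangle is the front (used for culling), and you can derive a flat face normal from it with a cross product, but per-vertex normals for smooth lighting are separate, explicitly stored data. A quick sketch:

```python
def face_normal(p0, p1, p2):
    """Unnormalized face normal of triangle (p0, p1, p2) via the cross
    product of its edges. Swapping two vertices flips the winding order
    and negates the normal."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
```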
Dev wrote: I see 2 possible ways of making characters animate:
...
I'm guessing the benefits and drawbacks of both methods are:
* METHOD 1 uses lots of pre-baked data so it will consume a lot of memory but less compute cycles?
* METHOD 2 will be harder on the processor but will require less RAM?

What's the likelihood of the second option being efficient? Are there any retail games using it?
METHOD 1 (keyframe animation) is useful for small meshes (a swarm of low-polygon birds).
METHOD 2 is better for meshes of higher polygon count and/or high animation count.

METHOD 2 is used commonly on the Dreamcast. It seems like the standard SDK supplied a 3D model format that had a basic skeleton (.nj for geometry, .nja for animation), e.g. Sonic Adventure, Phantasy Star Online. There's even weighted skeletal animation (Shadow Man). The latter would have to be limited to few characters, maybe only the player character.
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Wed Dec 30, 2015 4:15 pm

@bogglez
Do you have a download link for your *.OBJ converter?

What do you use to export only skeletal animation data and how do you handle it?
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Wed Dec 30, 2015 4:33 pm

I have not created an OBJ converter plugin for Blender yet; it's a future plan.
I did, however, create such a program outside of Blender years ago: https://www.youtube.com/watch?v=y8JsVRAgOJI (this is an old YouTube account of mine which I can't access anymore due to the Google buyout..).
It's quite easy to do. I didn't work on an animation format back then, but that is simple as well.
I'd have to research how to export data from Blender, including animation data. I'm also not a proficient Blender user and don't know how to animate with it.

Would you like to work on something like this together?
Roadmap would be:
1. Make a repo with a basic template for a Blender plugin.
2. Export vertices, materials, indices (maybe make indices optional, need to benchmark glDrawArrays vs glDrawElements on DC).
3. Use a library in the plugin to turn the triangles into triangle strips.
4. Export animation.
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Wed Dec 30, 2015 9:46 pm

I exported a few test meshes from Blender as *.x3d and checked what they look like inside. This is the default cube:

Code:

<Coordinate DEF="coords_ME_Cube" point="1.000000 1.000000 -1.000000 1.000000 -1.000000 -1.000000 -1.000000 -1.000000 -1.000000 -1.000000 1.000000 -1.000000 1.000000 0.999999 1.000000 0.999999 -1.000001 1.000000 -1.000000 -1.000000 1.000000 -1.000000 1.000000 1.000000 " />
Looks easy to interpret. Maybe the correct solution, i.e. the one requiring the least work and staying forward-compatible with newer versions of Blender, is to write a library for KOS that interprets X3D files instead of an exporter for Blender?
We could start by ignoring everything in the file apart from geometry, then progressively support more and more X3D features.

Am I on the right track?
PH3NOM
DC Developer
Posts: 574
Joined: Fri Jun 18, 2010 9:29 pm

Re: Using blender models in KGL (is jfm the best option?)

Post by PH3NOM » Wed Dec 30, 2015 11:04 pm

My engine currently uses the Quake 1 and 2 .mdl / .md2 formats.
These formats use key frame animation, with vertex interpolation that can be tessellated to any factor between key frames.
Currently I use 10 frames of interpolation between key frames for smooth animation.
Consider that each player model has something like 198 key frames. There is no way we can cache every frame, times 10 for the interpolated frames, on a DC with 16 MB of RAM.
In the end, you have to build the vertex array each frame for an animated model.

Third-Person Mode: [screenshot]

First-Person Mode: [screenshot]
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Wed Dec 30, 2015 11:58 pm

Why is the Quake format considered inefficient on the DC?
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Thu Dec 31, 2015 10:32 am

I see a guy here:
https://www.youtube.com/watch?v=jC8VYbxdm7E
working in C with OpenGL (same as everyone here) loading an x3d model.

He is only using one generic library, "xmlHelp.h", for parsing XML.

I need to read up more on how X3D works. From what I've seen so far, X3D seems to be the best option.

I see a lot of opportunities here to get bogged down reinventing the wheel. It might look like I'm lazy, but I prefer to think twice and do once. If someone has already made the tool I need, I'm not going to recreate it just to be cool. It's more productive to spend time making stuff that hasn't been done yet.

I wonder if somebody has written an x3d library for the pre-shader version of OpenGL we are using.
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Thu Dec 31, 2015 11:09 am

Dev wrote:I see a guy here:
https://www.youtube.com/watch?v=jC8VYbxdm7E
working in C with OpenGL (same as everyone here) loading an x3d model.

He is only using one generic library, "xmlHelp.h", for parsing XML.

I need to read up more on how X3D works. From what I've seen so far, X3D seems to be the best option.

I see a lot of opportunities here to get bogged down reinventing the wheel. It might look like I'm lazy, but I prefer to think twice and do once. If someone has already made the tool I need, I'm not going to recreate it just to be cool. It's more productive to spend time making stuff that hasn't been done yet.
If you just want to import an existing file format, check for the features it has (e.g. do you need skeletal or key frame animation?) and download an importer library. Something like assimp or that guy's xmlHelp.
Or like Ph3nom you could implement your own little loader for a simple format like https://en.wikipedia.org/wiki/MD2_%28file_format%29
XML is just horrible to work with.
I wonder if somebody has written an x3d library for the pre-shader version of OpenGL we are using.
The loading and rendering are separate. If you can load the data into your graphics engine's data structures, it doesn't matter where the data came from.
I don't know about x3d, but blender has support for quake models, I think.

I'm not trying to reinvent the wheel, but I want to make a proper model format for the DC. That is, using texture formats of the PVR instead of TGAs/PNGs, fast to load, simple (no XML etc), efficient vertex format for the PVR (tristrips).
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Sat Jan 02, 2016 3:09 pm

I'm still familiarizing myself with OpenGL and the platform. This is an interesting article:
https://msdn.microsoft.com/en-us/library/ms834190.aspx

I think it should be in a sticky.

The more I read the more I see there's no way around making an exporter. What's the name of that texture format the PVR requires?
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Sat Jan 02, 2016 3:57 pm

List of texture formats: http://gamedev.allusion.net/docs/kos-cu ... _fmts.html

Basically
RGB565 - 5 bits of red, 6 bits of green, 5 bits of blue, no transparency
ARGB1555 - each color channel has 5 bits, transparency is on or off
ARGB4444 - each channel has 4 bits, use if you need a transparent gradient
BUMP - bumpmap format (stores angles for vertex displacement effects in lighting calculations), similar to a normal map
YUV422 - of no use to you, I guess, mostly used for video decoding

In addition, textures can be twiddled, VQ compressed or strided and mipmapped.
Twiddled: pixels aren't stored row-by-row, but in a z-shape arrangement which puts neighboring pixels closer together. This makes texture filtering cheaper and you definitely want to use it. Twiddled textures must have power-of-two dimensions (e.g. 512x512, 256x1024).
VQ compression: vector quantization is a lossy texture compression. The Dreamcast graphics chip has special hardware to decode it, so you suffer no performance loss for the decoding step when texture data is read by it for drawing. Textures will use less video memory.
Strided: a way to draw non-power-of-two textures by defining gaps
Mipmap: each texture stores smaller versions of itself (512x256 -> 256x128 -> 128x64..). Makes textures 1/3 bigger, texture filtering cheaper and improves rendering performance when textures are displayed at a smaller than original size.
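The "1/3 bigger" figure comes from the geometric series 1 + 1/4 + 1/16 + ... = 4/3: each mip level has a quarter of the pixels of the one above. A quick check (hypothetical helper):

```python
def mipchain_pixels(w, h):
    """Total pixel count of a full mip chain from w x h down to 1x1,
    halving each dimension per level (clamped at 1)."""
    total = 0
    while True:
        total += w * h
        if w == 1 and h == 1:
            return total
        w, h = max(w // 2, 1), max(h // 2, 1)
```

For a 256x256 base (65,536 pixels) the whole chain is 87,381 pixels, almost exactly 4/3 of the base.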

There are also paletted texture formats which can be combined with the above color formats as well as ARGB8888.
4BPP_PAL (or 8BPP_PAL)
1. Define an array of 2^4=16 (or 2^8=256) color values.
2. Copy it to the graphics chip, then send textures that use 4-bit (or 8-bit) indices into that array instead of 16-bit ARGB pixel values.
ARGB8888 is slow when used with filtering.
You can set multiple palettes on the graphics chip at the same time (1024 bytes total)
Palettes are always twiddled and never strided.
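The space saving of 4BPP_PAL comes from packing two palette indices into each byte. A sketch of the packing step (the nibble order on real hardware is an assumption here; check the KOS docs before relying on it):

```python
def pack_4bpp(indices):
    """Pack 4-bit palette indices two per byte, low nibble first (assumed
    ordering). Input length must be even."""
    assert len(indices) % 2 == 0
    return bytes((indices[i] & 0xF) | ((indices[i + 1] & 0xF) << 4)
                 for i in range(0, len(indices), 2))
```

A 256x256 grayscale texture drops from 128 KB at 16 bpp to 32 KB at 4 bpp plus a 16-entry palette.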
You can refer to this guide, although you don't have to stress over the PVR commands since you'd use OpenGL http://elysianshadows.com/updates/hardw ... -dreamcast

There are tools for conversion in the KOS utils folder, but I think the change for non-square VQ textures has not been added there. You can also try this tool which offers a preview as well https://github.com/tvspelsfreak/texconv

And check out this thread for size comparisons with VQ
viewtopic.php?f=29&t=103369&p=1045482
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Sat Jan 02, 2016 6:54 pm

@bogglez
Thanks for all the info.

I made a new git project:
https://github.com/rikto/dc_tri-strip_e ... lender.git

At the moment it doesn't export anything yet, just installs in blender correctly as an add-on and adds "Dreamcast .3dc" to the export list.
(I made up the extension, at this point it doesn't matter)

It's my first Git project and Blender add-on, so it will be a total hack job. I need to read up on how stuff works.
Dev
DCEmu Fast Newbie
Posts: 19
Joined: Mon Dec 21, 2015 7:35 am
Location: Cyberspace

Re: Using blender models in KGL (is jfm the best option?)

Post by Dev » Sun Jan 03, 2016 8:49 am

Did some more hacking around and managed to get vertex data out of Blender objects.

This:

Code:

import bpy
import logging

# Configure logging once, outside the loop; regular print() output is not
# visible in this context.
logging.basicConfig(level=logging.DEBUG,
                    format='(%(threadName)-10s) %(message)s')

for item in bpy.data.objects:
    logging.debug(item.name)

    if item.type == 'MESH':
        vert_list = [vertex.co for vertex in item.data.vertices]
        for vert in vert_list:
            logging.debug(vert)
When run from the Blender scripting window, this prints the vertex data of every mesh object.

The above is based on: http://blenderscripting.blogspot.ie/201 ... ertex.html
But I had to modify it because normal print doesn't work.

Output for default cube:

Code:

(MainThread) Cube
(MainThread) <Vector (-1.0000, -1.0000, -1.0000)>
(MainThread) <Vector (-1.0000, -1.0000, 1.0000)>
(MainThread) <Vector (-1.0000, 1.0000, -1.0000)>
(MainThread) <Vector (-1.0000, 1.0000, 1.0000)>
(MainThread) <Vector (1.0000, -1.0000, -1.0000)>
(MainThread) <Vector (1.0000, -1.0000, 1.0000)>
(MainThread) <Vector (1.0000, 1.0000, -1.0000)>
(MainThread) <Vector (1.0000, 1.0000, 1.0000)>
I have to add the programmatic equivalent of Ctrl+T, which converts quads to tris (should be easy).
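The programmatic Ctrl+T amounts to fan-triangulating each face's index list; a minimal sketch that assumes convex faces:

```python
def triangulate(face):
    """Fan-triangulate a convex polygon given as a list of vertex indices.
    A quad (a, b, c, d) becomes (a, b, c) and (a, c, d)."""
    return [(face[0], face[i], face[i + 1]) for i in range(1, len(face) - 1)]
```

(In the Blender API, I believe bpy.ops.mesh.quads_convert_to_tris() can do this for you before export, but doing it in the exporter keeps the user's mesh untouched.)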

And then the question is how to hack this together into strips.
Found some more reading material: http://www.codercorner.com/Strips.htm

I see this can get hairy very quickly.

To simplify the problem I will not allow problematic non-manifold geometry. I haven't pushed to Git yet.
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Sun Jan 03, 2016 12:05 pm

Dev wrote:Did some more hacking around, managed to get vertex data out from selected object.
I have to add the programmatic equivalent of Ctrl+T, which converts quads to tris (should be easy).

And then the question is how to hack this together into strips.
Found some more reading material: http://www.codercorner.com/Strips.htm

I see this can get hairy very quickly.

To simplify the problem I will not allow problematic non-manifold geometry. I haven't pushed to Git yet.
I'd go with something simple that just works first. Worry about performance later, we need to do some measurements anyway.

An OpenGL library has to implement triangle->tristrip conversion like this: https://bitbucket.org/bogglez/libgl15/s ... draw.c-755
Line 777 copies the triangle vertex data over.
Line 789 then marks every third vertex as the end of a triangle strip.
If you do the same thing once during export ("every triangle is a triangle strip"), you will already outperform this code, because you only do it once instead of every frame.
You can try to find better tristrips in a later commit.

You can also make a single triangle strip, even if the triangles aren't connected. Just duplicate some of the vertices in order to create triangles which have no area, and move on to the next vertex in the strip.
I wrote an example for you: https://bitbucket.org/bogglez/libgl15/s ... ew-default
This should work with the libgl in kos as well.

Here is another example with texture coordinates.
https://bitbucket.org/bogglez/libgl15/s ... ew-default
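The degenerate-triangle stitching trick described above can be sketched like this (a simplified version that ignores winding parity, which a real stitcher must also track, possibly by inserting one extra duplicate):

```python
def stitch_strips(strips):
    """Join several triangle strips into one by repeating the last vertex
    of the previous strip and the first vertex of the next; the repeated
    vertices form zero-area triangles the rasterizer skips."""
    out = []
    for strip in strips:
        if out:
            out += [out[-1], strip[0]]
        out += list(strip)
    return out
```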
bogglez
Moderator
Posts: 576
Joined: Sun Apr 20, 2014 9:45 am

Re: Using blender models in KGL (is jfm the best option?)

Post by bogglez » Sun Jan 03, 2016 12:36 pm

bogglez wrote: You can also make a single triangle strip, even if the triangles aren't connected. Just duplicate some of the vertices in order to create triangles which have no area, and move on to the next vertex in the strip.
I wrote an example for you: https://bitbucket.org/bogglez/libgl15/s ... ew-default
This should work with the libgl in kos as well.
Cough.. yeah, about that. Turns out I found another bug in kos-libgl, and I forgot it doesn't define glGetError.
glVertexPointer needs to be called again every frame. This is non-standard OpenGL and I'll report it to Ph3nom. He also seems to use a non-standard winding order for backface culling, so just disable it for now. I'll discuss this with him.

Code:

        glClearColor(0.1, 0.2, 0.4, 1);
-       glEnable(GL_CULL_FACE);
+//     glEnable(GL_CULL_FACE);
 
        glEnableClientState(GL_VERTEX_ARRAY);
-       glVertexPointer(3, GL_FLOAT, 0, vertexData);
 
        return 0;
 }
@@ -47,6 +46,7 @@ int initGL() {
 void draw() {
        glClear(GL_COLOR_BUFFER_BIT);
 
+       glVertexPointer(3, GL_FLOAT, 0, vertexData);
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 9);
 }
 
@@ -65,11 +65,13 @@ int main(int argc, char **argv) {
 
                glutSwapBuffers();
 
+/*
                error = glGetError();
                if(error) {
                        printf("OpenGL error: %s\n", gluErrorString(error));
                        return -1;
                }
+*/
        }
