Invalid texture U size

kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Invalid texture U size

Post by kazade »

I'm getting the following error in my app:

Code:

*** ASSERTION FAILURE ***
Assertion "0" failed at pvr_prim.c:98 in `pvr_poly_compile': Invalid texture U size
I've checked, and the texture that's bound is a power of two in both width and height, and the same GL code works fine on the PC. I'm not really sure how to debug this. Any ideas?
BlueCrab
The Crabby Overlord
Posts: 5658
Joined: Mon May 27, 2002 11:31 am
Location: Sailing the Skies of Arcadia
Has thanked: 9 times
Been thanked: 69 times
Contact:

Re: Invalid texture U size

Post by BlueCrab »

Texture sizes must be powers of two on the Dreamcast. If you're getting that error, it implies that whatever texture is bound has a width that is not a power of two.
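
As a quick sanity check on the application side, you could assert the dimensions before the upload. Something like this rough sketch (the helper below is hypothetical, not anything libGL provides; if I remember right the PVR also wants each dimension in the 8 to 1024 range):

Code:

    #include <assert.h>

    /* Hypothetical application-side check, not a libGL function. */
    static int is_pot(unsigned int v) {
        return v != 0 && (v & (v - 1)) == 0;
    }

    /* 'width' and 'height' are whatever you pass to glTexImage2D. */
    assert(is_pot(width) && is_pot(height));
    assert(width >= 8 && width <= 1024 && height >= 8 && height <= 1024);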
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

I've logged all glTexImage2D calls, and they are always powers of two (8, 256, or 512).

Any other ideas?

EDIT: Here's my log output:

Code:

DEBUG 0x8c4a5870: Loaded /cd/sample_data/crate.png into GL texture 5 (/simulant/simulant/texture.cpp:107)
DEBUG 0x8c4a5870: Texture 5 has dimensions, W:256 H:256 (/simulant/simulant/texture.cpp:134)
DEBUG 0x8c4a5870: Texture uploaded (/simulant/simulant/texture.cpp:191)
DEBUG 0x8c4a5870: Binding texture 5 to texture unit 0 (/simulant/simulant/renderers/gl1x/gl1x_render_queue_visitor.cpp:82)
DEBUG 0x8c4a5870: Disabling texture unit 1 (/simulant/simulant/renderers/gl1x/gl1x_render_queue_visitor.cpp:86)
DEBUG 0x8c4a5870: Allocating HW buffer of size 1344 (/simulant/simulant/renderers/gl1x/gl1x_buffer_manager.cpp:19)
DEBUG 0x8c4a5870: Allocating HW buffer of size 72 (/simulant/simulant/renderers/gl1x/gl1x_buffer_manager.cpp:19)

*** ASSERTION FAILURE ***
Assertion "0" failed at pvr_prim.c:98 in `pvr_poly_compile': Invalid texture U size

arch: shutting down kernel
maple: final stats -- device count = 1, vbl_cntr = 139, dma_cntr = 139
vid_set_mode: 640x480IL NTSC
Here's how I'm binding the textures (GLCheck is just a template function which calls glGetError after the call; you can see my logging statements):

Code:

    for(uint32_t i = 0; i < MAX_TEXTURE_UNITS; ++i) {
        auto current_tex = current_group_->texture_id[i];
        if(!last_group || last_group->texture_id[i] != current_tex) {
            GLCheck(glActiveTextureARB, GL_TEXTURE0_ARB + i);
            if(current_tex) {
                L_DEBUG(_F("Binding texture {0} to texture unit {1}").format(current_tex, i));
                GLCheck(glEnable, GL_TEXTURE_2D);
                GLCheck(glBindTexture, GL_TEXTURE_2D, current_tex);
            } else {
                L_DEBUG(_F("Disabling texture unit {0}").format(i));
                GLCheck(glBindTexture, GL_TEXTURE_2D, 0);
                GLCheck(glDisable, GL_TEXTURE_2D);
            }
        }
    }
Here's the upload call, again, just to show the logging. internalFormat and format are both GL_RGBA.

Code:

    L_DEBUG(_F("Texture {0} has dimensions, W:{1} H:{2}").format(gl_tex_, width_, height_));
    GLCheck(glTexImage2D,
        GL_TEXTURE_2D,
        0, internalFormat,
        width_, height_, 0,
        format,
        GL_UNSIGNED_BYTE, &data_[0]
    );
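
For reference, GLCheck is essentially this kind of wrapper (a simplified sketch; the real wrapper in Simulant may differ in detail):

Code:

    #include <GL/gl.h>
    #include <cstdio>
    #include <utility>

    // Simplified sketch of the GLCheck idea: forward the call, then poll
    // glGetError so failures are reported immediately after the call site.
    template<typename Func, typename... Args>
    void GLCheck(Func func, Args&&... args) {
        func(std::forward<Args>(args)...);
        GLenum err = glGetError();
        if(err != GL_NO_ERROR) {
            std::fprintf(stderr, "GL error 0x%x\n", (unsigned) err);
        }
    }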
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

Weirdly, if I disable the calls to glTexCoordPointer, the crash goes away (though the texture isn't rendered), so maybe that's the cause.

However, I'm wondering if this ties into my other thread about the 'stride' parameter not working properly, which I believe causes a buffer overrun in this case and may be corrupting things. I'll see if I can fix up stride handling in libGL...
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

I think somebody reported a similar issue long, long ago and never replied to me... X_x
Can you try calling glEnable for textures before and after some of those OpenGL calls, especially the draw call, and possibly the glBindTexture/glTexCoord calls? I think there was some bad state.
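
Roughly what I mean is something like this (just a sketch; current_tex, count and indices are placeholders for whatever your renderer actually uses):

Code:

    /* Sketch only: force the texture enable around the suspect calls and see
       whether the assertion moves or goes away. */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, current_tex);
    glEnable(GL_TEXTURE_2D);

    /* ... set up vertex/texcoord pointers ... */

    glEnable(GL_TEXTURE_2D);
    glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_SHORT, indices);
    glEnable(GL_TEXTURE_2D);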
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

I just tried, but no amount of glEnable(GL_TEXTURE_2D) seems to change anything.

I'm going to try to fix up my other problem with the stride in libgl and see if that fixes it.
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

OK, so I've made some progress in my branch here: https://github.com/Kazade/libgl/commits/master

Firstly, I've fixed up the stride functionality for vertex, colour and texcoord data. I've changed the code to store stride directly as a number of bytes, and to store all the data pointers as GLubyte*, so all pointer arithmetic works in bytes right up to the point where the data is read, at which point it's cast to the appropriate type. If I disable multitexturing completely, I now have a correctly coloured and textured spinning cube.
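
Roughly, the per-element lookup now works like this (a simplified sketch of the idea; the names are illustrative, not the actual libgl variables):

Code:

    #include <GL/gl.h>
    #include <stddef.h>

    /* Illustrative helper, not actual libgl code: fetch the position of vertex i,
       given the stored byte pointer and byte stride. A stride of 0 from the
       application still means "tightly packed", so it's converted to
       size * sizeof(type) up front. */
    static const GLfloat* vertex_position(const GLubyte* vertex_pointer,
                                          GLsizei stride_in_bytes,
                                          GLsizei i) {
        return (const GLfloat*) (vertex_pointer + (size_t) i * stride_in_bytes);
    }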

I've noticed a couple of issues with multitexturing:

1. If you enable texture unit 1 and call glTexCoordPointer, there doesn't seem to be any way to undo that (e.g. if you draw one object with 2 textures, you can't then draw another with 1)
2. If you call glBindTexture(GL_TEXTURE_2D, 0) with glActiveTexture(GL_TEXTURE1), and you've previously called glTexCoordPointer with glClientActiveTexture(GL_TEXTURE1), you end up with the crash (repro sketch below)
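
Here's roughly the sequence that triggers it (a sketch of the call order, not my actual engine code; 'uvs' is a placeholder array):

Code:

    /* Object A: two textures, so unit 1 gets a texcoord array. */
    glClientActiveTexture(GL_TEXTURE1);
    glTexCoordPointer(2, GL_FLOAT, 0, uvs);
    /* ... draw object A ... */

    /* Object B: only one texture, so "unbind" unit 1. */
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, 0);
    /* ... draw object B: the unit-1 texcoord array from object A is still
       submitted, and the assertion fires in pvr_poly_compile. */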

So, this commit "fixes" it https://github.com/Kazade/libgl/commit/ ... 12447682f0

But that causes all texturing to stop working (though it does prevent the crash). I'm just trying to work out why texture 0 isn't working any more.

One other thing I've noticed: back-face culling seems to be reversed. I'm using the same renderer on PC and DC, and I can see the inside of the cube on the Dreamcast but the outside of the cube on the PC. Is that a known thing?
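
For now I can work around it in the engine by flipping which side gets culled on the DC build only, something like this (just a sketch, assuming the _arch_dreamcast define that KOS sets):

Code:

    /* Stopgap sketch until the libgl culling default is sorted out. */
    glEnable(GL_CULL_FACE);
    #ifdef _arch_dreamcast
        glCullFace(GL_FRONT);   /* culling appears mirrored on the DC */
    #else
        glCullFace(GL_BACK);
    #endif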
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

I think a lot of this can be fixed by implementing glEnableClientState/glDisableClientState so that sending of the second set of texture coordinates can be disabled. I also noticed a few other issues:

1. glDrawArrays affects the buffer pointers but doesn't restore them (so 2 sequential glDrawArrays calls will break)
2. Something seems really odd about the _glKosArraysApplyClipping call in glDrawArrays... it uses the second texture coordinate array but doesn't check that it's enabled
3. glDrawArrays' 'first' argument doesn't take stride into account at all (see the sketch below)
4. GL_KOS_FACE_FRONT is too small to store GL_CCW or GL_CW, so the value always gets truncated and glFrontFace doesn't work
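
Here's what I mean for issue 3 (illustrative only, not the actual libgl code):

Code:

    #include <GL/gl.h>
    #include <stddef.h>

    /* 'first' has to advance an array pointer by first * that array's stride
       in bytes before submission starts. Each enabled array (position,
       texcoords, colours, normals) needs the same treatment. */
    static const GLubyte* apply_first(const GLubyte* array_pointer,
                                      GLsizei stride_in_bytes,
                                      GLint first) {
        return array_pointer + (size_t) first * stride_in_bytes;
    }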
Last edited by kazade on Wed May 31, 2017 1:37 pm, edited 1 time in total.
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

OK, one more thought about the crashing issue. According to the glBindTexture documentation...

Code:

The value zero is reserved to represent the default texture for each texture target.
That is, it doesn't mean texturing is disabled or that no texture is bound; it simply replaces the bound texture with the default (which I guess is normally a 1x1 white texture or something). Perhaps a better way to deal with this is: if texturing is enabled for a texture unit but no texture is bound, use a default texture (what's the smallest texture we can use? 8x8?). That would avoid the crash.
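
Something like this is what I have in mind (a sketch only, this isn't in libgl today; 8x8 is, I believe, the smallest size the PVR accepts):

Code:

    #include <GL/gl.h>
    #include <string.h>

    /* Sketch of a lazily created fallback texture for enabled-but-unbound units. */
    static GLuint default_tex = 0;

    static GLuint get_default_texture(void) {
        if(!default_tex) {
            GLubyte white[8 * 8 * 4];
            memset(white, 0xFF, sizeof(white));   /* opaque white */
            glGenTextures(1, &default_tex);
            glBindTexture(GL_TEXTURE_2D, default_tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 8, 8, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, white);
        }
        return default_tex;
    }
    /* ...and bind get_default_texture() whenever a unit is enabled with nothing bound. */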

I'll keep hacking at this, unless someone tells me to stop :p
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

Feel free to keep hacking on it! :o)
Yeah, the reversed culling is definitely a bug. Again, somebody reported it to me and wanted to send a patch, but didn't. Sorry I'm so bad at keeping track of this... there's a libgl bugs thread buried in here which I didn't keep up to date.
BTW, we usually accept patches as .patch files, but pull requests will also work if you dislike that.
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

Is squashing into a single patch OK? I'm not sure each commit applies and works individually...

I've fixed multitexturing, backface culling, and a bunch of other things :)
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

OK, so I've attached the patches in a zip file. There are two folders: commits and squashed - it's up to you how you apply them :)

Here's what I've fixed:
  • Fixed the stride param for glVertexPointer, glColorPointer, glTexCoordPointer and glNormalPointer. Pointers are now stored as GLubyte* and only converted to the destination type after offsets have been applied
  • Fixed the "first" parameter of glDrawArrays to take stride into account
  • Fixed glKosArraysTransformPositions to use stride rather than just assuming 3 floats
  • Fixed a crash if the texture 1 coord array is enabled but no texture is bound to it
  • Fixed backface culling so glFrontFace/glCullFace work as expected and the default is correct
  • Implemented glEnableClientState and glDisableClientState
  • Fixed a bug where no texture would be rendered if GL_TEXTURE0_ARB wasn't the active client texture when drawing
There are still a number of bugs that I know of:
  • Calling glDrawArrays twice in a row without re-calling glXPointer will break
  • The call to glKosArraysApplyClipping in glDrawArrays seems wrong
  • Stride is not correctly handled within glKosArraysApplyClipping and the functions it calls; there's a FIXME comment there
Also, you can probably reduce the amount of code by merging the two index pointers into a single one, passing an index byte size down to all the functions, and always casting the index to a short before using it. Actually, I think glDrawElements and glDrawArrays could both call the same function (with glDrawArrays using i as the index and glDrawElements using indices).
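
To illustrate the idea (a rough sketch only, not the current libgl code):

Code:

    #include <GL/gl.h>

    /* Rough sketch of the merged-draw idea: one submit loop, with the index
       source abstracted behind a fetch function. */
    typedef GLuint (*IndexFetch)(const GLvoid* indices, GLsizei i);

    static GLuint fetch_linear(const GLvoid* indices, GLsizei i) {
        (void) indices;
        return (GLuint) i;                        /* glDrawArrays: the index is just i */
    }

    static GLuint fetch_ushort(const GLvoid* indices, GLsizei i) {
        return ((const GLushort*) indices)[i];    /* glDrawElements, GL_UNSIGNED_SHORT */
    }

    static void submit(GLsizei count, const GLvoid* indices, IndexFetch fetch) {
        GLsizei i;
        for(i = 0; i < count; ++i) {
            GLuint idx = fetch(indices, i);
            /* read vertex 'idx' from each enabled array and submit it to the PVR */
            (void) idx;
        }
    }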
Attachments
patchset.zip
(20.14 KiB) Downloaded 76 times
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

Thanks for the patch set, kazade!

I think I'll have some free time tomorrow to look at these patches, if BlueCrab doesn't want to do it. :)
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

Cool, that would be great. It's probably worth mentioning that I haven't tested glDrawArrays; I use glDrawElements exclusively in my game engine, so that's worth testing. Also note that glEnableClientState now needs to be called (so it behaves the way GL defines), so you might need to change your code slightly if you're not already doing that.
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

If that makes it more OpenGL conformant then I will allow breakage in existing code. I'll probably have to fix some of the OpenGL examples; it would be nice if you could check them too.
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
BlueCrab
The Crabby Overlord
Posts: 5658
Joined: Mon May 27, 2002 11:31 am
Location: Sailing the Skies of Arcadia
Has thanked: 9 times
Been thanked: 69 times
Contact:

Re: Invalid texture U size

Post by BlueCrab »

bogglez wrote:Thanks for the patch set, kazade!

I think I'll have some free time tomorrow to look at these patches, if BlueCrab doesn't want to do it. :)
Take a look and give your opinion. I'll take a look when I get a chance (I'm away for the next few days) and give any final approval to it.

That said, I really don't like breaking old code that previously worked (unless that code specifically relied on buggy behavior), if it's at all possible to avoid it. What I mean is: if you're saying that this patch set would break previously working, compliant code, that's a big problem.
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

To clarify: this won't break any code that is using OpenGL correctly. It would only break code that relies on the fact that, until now, glEnableClientState and glDisableClientState didn't do anything, by simply not calling them.

The reason we need glEnableClientState and glDisableClientState to work is that, before my patch set, if you wanted to draw an object with textures and then draw another with just colours, there was no way to disable sending of texture coordinates (for example). That could cause buffer overruns if the second object had more indices, since it would try to read texture coordinates from the previous object's array.
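
This is what standard GL code looks like once client state actually works (a sketch; the array names are just placeholders):

Code:

    /* Object A: textured. */
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, textured_verts);
    glTexCoordPointer(2, GL_FLOAT, 0, textured_uvs);
    glDrawElements(GL_TRIANGLES, textured_count, GL_UNSIGNED_SHORT, textured_indices);

    /* Object B: colours only, so stop sending UVs. */
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, coloured_verts);
    glColorPointer(4, GL_UNSIGNED_BYTE, 0, coloured_cols);
    glDrawElements(GL_TRIANGLES, coloured_count, GL_UNSIGNED_SHORT, coloured_indices);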
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

This is the same breakage that occurred when the old libgl was replaced with the new libgl... more non-standard behavior was corrected.
The new libgl still has some major compatibility problems that people get annoyed with on a daily basis.
There's really no point in calling it libgl if it doesn't implement the OpenGL specification.
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
bogglez
Moderator
Posts: 578
Joined: Sun Apr 20, 2014 9:45 am
Has thanked: 0
Been thanked: 0

Re: Invalid texture U size

Post by bogglez »

I had a cursory look.

The nehe examples still work the same.

Looking at the code:

- Why is GL_KOS_CULL_FUNC short not byte now? Values are between 0 and 3
- sizeof(GLdouble) will return sizeof(float), not sizeof(double), due to the gcc configure flags for KOS. It's not safe to assume what sizeof(GLdouble) will be, and this may lead to wrong offset calculations. Either we support double properly or we handle this explicitly (and convert from double to float in the code).
- don't cast from GLvoid* to GLubyte* explicitly in C, you could hide errors in the future.
- _glKosArraysTransform*:
   - unsafe to read from an unaligned address (now possible due to GLubyte*), e.g. a float at address 0x0001 instead of 0x0004 should throw a bad read error, I think? (see the sketch below)
   - the dest pointer is now always the same, instead of looping through a dest array?
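
To illustrate the alignment concern (just a sketch; 'base', 'i' and 'stride' are placeholders):

Code:

    #include <string.h>

    /* The kind of read I'm worried about, and a memcpy-based workaround if
       unaligned source data ever needs to be supported. */
    const GLubyte* p = base + i * stride;    /* may not be 4-byte aligned */

    GLfloat x;
    memcpy(&x, p, sizeof(x));                /* safe regardless of alignment */
    /* whereas *(const GLfloat*) p raises an address error on the SH4 if p
       isn't 4-byte aligned */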
Wiki & tutorials: http://dcemulation.org/?title=Development
Wiki feedback: viewtopic.php?f=29&t=103940
My libgl playground (not for production): https://bitbucket.org/bogglez/libgl15
My lxdream fork (with small fixes): https://bitbucket.org/bogglez/lxdream
kazade
Insane DCEmu
Posts: 145
Joined: Tue May 02, 2017 3:11 pm
Has thanked: 3 times
Been thanked: 34 times

Re: Invalid texture U size

Post by kazade »

Thanks for the feedback, comments below!
bogglez wrote:I had a cursory look.

The nehe examples still work the same.

Looking at the code:

- Why is GL_KOS_CULL_FUNC short not byte now? Values are between 0 and 3
OK, actually I think it should be a GLenum initialized to zero, because aside from initialization it's always used to store a GLenum (which was being truncated; that was the bug).
- sizeof(GLdouble) will return sizeof(float), not sizeof(double), due to the gcc configure flags for KOS. It's not safe to assume what sizeof(GLdouble) will be, and this may lead to wrong offset calculations. Either we support double properly or we handle this explicitly (and convert from double to float in the code).
I think everywhere GL_DOUBLE could be passed is already being checked; I only added it for completeness. But actually you're probably right: if we can't support it, it would probably be safer to remove it from the case statement.
- don't cast from GLvoid* to GLubyte* explicitly in C, you could hide errors in the future.
Sorry, I didn't realize... the original code was casting to GLfloat*; I just changed it to GLubyte*.
- _glKosArraysTransform*:
- unsafe to read from unaligned address (now possible due to GLubyte*), e.g. float at address 0x0001 instead of 0x0004 should throw a bad read error, I think?
- dest pointer is now always the same, instead of looping through a dest array?
Isn't that only true if someone passes an invalid stride? I don't see that we have any other choice, as the same buffer can be used for glVertexPointer and glColorPointer (for example) with float data and unsigned bytes.
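
For example, a typical interleaved layout mixes them like this (illustrative; 'verts' is a placeholder array):

Code:

    /* Float positions/UVs and ubyte colours share one buffer, so the library
       has to do byte arithmetic. Alignment stays fine as long as the
       application's layout keeps the floats 4-byte aligned, as here. */
    struct Vertex {
        GLfloat pos[3];     /* offset 0  */
        GLfloat uv[2];      /* offset 12 */
        GLubyte colour[4];  /* offset 20 */
    };                      /* stride == sizeof(struct Vertex) == 24 bytes */

    glVertexPointer(3, GL_FLOAT, sizeof(struct Vertex), &verts[0].pos);
    glTexCoordPointer(2, GL_FLOAT, sizeof(struct Vertex), &verts[0].uv);
    glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(struct Vertex), &verts[0].colour);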

I can't see where the dest pointer is always the same. Which lines do you mean?