  #180  
09-13-2012, 09:38 AM
PixelEngineer
Sarnak
 
Join Date: May 2011
Posts: 96
Default

Quote:
Originally Posted by PiB View Post
I was flipping Z before but not setting the winding order so that's why I was surprised when half of the scene disappeared when I tried to turn on face culling! It seems to work fine now.

I started this project by writing a Python script to load WLD data like zones, zone objects, characters, skeletons etc and import it to Blender as a kind of prototype. When this worked well enough I rewrote the code in C++ and used OpenGL. But I kept the scripts around which can be handy for debugging sometimes.

The more the merrier, I agree! I will upload this code to GitHub too so we can share the code and information.

There are some features you mentioned that I haven't gotten around to yet (properly supporting transparency, animated textures, minor things like the sky box...). How is your work on animations/skeletons going? I am currently trying out different ideas for implementing lighting; it's not as straightforward as I thought it would be.
I am currently working on animations/skeletons and haven't run into any problems so far. It's just a matter of getting all of the fragments loaded into a form that can be used quickly at render time.

For transparency, you really need to use the BSP tree while rendering. You could probably get away without it, but it would be much more work. I recursively render every visible surface in the PVS and view frustum, front to back, to prevent overdraw. Every time I come across a batch of transparent polygons, I push its offset and information onto my "transparency stack". I chose a stack because transparent surfaces must be rendered back to front, and a stack naturally reverses the front-to-back order in which they are encountered.
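In case it helps, here is a minimal sketch of that deferral idea. The `Batch` struct and `drawOrder` function are made up for illustration; in my renderer the front-to-back sequence comes out of the BSP walk:

```cpp
#include <cassert>
#include <stack>
#include <string>
#include <vector>

struct Batch { std::string name; bool transparent; };

// Visit batches in front-to-back order (as a BSP traversal yields them).
// Opaque batches draw immediately; transparent ones are deferred on a
// stack, which pops them back out in back-to-front order.
std::vector<std::string> drawOrder(const std::vector<Batch>& frontToBack) {
    std::vector<std::string> drawn;
    std::stack<Batch> deferred;
    for (const Batch& b : frontToBack) {
        if (b.transparent)
            deferred.push(b);            // defer for the second pass
        else
            drawn.push_back(b.name);     // draw opaque now, front to back
    }
    while (!deferred.empty()) {          // pop order = back to front
        drawn.push_back(deferred.top().name);
        deferred.pop();
    }
    return drawn;
}
```

The nice part is that no sorting is needed: the traversal order does all the work.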

For animated textures, make sure your texture data structure supports multiple bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switches. Keep track of elapsed time, and whenever it exceeds the delay, advance the index of the bitmap you use for that texture.
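Something like this is what I mean by a texture structure holding multiple bitmaps (the field names are just illustrative; the delay value is the one read from the 0x04 fragment):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// A texture that cycles through several bitmaps over time.
struct AnimatedTexture {
    std::vector<int> bitmapIds;   // e.g. GL texture handles, one per frame
    int delayMs = 0;              // ms between switches (from the 0x04 fragment)
    std::size_t current = 0;      // index of the bitmap currently in use
    int elapsedMs = 0;            // time accumulated since the last switch

    // Advance by dtMs; step to the next bitmap each time the
    // accumulated time passes the fragment's delay.
    void update(int dtMs) {
        if (delayMs <= 0 || bitmapIds.empty()) return;  // nothing to animate
        elapsedMs += dtMs;
        while (elapsedMs >= delayMs) {
            elapsedMs -= delayMs;
            current = (current + 1) % bitmapIds.size();
        }
    }

    int currentBitmap() const { return bitmapIds[current]; }
};
```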

The skybox was a bit trickier. Picture someone walking around with a dome fixed around their head: render the dome first, clear the depth buffer, and then render the zone as usual. I can elaborate on this if needed.
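A rough sketch of the dome trick, assuming a column-major 4x4 view matrix like OpenGL's (the `Mat4` helper is made up): zero out the view translation so the dome follows the camera, draw the dome, clear depth, then draw the zone.

```cpp
#include <cassert>

// Minimal column-major 4x4 matrix, laid out like OpenGL's.
struct Mat4 { float m[16]; };

// Zero the translation column so the dome stays centered on the
// viewer's head no matter where they walk in the zone.
Mat4 skyViewMatrix(Mat4 view) {
    view.m[12] = view.m[13] = view.m[14] = 0.0f;
    return view;
}

// Per frame (GL calls shown as comments only):
//   draw dome using skyViewMatrix(view);
//   glClear(GL_DEPTH_BUFFER_BIT);   // so the sky never occludes the zone
//   draw zone using view as usual;
```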

As for lighting, I have implemented just the zone lighting that was present in the original zones. Instead of dynamic lighting or lightmaps like Quake 3 uses, they simply shaded each vertex with the color of nearby light sources. They essentially "faked" the lighting.
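For anyone curious, the per-vertex baking amounts to something like this. The linear falloff and the `radius` field are my assumptions for the sketch; I don't know the exact attenuation the original client used:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3  { float x, y, z; };
struct Light { Vec3 pos; Vec3 color; float radius; };

// Bake a color into one vertex by summing nearby point lights with a
// simple linear distance falloff, clamped so channels stay in [0, 1].
Vec3 vertexColor(const Vec3& v, const std::vector<Light>& lights,
                 const Vec3& ambient) {
    Vec3 c = ambient;
    for (const Light& l : lights) {
        float dx = v.x - l.pos.x, dy = v.y - l.pos.y, dz = v.z - l.pos.z;
        float d = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (d >= l.radius) continue;          // out of range, no contribution
        float a = 1.0f - d / l.radius;        // 1 at the light, 0 at the edge
        c.x += l.color.x * a;
        c.y += l.color.y * a;
        c.z += l.color.z * a;
    }
    c.x = std::min(c.x, 1.0f);
    c.y = std::min(c.y, 1.0f);
    c.z = std::min(c.z, 1.0f);
    return c;
}
```

Since the result is stored per vertex, the zone geometry can then be drawn with plain vertex colors and no runtime lighting at all.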

I will work towards getting my code on GitHub this week as well. Getting static player/NPC models loaded would probably be a good place for me to take a small break. Let me know if you have any other questions in the meantime.

Cheers!