EQEmulator Forums > General > General::General Discussion
General discussion about EverQuest(tm), EQEMu, and related topics.
#1
09-12-2012, 11:02 PM
PixelEngineer
Sarnak
Join Date: May 2011
Posts: 96
The zones were originally written for rendering with DirectX, which differs from OpenGL in the direction of the Z axis. Scale your model-view-projection matrix by -1 in the Z direction, then make the front faces wind clockwise and cull the back faces. That should do it.

Are you using Blender for viewing the models or are you using it as your graphics engine?

The more projects the merrier. If you are interested in helping, I will hustle and get this on GitHub. Regardless, the more information that is out there, the better.

Cheers
#2
09-13-2012, 04:27 AM
PiB
Fire Beetle
Join Date: Aug 2012
Posts: 15

I was flipping Z before but not setting the winding order so that's why I was surprised when half of the scene disappeared when I tried to turn on face culling! It seems to work fine now.

I started this project by writing a Python script to load WLD data (zones, zone objects, characters, skeletons, etc.) and import it into Blender as a kind of prototype. When this worked well enough I rewrote the code in C++ and OpenGL. But I kept the scripts around, which can be handy for debugging sometimes.

The more the merrier, I agree! I will upload this code to GitHub too so we can share the code and information.

There are some features you mentioned that I haven't gotten around to yet (properly supporting transparency, animated textures, minor things like the sky box...). How is your work on animations/skeletons going? I am currently trying out different ideas for implementing lighting; it's not as straightforward as I thought it would be.
#3
09-13-2012, 09:38 AM
PixelEngineer
Sarnak
Join Date: May 2011
Posts: 96

Quote:
Originally Posted by PiB View Post
I was flipping Z before but not setting the winding order so that's why I was surprised when half of the scene disappeared when I tried to turn on face culling! [...] I am currently trying out different ideas for implementing lighting, it's not as straightforward as I thought it would be.
I am currently working on animations/skeletons. I have not run into any problems. It's just a matter of getting all of the fragments loaded in a way that they can be used quickly when rendering.

For transparency, you really need to use the BSP tree while rendering. I assume you could get away without it, but it would be much more work. I render every visible surface that is in the PVS and the frustum, recursing front to back to prevent overdraw. Every time I come across a batch of transparent polygons, I push its offset and information onto my "transparency stack". I chose a stack because transparency must be rendered back to front, and a stack is the ideal data structure given the order the batches are encountered while rendering front to back.

For animated textures, make sure you have created a texture data structure that supports numerous bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switches. Keep track of the time, and when it exceeds the delay, switch the index of the bitmap you use for that texture.

The skybox was a bit more tricky. Just picture someone walking around with a dome around their head. Clear the depth buffer and render as usual. I can elaborate on this if needed.

As for lighting, I have implemented just the zone lighting that was baked into the original zones. Instead of dynamic lighting or lightmaps like Quake 3, they simply shaded the polygons at each vertex with the color of nearby light sources. They essentially "faked" the lighting.

I will work towards getting my code on github this week as well. Probably getting static player/NPC models loaded would be a good place for me to take a small break. Let me know if you have any other questions in the meantime.

Cheers!
#4
09-14-2012, 04:26 AM
PiB
Fire Beetle
Join Date: Aug 2012
Posts: 15

Quote:
Originally Posted by PixelEngineer View Post
I am currently working on animations/skeletons. I have not run into any problems. It's just a matter of getting all of the fragments loaded in a way that they can be used quickly when rendering.
I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space instead of duplicating animations; many characters seem to share them. For example, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). Same for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.

Quote:
Originally Posted by PixelEngineer View Post
For transparency, you really need to use the BSP tree while rendering. I assume you could get away without it but it would be much more work. I render every visible surface that is in the PVS and frustum recursively going front to back to prevent overdraw. Every time I come across a batch of polygons that are transparent, I add the offset and information to my "transparency stack". I chose the stack because you need to render back to front with transparency and a stack is an ideal data structure given the order of entry while rendering front to back.
How do you sort front to back: per object/region using its AABB, or per face? Or can you traverse the BSP front to back somehow? I thought the divisions between the planes were arbitrary. Anyway, that's one more thing I have to implement; right now I'm using an octree and frustum culling for this. I guess this is not the most efficient, but it will probably come in handy for keeping track of characters. One thing I was wondering: isn't the usefulness of the PVS limited in outdoor zones like the Karanas, where you can see from very far away? Obviously I'm sure this works pretty well in dungeons.

Quote:
Originally Posted by PixelEngineer View Post
For animated textures, make sure you have created a texture data structure that supports numerous bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switching. Keep track of the time and if it goes over the amount in the delay, switch index of the bitmap you will use for that texture.
I am using 2D texture arrays so that I can draw static objects (and eventually characters) in one draw call. This should make it straightforward to support animated textures. Each vertex has a third texture coordinate for the layer in the array. I plan to add a kind of offset table in the vertex shader so that you can specify alternate textures (if animated, or for different kinds of equipment) without changing texture bindings.

Quote:
Originally Posted by PixelEngineer View Post
The skybox was a bit more tricky. Just picture someone walking around with a dome around their head. Clear the depth buffer and render as usual. I can elaborate on this if needed.
How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?

Quote:
Originally Posted by PixelEngineer View Post
As for lighting, I have implemented just the zone lighting that was in the original zones. Instead of dynamic lighting or lightmaps like Quake 3, they simply shaded the polygons with the color of nearby lightsources at each vertex. They essentially "faked" the lighting.
I think I have tried something similar. I started by determining which lights affect an object and sending that array of lights to the vertex shader for the object. This approach didn't scale very well with a lot of lights. I then tried computing the per-vertex lighting once, when loading the zone files; that way I didn't have to do any lighting in shaders, but the result was quite ugly (since it's done per vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass), but I think this will be quite a lot of work.

Quote:
Originally Posted by PixelEngineer View Post
I will work towards getting my code on github this week as well. Probably getting static player/NPC models loaded would be a good place for me to take a small break. Let me know if you have any other questions in the meantime
Sounds good, keep up the good work!
#5
09-14-2012, 09:39 PM
PixelEngineer
Sarnak
Join Date: May 2011
Posts: 96

Quote:
Originally Posted by PiB View Post
I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space instead of duplicating animations. Many characters seem to share animations. For examples, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). Same for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.
Is that your program? Nice work. Do you have a copy of the source?

Quote:
Originally Posted by PiB View Post
How do you sort from front to back, do you do it per object/region using its AABB or per face? Or can you traverse the BSP front-to-back somehow? I thought the division between planes were arbitrary. Anyway that's one more thing I have to implement, right now I'm using an octree and frustum culling for this. I guess this is not the most efficient. But it will probably come in handy for keeping track of characters. One thing I was wondering, isn't the usefulness of the PVS limited in outdoor zones like the Karanas where you can see from very far away? Obviously I'm sure this works pretty well in dungeons.
The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node, and things behind it are found on the right. The way to render front to back correctly is to traverse the tree recursively, at each node first visiting the child on the same side of the split plane as the camera.

In terms of rendering transparency back to front: as I mentioned, I use a stack. It holds the offset into my VBO as well as the number of polygons. Because a stack is a last-in, first-out data structure, the polygon batches pushed first while rendering front to back come out last, which is exactly the back-to-front order transparency needs.

Here is some tree traversal code demonstrating what happens:

Code:
    // Signed distance from the camera to this node's split plane
    float distance = (camera.getX() * tree[node].normal[0])
                   + (camera.getY() * tree[node].normal[1])
                   + (camera.getZ() * tree[node].normal[2])
                   + tree[node].splitdistance;

    if (distance > 0)
    {
        // Camera is in front of the plane: render the front (left) side first
        renderGeometry(cameraMat, tree[node].left, curRegion);
        renderGeometry(cameraMat, tree[node].right, curRegion);
    }
    else
    {
        // Camera is behind the plane: render the back (right) side first
        renderGeometry(cameraMat, tree[node].right, curRegion);
        renderGeometry(cameraMat, tree[node].left, curRegion);
    }
I suppose you can use an octree, but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that in open zones the PVS does not cull many of the regions you can't see, but it's a very inexpensive check to do.

Quote:
Originally Posted by PiB View Post
How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?
I think you are misunderstanding what skydomes really are. EverQuest's skydomes are small half-spheres that translate and rotate with the camera. They are drawn first and give the impression that the sky is very large despite being exactly the opposite. Picture someone walking around with a sky-textured bowl on their head: because it moves with them, it gives the illusion that the sky is vast and infinite. If you instead stretched the skydome over the entire zone and walked the distance, you would notice yourself approaching the edge of the dome and it would look a bit strange.

The first thing you should render is the skydome; then clear the depth buffer. Because the skydome is inches away from the camera, if you didn't clear the depth buffer none of the zone would render, since it is all farther away than the skydome actually is. After clearing the depth buffer, render as you usually do. It gives the illusion that behind everything rendered is a vast sky.

Quote:
Originally Posted by PiB View Post
I think I have tried something similar. I started with determining which lights affect an object and send the array of lights used in the vertex shader for the object. This approach didn't scale very well with a lot of lights. I tried to compute the per-vertex lighting once, when loading the zone files. Then I didn't have to do any lighting in shaders but the result was quite ugly (since it's done per-vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass) but I think this will be quite a lot of work.
First, determine what your goal is. Mine is to have the zones rendering as close to classic EverQuest as possible. The original lighting was simply precomputed vertex colors which are blended with the textures to give the appearance of lighting. Objects also have vertex colors, as the EverQuest client did not dynamically shade objects. I assume lights.wld contains lighting details just for shading player and mob models.

After I have everything rendering, I will move on to per-pixel lighting. You are correct that per-pixel lighting with the provided surface normals will not look good at all. You really need normal maps for any surface rendered with Phong shading.
#6
09-15-2012, 08:38 PM
PiB
Fire Beetle
Join Date: Aug 2012
Posts: 15

Quote:
Originally Posted by PixelEngineer View Post
Is that your program? Nice work. Do you have a copy of the source?
It is. Thanks! I pushed my local Git repo to GitHub if you want to take a look. It should build on both Windows and Linux (I haven't tested Mac OS X) using CMake. I really should add a README with build instructions.

Quote:
Originally Posted by PixelEngineer View Post
The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node and things behind it will be found on the right. The way to correctly render front to back is to recursively iterate the tree visiting the child nodes the camera is in front of first.
I didn't know that the planes in the BSP tree always split the space into front / back half-spaces. Thanks for the clarification.

Quote:
Originally Posted by PixelEngineer View Post
In terms of rendering transparency back to front, as I mentioned, I use a stack. It holds the offset in my VBO as well as the number of polygons. Because a stack is a last in first out data structure when I render front to back the polygon batches that go in come out last.

Here is some tree traversal code demonstrating what happens:

...

I suppose you can use an octree but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that the PVS is pretty inefficient at reducing a lot of the regions you can't see but it's a very inexpensive check to do.
Yeah, I'm working on using this BSP tree now. It should be faster to traverse than my octree, and I need it to determine the ambient light of the region anyway (even though many zones seem to use the same ambient light in all regions). The traversal code looks really straightforward, thanks.

Quote:
Originally Posted by PixelEngineer View Post
I think you are misunderstanding what skydomes really are. [...] The first thing you should render is the skydome. Then, clear the depth buffer. [...] It will give the illusions that behind everything rendered is a vast sky.
Ah, that makes sense. I think I understand now. Does this mean you render the sky box in camera space?