|
|
|
09-14-2012, 04:26 AM
|
Fire Beetle
|
|
Join Date: Aug 2012
Posts: 15
|
|
Quote:
Originally Posted by PixelEngineer
I am currently working on animations/skeletons. I have not run into any problems. It's just a matter of getting all of the fragments loaded in a way that they can be used quickly when rendering.
|
I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space rather than duplicating animations; many characters seem to share them. For example, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). The same goes for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.
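One way to handle this sharing at load time is a small fallback table: if a model has no animation fragments of its own, borrow them from the model that does. A sketch in C++ (the three-letter model codes and the exact mapping below are illustrative placeholders, not verified WLD codes):

```cpp
#include <map>
#include <string>

// Illustrative fallback table: if a model has no 0x14 animation
// fragments of its own, load them from another model's skeleton.
// The codes here are placeholders, not confirmed WLD model codes.
static const std::map<std::string, std::string> kAnimationSource = {
    {"BAM", "ELF"},  // barbarian -> wood elf animations
    {"HUM", "ELF"},  // human     -> wood elf animations
    {"ERU", "ELF"},  // erudite   -> wood elf animations
};

// Returns the model whose animations should be loaded for `model`;
// falls back to the model itself when it owns its animations.
std::string animationSource(const std::string& model) {
    auto it = kAnimationSource.find(model);
    return it != kAnimationSource.end() ? it->second : model;
}
```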
Quote:
Originally Posted by PixelEngineer
For transparency, you really need to use the BSP tree while rendering. I assume you could get away without it but it would be much more work. I render every visible surface that is in the PVS and frustum recursively going front to back to prevent overdraw. Every time I come across a batch of polygons that are transparent, I add the offset and information to my "transparency stack". I chose the stack because you need to render back to front with transparency and a stack is an ideal data structure given the order of entry while rendering front to back.
|
How do you sort from front to back - do you do it per object/region using its AABB, or per face? Or can you traverse the BSP front to back somehow? I thought the divisions between planes were arbitrary. Anyway, that's one more thing I have to implement; right now I'm using an octree and frustum culling for this. I guess this is not the most efficient, but it will probably come in handy for keeping track of characters. One thing I was wondering: isn't the usefulness of the PVS limited in outdoor zones like the Karanas, where you can see from very far away? I'm sure this works pretty well in dungeons, though.
Quote:
Originally Posted by PixelEngineer
For animated textures, make sure you have created a texture data structure that supports numerous bitmaps. One of the unknowns in the 0x04 fragments is the millisecond delay between texture switching. Keep track of the time and if it goes over the amount in the delay, switch index of the bitmap you will use for that texture.
|
I am using 2D texture arrays for textures so that I can draw static objects (and eventually characters) in one draw call. This should make it straightforward to have animated textures. Each vertex has a third texture coordinate for the layer in the array. I plan to add a kind of offset table in the vertex shader so that you can specify alternate textures (if animated, or to have different kinds of equipment) without changing texture bindings.
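The frame-switching arithmetic is small enough to sketch. The function below assumes the millisecond delay comes from the 0x04 fragment and that each animated texture knows its frame count, as described in the quoted post; the resulting index would feed the layer offset table:

```cpp
#include <cstdint>

// Picks the bitmap index for an animated texture, given the total
// elapsed time in milliseconds, the per-frame delay (from the 0x04
// fragment), and the number of bitmaps in the cycle.
uint32_t animatedFrame(uint32_t elapsedMs, uint32_t delayMs, uint32_t frameCount) {
    if (delayMs == 0 || frameCount == 0)
        return 0;  // static texture: always the first bitmap
    return (elapsedMs / delayMs) % frameCount;
}
```

Driving it from a running clock rather than accumulating per-switch deltas keeps all animated textures in sync regardless of frame rate.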
Quote:
Originally Posted by PixelEngineer
The skybox was a bit more tricky. Just picture someone walking around with a dome around their head. Clear the depth buffer and render as usual. I can elaborate on this if needed.
|
How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?
Quote:
Originally Posted by PixelEngineer
As for lighting, I have implemented just the zone lighting that was in the original zones. Instead of dynamic lighting or lightmaps like Quake 3, they simply shaded the polygons with the color of nearby lightsources at each vertex. They essentially "faked" the lighting.
|
I think I have tried something similar. I started by determining which lights affect an object and sending the array of lights affecting the object to the vertex shader. This approach didn't scale very well with a lot of lights. I then tried to compute the per-vertex lighting once, when loading the zone files. That way I didn't have to do any lighting in shaders, but the result was quite ugly (since it's done per vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass), but I think this will be quite a lot of work.
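Baking the per-vertex contributions once at load time might look like the sketch below. The light structure and the linear falloff are assumptions for illustration only - the thread doesn't specify the attenuation curve the original client used:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

struct PointLight {
    Vec3  pos;
    float radius;  // assumed cutoff: no effect beyond this distance
    Vec3  color;   // 0..1 per channel
};

// Accumulates light into a baked vertex color, clamped to 1.0 per
// channel. Linear attenuation is an assumption for illustration;
// the original client's exact falloff curve is not documented here.
Vec3 bakeVertexColor(const Vec3& vertex, const PointLight* lights, int count) {
    Vec3 out{0.f, 0.f, 0.f};
    for (int i = 0; i < count; ++i) {
        float dx = vertex.x - lights[i].pos.x;
        float dy = vertex.y - lights[i].pos.y;
        float dz = vertex.z - lights[i].pos.z;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        float atten = std::max(0.f, 1.f - dist / lights[i].radius);
        out.x = std::min(1.f, out.x + lights[i].color.x * atten);
        out.y = std::min(1.f, out.y + lights[i].color.y * atten);
        out.z = std::min(1.f, out.z + lights[i].color.z * atten);
    }
    return out;
}
```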
Quote:
Originally Posted by PixelEngineer
I will work towards getting my code on github this week as well. Probably getting static player/NPC models loaded would be a good place for me to take a small break. Let me know if you have any other questions in the meantime
|
Sounds good, keep up the good work!
|
|
|
|
|
|
|
09-14-2012, 09:39 PM
|
Sarnak
|
|
Join Date: May 2011
Posts: 96
|
|
Quote:
Originally Posted by PiB
I think one issue you will probably run into is that many characters have few or no animations you can load through 0x14 fragments. I am pretty sure the reason was to save space rather than duplicating animations; many characters seem to share them. For example, barbarians, dark/high/half elves, erudites and humans all use wood elf animations with some additions (see video). The same goes for dragons and quite a lot of mobs. I have made a list of the most common vanilla/Kunark/Velious characters I could find and which animations they use.
|
Is that your program? Nice work. Do you have a copy of the source?
Quote:
Originally Posted by PiB
How do you sort from front to back - do you do it per object/region using its AABB, or per face? Or can you traverse the BSP front to back somehow? I thought the divisions between planes were arbitrary. Anyway, that's one more thing I have to implement; right now I'm using an octree and frustum culling for this. I guess this is not the most efficient, but it will probably come in handy for keeping track of characters. One thing I was wondering: isn't the usefulness of the PVS limited in outdoor zones like the Karanas, where you can see from very far away? I'm sure this works pretty well in dungeons, though.
|
The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node and things behind it will be found on the right. The way to correctly render front to back is to recursively iterate the tree visiting the child nodes the camera is in front of first.
In terms of rendering transparency back to front, as I mentioned, I use a stack. It holds the offset into my VBO as well as the number of polygons. Because a stack is a last-in, first-out data structure, when I render front to back the polygon batches that go in first come out last.
Here is some tree traversal code demonstrating what happens:
Code:
// Signed distance from the camera to this node's split plane
float distance = (camera.getX() * tree[node].normal[0])
    + (camera.getY() * tree[node].normal[1])
    + (camera.getZ() * tree[node].normal[2])
    + tree[node].splitdistance;

if (distance > 0)
{
    // Camera is in front of the plane: visit the front (left)
    // child first so nearer geometry renders before farther geometry
    renderGeometry(cameraMat, tree[node].left, curRegion);
    renderGeometry(cameraMat, tree[node].right, curRegion);
}
else
{
    // Camera is behind the plane: visit the back (right) child first
    renderGeometry(cameraMat, tree[node].right, curRegion);
    renderGeometry(cameraMat, tree[node].left, curRegion);
}
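The transparency stack that pairs with this traversal could be sketched like so; the batch struct is a simplification of what actually lives alongside the VBO:

```cpp
#include <cstddef>
#include <stack>
#include <vector>

// A batch of transparent polygons: where it starts in the VBO and
// how many polygons it covers (simplified for illustration).
struct TransparentBatch {
    size_t vboOffset;
    size_t polyCount;
};

// During the front-to-back BSP walk, each transparent batch is pushed
// as it is encountered (nearest first). Draining the stack afterwards
// yields the batches farthest-first, i.e. back to front - exactly the
// order alpha blending needs.
std::vector<TransparentBatch> drainBackToFront(std::stack<TransparentBatch>& s) {
    std::vector<TransparentBatch> order;
    while (!s.empty()) {
        order.push_back(s.top());  // most recently pushed (farthest) first
        s.pop();
    }
    return order;
}
```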
I suppose you can use an octree, but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that the PVS isn't very effective at culling regions you can't see, but it's a very inexpensive check to do.
Quote:
Originally Posted by PiB
How did you determine the scale of the dome? Do you use some kind of scaling factor that you multiply with the width/length of the zone?
|
I think you are misunderstanding what skydomes really are. EverQuest's skydomes are small half-spheres that translate and rotate with the camera. They are drawn first and give the impression that the sky is very large despite being exactly the opposite. Picture someone walking around with a sky-textured bowl on their head. That is essentially the idea, and because it moves with them, it gives the illusion that the sky is vastly infinite. If you were to stretch the skydome over the entire zone and walk the distance, you would notice yourself approaching the edge of the dome and it would look a bit weird.
The first thing you should render is the skydome. Then, clear the depth buffer. Because the skydome is only inches away from the camera, if you didn't clear the depth buffer none of the zone would render, since it's all farther away than the skydome actually is. After clearing the depth buffer, render as you usually do. It gives the illusion that behind everything rendered is a vast sky.
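One way to read "translates with the camera" in code: rebuild the dome's model matrix from the camera position every frame, so the dome stays centered on the viewer. Column-major (OpenGL-style) layout is assumed here:

```cpp
#include <array>

// 4x4 column-major matrix; the translation lives in elements 12..14
// (OpenGL convention). Rebuilding the dome's model matrix from the
// camera position each frame keeps the dome centered on the viewer,
// which is what sells the "infinite sky" illusion.
std::array<float, 16> skydomeModelMatrix(float camX, float camY, float camZ) {
    std::array<float, 16> m = {
        1.f, 0.f, 0.f, 0.f,
        0.f, 1.f, 0.f, 0.f,
        0.f, 0.f, 1.f, 0.f,
        camX, camY, camZ, 1.f,
    };
    return m;
}
```

Equivalently, you can render the dome with a view matrix whose translation has been zeroed out - which is what "rendering in camera space" amounts to.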
Quote:
Originally Posted by PiB
I think I have tried something similar. I started by determining which lights affect an object and sending the array of lights affecting the object to the vertex shader. This approach didn't scale very well with a lot of lights. I then tried to compute the per-vertex lighting once, when loading the zone files. That way I didn't have to do any lighting in shaders, but the result was quite ugly (since it's done per vertex and old zones have very large polygons). I will try deferred shading next (so I can do per-fragment lighting in one pass), but I think this will be quite a lot of work.
|
First, determine what your goal is. Mine is to have the zones rendering as close to classic EverQuest as possible. The original lighting was simply precomputed vertex colors which are blended with the textures to give the appearance of lighting. Objects also have vertex colors, as the EverQuest client did not dynamically shade objects. I assume lights.wld contains lighting details just for shading player and mob models.
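The blend itself is just a per-channel multiply of the texel by the baked vertex color - what the fixed-function pipeline (or a one-line fragment shader) does per fragment. A minimal CPU-side sketch, with an illustrative color struct:

```cpp
struct RGB { float r, g, b; };

// "Faked" lighting: modulate the sampled texel by the precomputed
// vertex color. A vertex color of (1,1,1) leaves the texture
// unchanged; darker colors darken it toward black.
RGB modulate(const RGB& texel, const RGB& vertexColor) {
    return {texel.r * vertexColor.r,
            texel.g * vertexColor.g,
            texel.b * vertexColor.b};
}
```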
After I have everything rendering, I will move on to per-pixel lighting. You are correct that per-pixel lighting with the provided surface normals will not look good at all. You really need to be using normal maps for any surface rendered with Phong shading.
|
|
|
|
|
|
|
09-15-2012, 08:38 PM
|
Fire Beetle
|
|
Join Date: Aug 2012
Posts: 15
|
|
Quote:
Originally Posted by PixelEngineer
Is that your program? Nice work. Do you have a copy of the source?
|
It is. Thanks! I pushed my local Git repo to GitHub if you want to take a look. This should build on both Windows and Linux (I haven't tested Mac OS X) using CMake. I really should add a README with build instructions.
Quote:
Originally Posted by PixelEngineer
The BSP tree has done pretty much all of the work for you. A BSP tree is made up of arbitrarily sized regions divided by split planes. Things in front of each split plane are found on the left side of each tree node and things behind it will be found on the right. The way to correctly render front to back is to recursively iterate the tree visiting the child nodes the camera is in front of first.
|
I didn't know that the planes in the BSP tree always split the space into front / back half-spaces. Thanks for the clarification.
Quote:
Originally Posted by PixelEngineer
In terms of rendering transparency back to front, as I mentioned, I use a stack. It holds the offset into my VBO as well as the number of polygons. Because a stack is a last-in, first-out data structure, when I render front to back the polygon batches that go in first come out last.
Here is some tree traversal code demonstrating what happens:
...
I suppose you can use an octree, but you will still have to take the BSP tree into consideration for information about region types (water, lava, PvP) and if you want to use the PVS. It is true that the PVS isn't very effective at culling regions you can't see, but it's a very inexpensive check to do.
|
Yeah, I'm working on using this BSP tree now. This should be faster to traverse than my octree and I need it to determine the ambient light of the region anyway (even though many zones seem to have the same ambient light in all regions). The traversal code looks really straightforward, thanks.
Quote:
Originally Posted by PixelEngineer
I think you are misunderstanding what skydomes really are. EverQuest's skydomes are small half-spheres that translate and rotate with the camera. They are drawn first and give the impression that the sky is very large despite being exactly the opposite. Picture someone walking around with a sky-textured bowl on their head. That is essentially the idea, and because it moves with them, it gives the illusion that the sky is vastly infinite. If you were to stretch the skydome over the entire zone and walk the distance, you would notice yourself approaching the edge of the dome and it would look a bit weird.
The first thing you should render is the skydome. Then, clear the depth buffer. Because the skydome is only inches away from the camera, if you didn't clear the depth buffer none of the zone would render, since it's all farther away than the skydome actually is. After clearing the depth buffer, render as you usually do. It gives the illusion that behind everything rendered is a vast sky.
|
Ah, that makes sense. I think I understand now. Does this mean you render the skydome in camera space?
|
|
|
|
|
|
|
11-02-2012, 09:38 PM
|
Fire Beetle
|
|
Join Date: Jul 2010
Posts: 5
|
|
Hey guys. I was able to get VS2010 and AnkhSVN working and even got the code compiled and running. It's a pretty big thrill when you're able to open gfay and look around in a completely separate client, compiled on your own machine.
Is work still being done on this? PE - I sent you a PM and an email but haven't heard back from you yet.
Also - I'm not very familiar with building/branching public SVN projects, but I'm reading a book that highly recommends organizing game development code into several folders: Docs, Media, Source, Obj, Bin, Test. Any thoughts on doing this early in the process (before the client gets more elaborate than it is now)?
Nice work though - I'm really excited to have this resource. Also, huge kudos for commenting and organizing your code so that someone can jump into it and understand, basically, what's going on! I have little to no experience with C++ or graphics in C, but you've put together a project that makes sense to me - VERY exciting!
|
|
|
|
11-03-2012, 11:32 PM
|
Fire Beetle
|
|
Join Date: Jul 2010
Posts: 5
|
|
So I've spent more time tinkering with the Assembla repo and VS settings and realized that I can do all of the custom folder structure stuff locally by myself - silly me! Apologies - I've never used VS before, so I'm still getting used to it.
I also took the time tonight to read through some of the earlier posts in depth (rather than skimming them). Any word on when you'd feel comfortable releasing the code on GitHub? I figure the Assembla code would be good for getting a leg up on what's going on, but would it be worth working on that code at all?
|
11-11-2012, 11:20 PM
|
Sarnak
|
|
Join Date: May 2011
Posts: 96
|
|
I am still working on this client, albeit slowly. I have something super interesting I am working on. Still classic but will definitely change the way EverQuest is viewed. I will let everyone know when I am done.
Thanks for the interest in developing. I know there is another developer around working on some things as well. It's pretty cool as he's pretty much got animations figured out and I was able to help with other stuff I had already finished.
I will update soon.
|
11-12-2012, 04:05 AM
|
Hill Giant
|
|
Join Date: Jul 2012
Location: Oklahoma
Posts: 222
|
|
Friggen TEASE!!
|
11-12-2012, 09:02 PM
|
Fire Beetle
|
|
Join Date: Jul 2010
Posts: 5
|
|
No kidding! Looking forward to seeing the progress!
|
11-13-2012, 06:56 PM
|
Administrator
|
|
Join Date: Sep 2006
Posts: 1,348
|
|
Un-sticking this for now, as it includes more closed-source development than current open-source development. That is fine, but no longer deserving of a sticky post in a FOSS development forum.
|
01-13-2013, 03:45 PM
|
Discordant
|
|
Join Date: Mar 2009
Location: eqbrowser.com
Posts: 309
|
|
Animations/models/textures all extracted.
http://bit.ly/X5VUln
|
01-19-2013, 06:44 AM
|
Discordant
|
|
Join Date: Mar 2009
Location: eqbrowser.com
Posts: 309
|
|
Example - animations in unity
http://bit.ly/VAqjuG
|
01-19-2013, 07:03 PM
|
Fire Beetle
|
|
Join Date: Nov 2009
Posts: 4
|
|
Awesome, Tyen! What are your next steps?
|
01-24-2013, 06:56 PM
|
Sarnak
|
|
Join Date: Jan 2009
Location: san jose, ca
Posts: 44
|
|
Quote:
Originally Posted by Tyen05
|
I took one of the models, converted the FBX to DAE and imported it into OpenSim. Then I converted it to OBJ and imported it into ZBrush. I am sure I could do the same for Blender, too, since it has FBX export.
So in theory, I could convert FBX to Collada, import it into Blender, add more animations and then export back to FBX.
I don't know what to do after that for EQ.
|
01-25-2013, 12:16 AM
|
Sarnak
|
|
Join Date: Jan 2009
Location: san jose, ca
Posts: 44
|
|
Well, good luck. I am going back to OpenSim.
|
01-26-2013, 11:40 PM
|
Sarnak
|
|
Join Date: Jan 2009
Location: san jose, ca
Posts: 44
|
|
I was going to write about a third-party SL client, OpenSim and EQEmu, but I am not ready. Right now, I can do everything I need in Unity with a fake server. Then I can move to a third-party SL client in C++.
|