I've been looking at the vertex format, and the one above is definitely not correct.
I can't quite make out what they're doing in the vertex data. I've managed to find the bone indices and weights, but that's about it. I've tried reading the data as 16-bit floats and can't get a definite match on anything. I still need to try packed 10-bit data, but either way I can't make sense of it yet.
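For anyone who wants to poke at the vertex bytes themselves, here's a minimal Python sketch of the two interpretations mentioned above. Both helpers are hypothetical exploration tools, not the actual format: `try_half_floats` reads a run of bytes as IEEE 16-bit floats, and `try_packed_10bit` splits a 32-bit word as a 10/10/10/2 packed layout, which is a common way to store normals.

```python
import struct

def try_half_floats(data: bytes, offset: int, count: int):
    """Interpret `count` consecutive 16-bit values at `offset` as IEEE half floats.

    Purely exploratory: the real vertex layout is unknown, so this just
    helps eyeball whether a byte run looks like plausible float data.
    """
    return struct.unpack_from("<%de" % count, data, offset)

def try_packed_10bit(word: int):
    """Split a 32-bit word into three 10-bit fields (a common packed-normal layout).

    Each 10-bit field is mapped from [0, 1023] to roughly [-1.0, 1.0].
    """
    fields = [(word >> shift) & 0x3FF for shift in (0, 10, 20)]
    return [f / 511.5 - 1.0 for f in fields]

# Example: little-endian bytes 00 3C are the half-float 1.0, 00 BC is -1.0
print(try_half_floats(b"\x00\x3C\x00\xBC", 0, 2))  # (1.0, -1.0)
```

If a candidate field decodes to mostly small, sane values (unit-ish normals, plausible UVs), that's a good hint you've guessed the encoding right.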
I wouldn't mind a little help here if anyone could take a look. (Like, maybe Kesh? =)
Once we bang this out, I'll start on a 3DS Max import plugin so we can play with the models.
Took a whack at the models.
This is based on a clothing item/non-static mesh. I know the static meshes have a different format, at least for the verts.
*Updated: most of the layout is done. Now need to make sense of unknowns. HW chokes on these structures due to zstring bugs.*
Requires HexWorkshop 6, but it doesn't work properly either way.
MD5 will be sufficient to detect file damage and modification. Generating MD5 collisions is pretty easy, true, but generating a collision against a particular existing file (a second preimage) is not. More likely, if someone is trying to modify the game and is that serious about it, they'll just modify the exe to send the hash values the server expects. This can be done by using an unmodified copy of the file to generate the hash while having the game engine actually load a modified file, or by just reading precomputed hash values from an outside file. But if you still want to try for something more secure, there are always the SHA hash functions; I don't believe I've heard of anyone that's made much progress against those.
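For damage/modification detection, either hash family is a one-liner. Here's a small sketch that computes MD5 and SHA-256 of a file in one streaming pass, so large game archives don't need to fit in memory (the function name and chunk size are my own choices, not anything from the game):

```python
import hashlib

def file_digests(path: str, chunk_size: int = 1 << 16):
    """Compute MD5 and SHA-256 of a file in a single streaming pass."""
    md5, sha = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            md5.update(chunk)
            sha.update(chunk)
    return md5.hexdigest(), sha.hexdigest()
```

Swapping MD5 for SHA-256 costs almost nothing in code, so if there's any worry about tampering resistance it's an easy upgrade.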
Here's what I have so far on the engine_level/LevelGraphicsFile format. This format will not work properly in HW, as it seems to cut off strings after 10 characters, which screws up the rest of the format and in the process sucks up a ton of CPU and memory (a 2.2 KB file consumed 1.4 GB of RAM). There may also be additional file types that I have not yet run into.
This file seems to have links to a number of other files used for loading/rendering the level. This includes heightmaps and flora models. The flora models are followed by what looks like instancing data to spawn instances with.
#include "standard-types.hsl"
#pragma byteorder(big_endian)
#pragma maxarray(65536) // max that hw 5.1 will allow
#pragma maxstring(512)
typedef struct LGFFloraInstance
{
Coord3D Coord;
float Unknown1; // these two floats seem to always be very small, very close to 0
float Unknown2; // they could be used for rotation maybe? though that's usually done with quaternions
} LGFFloraInstance;
typedef struct LGFUnknownBBEntry // looks like it might be bounding box, but not sure
{
DWORD Start;
DWORD End;
Coord3D StartCoord;
Coord3D EndCoord;
} LGFUnknownBBEntry;
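Since HW chokes on this file, here's a minimal Python sketch that parses the two structs above directly. It assumes Coord3D is three 32-bit floats and that everything is big-endian, per the byteorder pragma; the helper names are mine:

```python
import struct

# Field order mirrors the HSL structs; Coord3D assumed to be 3 x float32.
FLORA_INSTANCE = struct.Struct(">5f")   # Coord.x/y/z, Unknown1, Unknown2
UNKNOWN_BB = struct.Struct(">2I6f")     # Start, End, StartCoord, EndCoord

def read_flora_instance(data: bytes, offset: int = 0):
    x, y, z, u1, u2 = FLORA_INSTANCE.unpack_from(data, offset)
    return {"Coord": (x, y, z), "Unknown1": u1, "Unknown2": u2}

def read_bb_entry(data: bytes, offset: int = 0):
    start, end, *coords = UNKNOWN_BB.unpack_from(data, offset)
    return {"Start": start, "End": end,
            "StartCoord": tuple(coords[:3]), "EndCoord": tuple(coords[3:])}
```

A standalone parser like this also sidesteps HW's string-truncation and memory problems entirely.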
I would suggest just making a post in this thread with your code release and updating that post with each new release. This should give us one central place where all available code can be easily accessed. Then start a new thread for discussion of the code if necessary. I'd like to keep this thread limited to just source code releases, and so I'll probably be splitting this thread pretty soon.
topic: Source Code
(Publicly available source via Subversion)
in the forum: Archive
You can use anything you like. In terms of getting it committed to Subversion, as long as everything required to build and run it is checked in, you should be OK. Alternatively, if you're not planning on having multiple contributing developers, you can just post the code in a thread or anywhere else you like.
Memory leaks are very possible, as this code was written in a bit of a hurry just to get it working. As for delete vs. delete[]: strictly speaking, mismatching them is undefined behavior in C++, though on every implementation I'm aware of they release the memory the same way; the practical difference is that delete[] calls the destructor on each array element before releasing the memory. Long story short, I have a bad tendency to forget the [] when dealing with strings/raw data buffers, since I don't really see them as objects.
To help the progress of tools development for Fable 2, we are running a Subversion server with the source code of currently developed tools. It is a very small codebase right now, but we're sure it will grow over time as more progress is made on the file formats.
Check out is anonymous. If you would like to commit, please talk to us about it.
Repository URL: svn://fable2mod.com:3690
Happy Coding
P.S. Since the current main developer of the code (myself) uses Visual Studio 2008, the code is set up to be built with Visual Studio 2008 solution/project files. On another note, a great Subversion plugin for Visual Studio is AnkhSVN, pretty much a must-have in my opinion.
Offset just tells you where the file data section starts; it has nothing to do with the file table. You need to decompress CompressedFileData and use the output of that to read the file table. The file table will then have offsets into the file data section for each file entry.
The number you're seeing is actually the first piece of data of the first entry:
1298494312 = 4D 65 73 68 = "Mesh"
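You can verify that arithmetic in two lines: the four ASCII bytes of "Mesh", read as a single big-endian 32-bit integer (consistent with the big_endian pragma in the structs posted earlier), give exactly that number.

```python
import struct

# "Mesh" = 4D 65 73 68; read big-endian, that's 0x4D657368 = 1298494312.
value = int.from_bytes(b"Mesh", "big")
print(value)                          # 1298494312
print(struct.pack(">I", 1298494312))  # b'Mesh'
```

This is a handy trick in general: whenever a mystery integer is in the ASCII range byte-by-byte, try reinterpreting it as a four-character tag.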
I suggest getting a hex editor and trying some of these formats by hand. It helps a lot in understanding the data formats. Or use hex workshop and the structs we provide and it'll do most of the parsing for you.
Also, the endian conversion code you have looks kind of odd. If I'm reading it right, it uses global variables for a few things and loops to do the endian conversion, which is relatively slow and can add up if you're doing it on every little piece of file data. I'll be posting the code to the BNK extractor soon; you can see an alternate way of doing it in there.
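As one illustration of loop-free conversion (this is my own hypothetical helper, not the BNK extractor's code): unpack a whole buffer of words with one endianness and re-pack with the other, letting the library do the swapping in bulk instead of shuffling bytes by hand.

```python
import struct

def swap32_buffer(data: bytes) -> bytes:
    """Byte-swap a buffer of 32-bit words in two library calls.

    Reading big-endian and re-packing little-endian flips every word,
    avoiding a hand-rolled per-byte loop and any global state.
    """
    count = len(data) // 4
    values = struct.unpack(">%dI" % count, data[:count * 4])
    return struct.pack("<%dI" % count, *values)
```

Even better, if you declare the endianness in the format string when you first read the file data, no separate swap pass is needed at all.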
P.S. This format is inherently not serial as it requires random access (aka using seek). It's just terminology, but I thought I'd mention it since you have Serialize and Deserialize names in there.
This post was edited on 2008-11-12, 12:00 by TodX.
Yeah, you're right. There is no good way of telling whether a BNK is a BNK. I guess they decided to leave the tag off for some odd reason. Pretty much: if its extension is .bnk, then try reading it. And yeah, just read the BNK struct starting at the beginning of the file.
They're using zlib's deflate/inflate and 'managing' the streams themselves, so there are no checksums at the end of the compressed blocks. This means that if you use the pre-packaged uncompress() call it will fail, since it looks for a checksum at the end of the compressed block. Also, for the file table they maintain the same context throughout all the compressed blocks, effectively treating it as one big stream that has been broken up into pieces, which makes sense. However, for the compressed files they start a new context for each block, treating each block as a new stream and then combining the outputs.
This currently extracts all the files with the '\'s in the paths replaced with '.'s to make finding files easier. I'll add a full directory dump option later. For now, I figured just being able to play with the files should be a good start.