Scobalula Posted September 7 (edited)

On 9/1/2025 at 4:05 PM, bladers said:

I hope you are successful, but I tried for a month, even using AI for help. There are just several different types of models, up to six or more. LOD offset and vertex offset are just not reliable, as the data is stacked in a different order depending on the type of model. So I'm interested to know how you are doing it; I tried to unpack the .exe and really didn't see anything in there that was helpful.

There is only one model format used in the game, to my knowledge. Another format is present in the game files, but it seems to just be leftover data from older versions of Northlight (Control, Quantum Break, etc.). The format is variable length, and there are tons of checks based on data presence (for example, a particular block of data is absent when a count in the header is not set, and other data will not be present if another check against previously read data fails), which probably adds to the complexity of parsing it. Each individual mesh also has variable arrays within it (for example, vertex attribute tables).

I'm still working on this, but the last few weeks we've had a few big projects at work and I've taken on some out-of-hours work for overtime, so time is a little scarce at the moment.

Edited September 7 by Scobalula
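To illustrate the presence-dependent parsing pattern Scobalula is describing, here is a minimal Python sketch. The field names and layout are invented for illustration; only the gating pattern (a header count gating a block, earlier data gating later reads) comes from the post, not the real format:

```python
import struct

def read_u32(f) -> int:
    return struct.unpack("<I", f.read(4))[0]

# Hypothetical chunk reader showing the pattern described above:
# one header count gates an entire block, and data inside that block
# gates further reads. Field names are made up, not the real format.
def read_mesh_chunk(f):
    chunk = {}
    attr_count = read_u32(f)
    if attr_count:                  # block is absent when the count is 0
        chunk["attributes"] = [read_u32(f) for _ in range(attr_count)]
        flags = read_u32(f)
        if flags & 0x1:             # later data gated by earlier data
            chunk["extra"] = f.read(16)
    return chunk
```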
bladers Posted September 7

9 hours ago, Scobalula said:

There is only one model format used in the game, to my knowledge…

I guess that's one way to look at it: one model format with different types. But yeah, all of what you described is what I see in the model data files. That's what makes it so hard to build a script for. You end up with a model definition from the files like:

- mesh index
- lodId
- vertCount
- faceCount
- bytes to skip to vertex offset: distance into the UV section, measured within the current LOD's section (i.e., NOT an absolute file position); cumulative within an LOD group, except for the first mesh in that LOD
- bytes to skip to vertex offset 2: the same, but the distance into the vertex section
- byteForFace
- face_offset: count of indices to skip after the vertex data within the LOD section (in INDICES, not bytes); to convert to bytes, multiply by byteForFace
- bonesPerVertex
- Vertex Size
- UV Size
- LOD offset: start of the mesh's LOD block

You can even compute and validate all "bytes to skip" and face_offset fields against the vert/face counts (see the sketch after this post).

Mesh example:

Mesh 1:
- offset 0x353 → mesh index = 4
- offset 0x357 → lodId = 3
- offset 0x35b → vertCount = 324
- offset 0x35f → faceCount = 232
- offset 0x363 → bytes to skip to vertex offset = 0
- offset 0x367 → bytes to skip to vertex offset 2 = 0
- offset 0x36b → byteForFace = 2
- offset 0x36f → face_offset = 0
- offset 0x373 → bonesPerVertex = 1
- Vertex Size = 12
- UV Size = 8

Mesh 2:
- offset 0x3ca → mesh index = 0
- offset 0x3ce → lodId = 2
- offset 0x3d2 → vertCount = 542
- offset 0x3d6 → faceCount = 465
- offset 0x3da → bytes to skip to vertex offset = 2592
- offset 0x3de → bytes to skip to vertex offset 2 = 3888
- offset 0x3e2 → byteForFace = 2
- offset 0x3e6 → face_offset = 696
- offset 0x3ea → bonesPerVertex = 1
- Vertex Size = 12
- UV Size = 8

Mesh 3:
- offset 0x441 → mesh index = 0
- offset 0x445 → lodId = 1
- offset 0x449 → vertCount = 974
- offset 0x44d → faceCount = 932
- offset 0x451 → bytes to skip to vertex offset = 6928
- offset 0x455 → bytes to skip to vertex offset 2 = 10392
- offset 0x459 → byteForFace = 2
- offset 0x45d → face_offset = 2091
- offset 0x461 → bonesPerVertex = 1
- Vertex Size = 12
- UV Size = 8

And here for validation:

## PRE-COMPUTED MESH SIZES

Mesh 1:
- UV data: 324 vertices × 8 bytes = 2,592 bytes
- Vertex data: 324 vertices × 12 bytes = 3,888 bytes
- Face data: 232 faces × 3 indices × 2 bytes = 1,392 bytes
- Total Mesh 1 size = 7,872 bytes

Notice that 2,592 and 3,888 match skip bytes #1 and #2 for mesh 2.
Mesh 2:
- UV data: 542 vertices × 8 bytes = 4,336 bytes
- Vertex data: 542 vertices × 12 bytes = 6,504 bytes
- Face data: 465 faces × 3 indices × 2 bytes = 2,790 bytes
- Total Mesh 2 size = 13,630 bytes

Notice that adding the UV sizes of mesh 1 and mesh 2 (2,592 + 4,336) and the vertex sizes of mesh 1 and mesh 2 (3,888 + 6,504), you end up matching skip bytes #1 and #2 for mesh 3.

Mesh 3:
- UV data: 974 vertices × 8 bytes = 7,792 bytes
- Vertex data: 974 vertices × 12 bytes = 11,688 bytes
- Face data: 932 faces × 3 indices × 2 bytes = 5,592 bytes
- Total Mesh 3 size = 25,072 bytes

…and a similar check validates face_offset (e.g., mesh 2's face_offset of 696 is mesh 1's 232 faces × 3 indices).

But even with all of this info and then some, because of how the data is layered, it's impossible to figure out the entire thing by looking at the data files themselves. So how are you discovering the model format by looking at the executable? Are you running a disassembler or decompiler or something? How are you doing it?
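A minimal Python sketch of that validation, using the numbers from this post (the field names are this thread's, not an official spec):

```python
# Validate the cumulative skip-byte / face_offset fields described above.
# Per-mesh values come straight from the three example meshes in the post.
UV_SIZE = 8      # bytes per UV entry
VERT_SIZE = 12   # bytes per vertex entry

# (vertCount, faceCount, skip1, skip2, face_offset) in file order
meshes = [
    (324, 232, 0,    0,     0),
    (542, 465, 2592, 3888,  696),
    (974, 932, 6928, 10392, 2091),
]

uv_total = vert_total = index_total = 0
for i, (verts, faces, skip1, skip2, face_off) in enumerate(meshes, 1):
    # each mesh's skips should equal the running totals of earlier meshes
    assert skip1 == uv_total, f"mesh {i}: UV skip mismatch"
    assert skip2 == vert_total, f"mesh {i}: vertex skip mismatch"
    assert face_off == index_total, f"mesh {i}: face_offset mismatch"
    uv_total += verts * UV_SIZE
    vert_total += verts * VERT_SIZE
    index_total += faces * 3  # face_offset counts indices, not bytes
print("all skip/offset fields consistent with vert/face counts")
```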
Scobalula Posted September 13

On 9/7/2025 at 9:40 PM, bladers said:

So how are you discovering the model format by looking at the executable? Are you running a disassembler or decompiler or something?
Yes indeed: x64dbg to debug the mesh loader, and Ghidra/IDA for decompiling the game's exe. I have spent a lot of time running through the game's mesh resource loader to traverse the files and essentially pull out everything of value, along with code to properly parse the vertex data based on the attributes stored per mesh/LOD. Along with that, as shown previously, I've figured out how the game maps different resources together based on IDs.
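For anyone following along with their own scripts, here is a minimal sketch of attribute-driven vertex parsing, matching the 12-byte vertex / 8-byte UV sizes from the examples earlier in the thread. The choice of 3 × float32 positions and 2 × float32 UVs is an assumption for illustration; the real format may use halfs or packed types, and the real attribute tables are read per mesh/LOD from the file:

```python
import struct

# Hypothetical attribute table mirroring the sizes seen in this thread.
# The game stores real attribute descriptors per mesh/LOD; these two
# entries are illustrative only (positions/UVs may not be plain floats).
ATTRIBUTES = {
    "position": struct.Struct("<3f"),  # 3 × float32 = 12 bytes
    "uv":       struct.Struct("<2f"),  # 2 × float32 =  8 bytes
}

def parse_vertices(pos_bytes: bytes, uv_bytes: bytes, vert_count: int):
    """Parse the separately stored position and UV streams."""
    pos_fmt, uv_fmt = ATTRIBUTES["position"], ATTRIBUTES["uv"]
    verts = []
    for i in range(vert_count):
        pos = pos_fmt.unpack_from(pos_bytes, i * pos_fmt.size)
        uv = uv_fmt.unpack_from(uv_bytes, i * uv_fmt.size)
        verts.append({"position": pos, "uv": uv})
    return verts
```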
bladers Posted September 16

On 9/13/2025 at 12:17 PM, Scobalula said:

Yes indeed: x64dbg to debug the mesh loader, and Ghidra/IDA for decompiling the game's exe…

Interesting; I used Binary Ninja for decompiling the game exe. Either way, since you have been busy with work and most people are only interested in the models: are you willing to share what you discovered, or the code you were able to pull together for the models, in a walkthrough guide video or write-up? Others like me can then take what you discovered and write automated code for it. Most people like me don't care about the materials and are more interested in the models, as we already have access to the textures.
Scobalula Posted September 21

On 9/16/2025 at 9:26 AM, bladers said:

Are you willing to share what you discovered, or the code you were able to pull together for the models, in a walkthrough guide video or write-up?

Let me see what I can do about getting a tool going that can export just the meshes for now, with no references.
jackal Posted September 25 (edited)

Hey, has anyone even tried, or had any luck, extracting models from FBC: Firebreak with the AW2 tools? I assume it still uses the same version of the engine, since it's such a small project and shares a lot of assets with Alan Wake 2. I gave it a go myself, but I'm not quite getting the hang of the specifics of extracting the models. I did notice that it seems to follow the same pattern outlined previously in this thread for Alan Wake 2 models.

Edit: The game has had a major update now, so I gave it another try. I'm not 100% sure, but I think the data inside the files changed, just like with AW2 and its updates, at least in the files I've been inspecting; I could be wrong on that, though. So far I haven't been able to import anything into Blender, because I keep running into errors with struct.unpack and other function errors. Debugging the script, I'm pretty sure it's just because I'm feeding it completely wrong table/LOD values from the file; the face/bone counts come out completely wrong. As it is, I really just don't understand this well enough. The location of the bone data is obvious in the files because it's in plain text, but I don't understand anything beyond that.

Edited October 10 by jackal: some notes after the first major update.
bladers Posted October 11

On 9/21/2025 at 5:38 AM, Scobalula said:

Let me see what I can do about getting a tool going that can export just the meshes for now, with no references.

Any update? I can help if you need anything.
Contro Posted October 14

On 9/21/2025 at 10:38 AM, Scobalula said:

Let me see what I can do about getting a tool going that can export just the meshes for now, with no references.

I really hope you pull through with this. Too many people have come through, said they're making a tool for Alan Wake 2, and then just disappeared with no updates.
Scobalula Posted October 19

On 10/11/2025 at 8:40 PM, bladers said:

Any update? I can help if you need anything.

On 10/14/2025 at 4:32 PM, Contro said:

I really hope you pull through with this. Too many people have come through, said they're making a tool for Alan Wake 2, and then just disappeared with no updates.

I have released an early version of the tool that can export just the meshes with their material names/skeleton:
bladers Posted Tuesday at 08:10 PM

On 10/19/2025 at 4:44 AM, Scobalula said:

I have released an early version of the tool that can export just the meshes with their material names/skeleton:

WOW, YOU DID IT! Thanks. So you are able to see the code/functions/flow of logic. Do you know what function is handling animation?

The thing about AW2/Control is, I was able to get to the animation frame data and some information about the animation file (length, how many frames, etc.), and I think I was able to map it out with the bones, as they are stored in different files: the bones are stored in the model, if I'm remembering correctly, and then the binanimclip file uses those bones. I also believe I was able to get the base transform, but I'm not sure. But the main animation data, I couldn't figure out how it's being stored; whether each frame is bit-packed as a delta off the base pose, like some other games I've recently done. I'd have to dig into my logs.

But it seems like with your tools you can navigate directly to the function that loads/saves/imports/reads the animation and look at the logic flow. That might solve Control and AW2 in one swoop, since I'm sure they are a lot alike. If you do look at the function, I would appreciate a description of how it's being stored. Thanks again.
Scobalula Posted Wednesday at 06:04 AM (edited)

9 hours ago, bladers said:

Do you know what function is handling animation? … If you do look at the function, I would appreciate a description of how it's being stored.

For animation, I eventually found out they just use this: https://github.com/nfrechette/acl

In the animations, they only store bone hashes, which map to hashes stored in the skeleton files (I have the hash function and can post it later).

I will probably open-source my tool soon so that others can contribute, as my time is limited with other projects and work.

Edited Wednesday at 06:06 AM by Scobalula
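A minimal sketch of the hash-based bone mapping described here, assuming the skeleton file yields bone names alongside their hashes; `hash_name` is a placeholder for the game's hash function, which hasn't been posted at this point in the thread:

```python
# Sketch: resolve animation bone hashes against a skeleton's bones.
# `hash_name` is a placeholder for the game's (not yet posted) hash
# function; skeleton and animation inputs here are illustrative.

def hash_name(name: str) -> int:
    raise NotImplementedError("the game's bone-name hash goes here")

def build_bone_lookup(skeleton_bone_names: list[str]) -> dict[int, int]:
    """Precompute hash -> bone index for every bone in the skeleton."""
    return {hash_name(n): i for i, n in enumerate(skeleton_bone_names)}

def resolve_tracks(anim_bone_hashes: list[int], lookup: dict[int, int]):
    """Map each animation track's bone hash to a skeleton bone index,
    or None when the hash isn't present in this skeleton."""
    return [lookup.get(h) for h in anim_bone_hashes]
```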
bladers Posted Friday at 04:02 AM (edited)

On 10/22/2025 at 1:04 AM, Scobalula said:

For animation, I eventually found out they just use this: https://github.com/nfrechette/acl …

I believe it's a 32-bit FNV-1a hash, if I'm not mistaken? Just checked: yup, this is textbook ACL (the distinctive 0xAC10AC10 in each stream header). Wow, nice find. You really have to make a short video or something on your workflow, showing how you are able to access these functions from the .exe.

Edited Friday at 05:24 AM by bladers
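For reference, a standard 32-bit FNV-1a implementation matching bladers' guess above. Whether the game hashes raw bytes, lowercases names first, or normalizes them in some other way is not established in this thread, so treat this as a sketch rather than the confirmed game function:

```python
# Standard 32-bit FNV-1a (offset basis 2166136261, prime 16777619).
FNV1A32_OFFSET = 0x811C9DC5
FNV1A32_PRIME = 0x01000193

def fnv1a32(data: bytes) -> int:
    h = FNV1A32_OFFSET
    for b in data:
        h ^= b                                 # xor the byte in first...
        h = (h * FNV1A32_PRIME) & 0xFFFFFFFF   # ...then multiply (FNV-1a order)
    return h

# The ACL stream tag bladers mentions, useful when scanning files:
ACL_TAG = 0xAC10AC10

print(hex(fnv1a32(b"root")))  # e.g., hashing a candidate bone name
```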