
Helldivers 2 Model Extraction Help!


Skelethor


For some context, Helldivers 2 runs on the Stingray/Bitsquid engine, which Vermintide, Magicka, and the original Helldivers were all built on, with a pretty significant amount of modification. Since Stingray was discontinued by Autodesk in 2018, my usual leads for proper asset extraction haven't gone very far.

 

It appears the game's art content is stored in .stream and .gpu_resources files in its '.../data/' path. Samples can be provided if necessary for those who may not have access to the game. There are also a number of files with no file extension that still appear to contain data. I believe the .gpu_resources files may tell the game which assets need to be streamed from the .stream files, but I could be wrong.
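For anyone poking at that folder, a quick sanity check is to group files by their shared hashed name; here is a rough Python sketch (the helper name is just for illustration):

```python
from pathlib import Path

# Hypothetical helper: pair each extensionless "main" file in .../data/ with
# its .stream and .gpu_resources siblings, when they exist.
def find_archives(data_dir):
    for f in Path(data_dir).iterdir():
        if f.is_file() and f.suffix == "":
            stream = f.with_suffix(".stream")
            gpu = f.with_suffix(".gpu_resources")
            yield {
                "main": f,
                "stream": stream if stream.exists() else None,
                "gpu_resources": gpu if gpu.exists() else None,
            }

# Example usage:
# for entry in find_archives(r"E:\SteamLibrary\steamapps\common\Helldivers 2\data"):
#     print(entry)
```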

 

".../data/game/" contains fewer, smaller .dl_bin files, but I suspect these are just for the game's settings and logic, rather than its actual 3D assets.

 

Back when Vermintide tools were being worked on, once the file types could be opened, a manual dictionary was created (but never completed) to rename individual assets to their correct, non-hash names. Something similar may have to be done for Helldivers 2.

 

If anyone is familiar with Stingray in any form, or has a good method to still access the engine (Autodesk has made that kind of hard), let me know! Hopefully we can get some people smarter than I am to get the wheels turning, as the art in this game is absolutely stellar.


I've ripped a lot from Vermintide, and basically the only method has been NinjaRipper. It sucks, I know, but that is pretty much the only way to do it right now. Even then, I'm not sure NinjaRipper would work here, seeing as Helldivers 2 uses the GameGuard anti-cheat, which is notoriously hard to work around when it comes to injecting anything into the game.

I hope someone pulls a really good script out of thin air that can extract stuff - I really want some of the models, they're fantastic.

Edited by Contro

Well, here is what I've got so far...

The extensionless file is most likely a table with data pointers into the *.gpu_resources and *.stream files.

*.gpu_resources holds 3D data + texture data, and maybe more.

*.stream files are mostly texture data.

Everything seems to be nameless, so there is a big chance a hash algorithm is used for the names...

 

Pretty rough struct, a lot of unknowns... It's for the extensionless files.

//------------------------------------------------
//--- 010 Editor v14.0 Binary Template
//
//      File: 
//   Authors: 
//   Version: 
//   Purpose: 
//  Category: 
// File Mask: 
//  ID Bytes: 
//   History: 
//------------------------------------------------
LittleEndian();OutputPaneClear();

struct HEADER {
    uint32 Magic;
    uint32 UnknownCount,FileCount;
    uint32 Unknown_0,Unknown_1,Unknown_2;
    uint64 Unknown_3;
    uint32 Unknown_4,Unknown_5,Unknown_6,Unknown_7;
    byte Null[24];
};

struct UNKNOWN_TABLE {
    uint32 Unknown_0,Unknown_1,Unknown_2,Unknown_3,Unknown_4,
           Unknown_5,Unknown_6,Unknown_7;
};

struct FILE_TABLE {
    uint32 Unknown_0,Unknown_1,Unknown_2,Unknown_3,MainFileOffset,
           Unknown_5,StreamFileOffset,Unknown_7,GpuResourcesFileOffset,Unknown_9,
           Unknown_10,Unknown_11,Unknown_12,Unknown_13,MainFileSize,
           StreamFileSize,GpuResourcesFileSize,Unknown_17,Unknown_18,FileNum;
};

HEADER Header;
UNKNOWN_TABLE UnknownTable[Header.UnknownCount];
FILE_TABLE FileTable[Header.FileCount];

Well, I was able to pull a model out of the GPU resources data. The vertex stride may vary; this one is 24 bytes. All of the model information is in the main extensionless file. Textures are mostly BC7, with some BC1. Positions are floats and UVs are half-floats. There are also submeshes and LODs.
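As an illustration only (not h3x3r's code), here is a small Python sketch of how a 24-byte vertex could be decoded given those observations: float positions, half-float UVs. Where the UV actually sits inside the 24 bytes is an assumption, not something confirmed above.

```python
import struct

UV_OFFSET = 20  # assumption: the UV pair occupies the last 4 bytes of the 24-byte vertex

def read_vertices(buf, start, count, stride=24):
    """Decode `count` vertices from a raw gpu_resources blob."""
    verts = []
    for i in range(count):
        base = start + i * stride
        x, y, z = struct.unpack_from("<fff", buf, base)          # float positions
        u, v = struct.unpack_from("<ee", buf, base + UV_OFFSET)  # half-float UVs
        verts.append(((x, y, z), (u, v)))
    return verts
```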

hd2-mdl.jpg

Edited by h3x3r

OK, here's my rough BMS script to dump (at least I think) all textures from *.stream + *.gpu_resources.

######################################################
#  Helldivers 2 - Texture / Mesh / Material  Dumper  #
######################################################
get BaseFileName basename

Open FDDE "" 0
Open FDDE stream 1
Open FDDE gpu_resources 2

get Magic long
get UnknownCount long
get Files long
getdstring Dummy 0x3C
getdstring Dummy UnknownCount*32

for i = 0 < Files
    endian big
    get FileNameCRC0 long
    get FileNameCRC1 long
    endian little
    get FileTypeCRC64 longlong
    get MainFileOffset long
    get Unknown_0 long
    get StreamFileOffset long
    get Unknown_1 long
    get GpuResourcesFileOffset long
    get Unknown_2 long
    get Unknown_3 long
    get Unknown_4 long
    get Unknown_5 long
    get Unknown_6 long
    get MainFileSize long
    get StreamFileSize long
    get GpuResourcesFileSize long
    get Unknown_7 long
    get Unknown_8 long
    get FileNum long
    savepos EndTable

    if FileTypeCRC64 == 14790446551990181426
        if StreamFileSize != 0
            goto MainFileOffset
            getdstring Dummy 0xC0
            savepos MainFileOffset
            string Name p= "%s_textures/0x%04X%04X.dds" BaseFileName FileNameCRC0 FileNameCRC1
            append 0
            log Name MainFileOffset 148 0
            log Name StreamFileOffset StreamFileSize 1
        else
            goto MainFileOffset
            getdstring Dummy 0xC0
            savepos MainFileOffset
            string Name p= "%s_textures/0x%04X%04X.dds" BaseFileName FileNameCRC0 FileNameCRC1
            append 0
            log Name MainFileOffset 148 0
            log Name GpuResourcesFileOffset GpuResourcesFileSize 2
        endif
    elif FileTypeCRC64 == 16187218042980615487
        if GpuResourcesFileSize != 0 && MainFileSize != 0
            string MdlInfoName p= "%s_meshes/0x%04X%04X.meshinfo" BaseFileName FileNameCRC0 FileNameCRC1
            string MdlDataName p= "%s_meshes/0x%04X%04X.mesh" BaseFileName FileNameCRC0 FileNameCRC1
            log MdlInfoName MainFileOffset MainFileSize
            log MdlDataName GpuResourcesFileOffset GpuResourcesFileSize 2
        endif
    elif FileTypeCRC64 == 16915718763308572383
        if MainFileSize != 0
            string MatInfoName p= "%s_meshes/mat/0x%04X%04X.mat" BaseFileName FileNameCRC0 FileNameCRC1
            log MatInfoName MainFileOffset MainFileSize
        endif
    endif
    goto EndTable
next i

Textures use their original DDS headers, which are stored in the main file, so they should be OK.
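For readers who don't use QuickBMS, here is a hedged Python sketch of what the script above does for one texture entry: skip 0xC0 bytes into the asset's record in the main (extensionless) file, take the next 148 bytes as the DDS header, and append the pixel payload from the .stream file if it has one, otherwise from .gpu_resources. The entry fields are assumed to be the same ones the script reads from the file table.

```python
def dump_texture(main, stream, gpu, entry, out_path):
    # main / stream / gpu are the three files read fully into bytes;
    # entry holds the offsets and sizes taken from the file table.
    hdr_start = entry["MainFileOffset"] + 0xC0
    dds_header = main[hdr_start:hdr_start + 148]
    if entry["StreamFileSize"]:
        payload = stream[entry["StreamFileOffset"]:
                         entry["StreamFileOffset"] + entry["StreamFileSize"]]
    else:
        payload = gpu[entry["GpuResourcesFileOffset"]:
                      entry["GpuResourcesFileOffset"] + entry["GpuResourcesFileSize"]]
    with open(out_path, "wb") as f:
        f.write(dds_header + payload)
```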

Also found this:

https://help.autodesk.com/view/Stingray/ENU/?guid=__stingray_help_managing_content_compiled_resource_names_html

Meh, I got my hands on the Stingray editor...

EDIT: So I updated my script to dump textures + meshes + materials. Now it's up to you guys.

Edited by h3x3r

- error in src\file.c line 633: fdnum_open()
Error: No such file or directory

Last script line before the error or that produced the error:
  8   Open FDDE gpu_resources 2
  coverage file 0     0%   0          212221     . offset 00000000
  coverage file 1     0%   0          11687154   . offset 00000000

 

The dump stops early; it gets many files, but not all.

I also cannot open the .mesh or .meshinfo files, and XPS doesn't work.

How do I use this script?
 


You must have all 3 files in the same place to unpack it: the extensionless file + the *.stream file + the *.gpu_resources file. Also, use the 64-bit QuickBMS.

This script only extracts textures + meshes (with their info files) + materials.

There is no conversion of meshes at all. Meshes are a pain. I am working on it, but don't expect it any time soon... It's not an easy format. You must parse and understand the *.meshinfo file, and based on that you can export from the *.mesh file.

Edited by h3x3r

OK, so here are my findings on the meshinfo file. It's not even fun anymore. You can continue with it if you want; I am done with it...

//------------------------------------------------
//--- 010 Editor v14.0 Binary Template
//
//      File: 
//   Authors: 
//   Version: 
//   Purpose: 
//  Category: 
// File Mask: 
//  ID Bytes: 
//   History: 
//------------------------------------------------
OutputPaneClear();LittleEndian();

local uint32 i,j,k,l,m,n,o;

struct HEADER {
    byte Null[40];
    uint32 Unknown_0,Unknown_1,Unknown_2,Unknown_3,
    Unknown_4,Unknown_5,Unknown_6[4],Unknown_7,
    Unknown_8,Unknown_9,LODInfoOffset,Unknown_11,
    MeshMainInfoOffset,Unknown_13,Unknown_14,Unknown_Offset;
}Header;

FSeek(Header.LODInfoOffset);
struct LODTABLE {
    uint32 LODCount;
    for (i=0; i < LODCount; i++){
    struct LODOFFSET {
        uint32 LODInfoOffset;
    }LODOffset;
    }
    for (i=0; i < LODCount; i++){
    FSeek(startof(LODTable)+LODOffset[i].LODInfoOffset);
    struct LODINFO {
        byte Dummy0[336];
        uint32 Unknown_1,Unknown_2,Unknown_3,Unknown_4,Unknown_5,Stride,Unknown_7,Unknown_8,Unknown_9,
        Unknown_10,Unknown_11,Unknown_12,Unknown_13,Unknown_14,Unknown_15,Unknown_16,Unknown_17,Unknown_18,
        Unknown_19,Unknown_20;
        uint32 VtxBlockOffset,VtxBlockSize,IdxBlockOffset,IdxBlockSize;
    }LODInfo;    
    }    
}LODTable;

FSeek(Header.MeshMainInfoOffset);
struct MESHMAININFO {
    uint32 MeshCount;
    for (i=0; i < MeshCount; i++){
    struct MeshInfoOffsetTable{
        uint32 MeshInfoOffset;
    }MeshOffsetTable;    
    }
    uint32 MeshCRC[MeshCount];
    uint32 Null;
}MeshMainInfo;

struct MESHINFOTABLE {
for (j=0; j < MeshMainInfo.MeshCount; j++){
FSeek(startof(MeshMainInfo.MeshOffsetTable[0])+MeshMainInfo.MeshOffsetTable[j].MeshInfoOffset);
    struct MESHINFO {
        byte Dummy_0[32];
        uint32 LODId,Unknown_1,Unknown_2,Unknown_3,Unknown_4,Unknown_5;
        byte Dummy_1[44];
        uint32 Unknown_6,Unknown_7,Unknown_8,Unknown_9,Unknown_10,Unknown_11,
        Unknown_12,Unknown_13,VtxOffset,VtxCount,IdxOffset,IdxCount;
        }MeshInfo;
    }
}MeshInfoTable;

FSeek(Header.Unknown_Offset);
struct UNKNOWNINFO {
    uint32 UnkCount;
    for (i=0; i < UnkCount; i++){
    struct UNKNOWNTABLE{
        uint32 Unknown_0,Unknown_1,Unknown_2;
    }UnknownTable;    
    }
}UnknownInfo;
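For convenience, here is a rough Python sketch that walks the LOD table exactly as the template above lays it out (offsets inside the table are relative to the start of the table itself, matching the startof(LODTable) seek). Field positions come straight from the template; everything it marks Unknown stays unknown.

```python
import struct

def walk_lods(meshinfo_path):
    data = open(meshinfo_path, "rb").read()
    header = struct.unpack_from("<19I", data, 40)   # 40 null bytes, then 19 uint32s
    lod_table = header[13]                          # LODInfoOffset per the template
    (count,) = struct.unpack_from("<I", data, lod_table)
    offsets = struct.unpack_from("<%dI" % count, data, lod_table + 4)
    for off in offsets:
        lod = lod_table + off                       # relative to the table start
        stride = struct.unpack_from("<I", data, lod + 336 + 5 * 4)[0]  # Stride field
        vtx_off, vtx_size, idx_off, idx_size = struct.unpack_from(
            "<4I", data, lod + 336 + 20 * 4)        # Vtx/Idx block offset + size
        print(off, stride, vtx_off, vtx_size, idx_off, idx_size)
```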

 


16 hours ago, h3x3r said:

You must have all 3 files in the same place to unpack it: the extensionless file + the *.stream file + the *.gpu_resources file. Also, use the 64-bit QuickBMS.

This script only extracts textures + meshes (with their info files) + materials.

There is no conversion of meshes at all. Meshes are a pain. I am working on it, but don't expect it any time soon... It's not an easy format. You must parse and understand the *.meshinfo file, and based on that you can export from the *.mesh file.

Hey there, your data extraction script seems to be having an issue for me as well, like for the guy above, but I am extracting directly from the game's files, which should have every required file present, and yes, I am using the 64-bit QuickBMS version.

```

- 0 files (282 logs) found in 1 seconds
  coverage file 0    16%   399744     2459520    . offset 0000000000004df8
- open input file E:\SteamLibrary\steamapps\common\Helldivers 2\data\02582f3da1f8daf5
- open script F:\Game Tools\quickbms\HD2_BMS.bms
- set output folder F:\Game Assets\Helldivers 2\Data Extraction

  offset           filesize   filename
--------------------------------------
- enter in folder E:\SteamLibrary\steamapps\common\Helldivers 2\data
  coverage file 0     0%   0          212221     . offset 0000000000000000
- open input file E:\SteamLibrary\steamapps\common\Helldivers 2\data\02582f3da1f8daf5.
- enter in folder E:\SteamLibrary\steamapps\common\Helldivers 2\data
  coverage file 1    99%   59509272   59509344   . offset 0000000000000000
- open input file E:\SteamLibrary\steamapps\common\Helldivers 2\data\02582f3da1f8daf5.stream
- enter in folder E:\SteamLibrary\steamapps\common\Helldivers 2\data
  coverage file 2    85%   14532698   17019792   . offset 0000000000000000
- open input file E:\SteamLibrary\steamapps\common\Helldivers 2\data\02582f3da1f8daf5.gpu_resources

- error in src\file.c line 633: fdnum_open()
Error: No such file or directory

Last script line before the error or that produced the error:
  8   Open FDDE gpu_resources 2
  coverage file 0     0%   0          212221     . offset 0000000000000000
  coverage file 1     0%   0          11687154   . offset 0000000000000000

Press ENTER or close the window to quit
```


This should have the settings.ini and the bundle_database file in it. The bundle_database file appears to map every data-pointer file to its corresponding .stream file, which should in turn correspond to its .gpu_resources file; if there isn't one, that entry should be skipped over.

data.7z


13 hours ago, h3x3r said:

Well, for a next-gen title this texture kind of sucks... only 2048x2048.

ship.jpg

 

Hello. Sorry to bother you all. I don't have the knowledge to extract or view the models, and I really don't understand what you are all saying in this conversation, but the only thing I need is the model of the SS Liberty you have there. I don't even need the model itself if that's a hassle.
Would it be possible to show me some pictures of the model from all angles? I want to 3D-design it, cut it into pieces, and 3D-print it. If not, I understand; I will keep looking for a method to view the models in the meantime.


6 hours ago, h3x3r said:

I don't know, guys... Whenever I pick anything from those 3 files, it unpacks just fine.

Are you using the latest QuickBMS? Otherwise, try to unpack it to a path without spaces, like c:\folder

Are you testing the unpack against the entire game's files as they natively sit in the directory? That is, putting an * in the file-selection screen to select everything to run the QuickBMS script on and letting it rip. There are IDs with one file or another missing; if a file is missing, that ID should just be skipped over in the script, but I'm not sure how to add that, which is why I'm letting you know.
I am unsure why certain files aren't needed (probably content that isn't ready for the game yet), but if you could add a part to the script that skips over IDs that don't have all 3 files, that would make going through the ones that do a lot easier.

Edited by GameBreaker

So, let me get this right.

This will only export meshes, textures, and materials. It won't export armature/rigging/skeleton (because I'm like 90% sure OBJ doesn't support it).

  1. For prep:
    1. I download the tools Hex2Obj, QuickBMS, and 010 Editor (or whatever text editor).
    2. I put the Tex/Mesh/Mat Dumper script into a .txt file (because it is used as a filter with QuickBMS).
    3. I take the binary template script and save it with the .bt extension because it will be used by 010 Editor.
  2. I open QuickBMS (preferably the x64 build if possible).
    1. I follow what the console says to do: i.e., I first select the txt file that is the Tex/Mesh/Mat Dumper.
    2. I select the input files, which are the extensionless file and the .gpu_resources and .stream files. (For this example, I'll use 000d250a449ec1e8.)
    3. I select the output folder.
    4. It dumps the files perfectly, with DDS textures (which can be converted or used as is), mesh files, and mesh info files.
    5. I preferably copy-paste what the console says into a text file for emergencies.
  3. I open 010 Editor and Hex2Obj.
    1. I go to 010 Editor and open a .meshinfo. (For this example, I'll use 0x4A2D0C5CD862E9EA.meshinfo because its corresponding .mesh file is the largest at 38 MB, but that doesn't really matter.)
    2. I go to Templates, Run Template, and select the .bt I made in step 1.3. It parses the data into Header, LODTable, MeshMainInfo, MeshInfoTable and UnknownInfo.

And then I get scared looking at Hex2Obj because the application is spooky. I'll have to go through the tutorial right now, but it seems Hex2Obj wants me to go straight into the .mesh file and start looking for patterns and calculating the face index count.

What I'm currently trying to figure out is what information goes where. My assumptions are:

  1. That we are exporting each of the submeshes individually and then putting the OBJs together in Blender or Autodesk.
    • The VALUE in MeshMainInfo/MeshCount is the total number of submeshes.
    • The VALUE in MeshMainInfo/MeshOffsetTable is important somehow. (For fun, let's say MeshOffsetTable[0] is 396.)
    • The VALUES in MeshInfoTable/MeshInfo[#] are important, specifically (let's just use [0] again for fun; see the sketch after this list):
      1. VtxOffset (89564 or DC 5D 01 00)
      2. VtxCount (3185 or 71 0C 00 00)
      3. IdxOffset (208482 or 62 2E 03 00)
      4. IdxCount (5184 or 40 14 00 00)
  2. I still have to parse through the .mesh file and find the face indices, vertex block, and vertices, unless some of the values above are those.
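As a rough illustration of assumption 1 (not a confirmed layout), here is a small Python sketch of how those four fields might translate into the numbers Hex2Obj asks for. The big assumption is that VtxOffset/IdxOffset are byte offsets relative to the LOD's vertex and index blocks from the LODTable, with 2-byte indices; later posts in the thread point the same way.

```python
def hex2obj_params(lod, submesh, stride=28):
    # lod: VtxBlockOffset / IdxBlockOffset from the LODTable.
    # submesh: the four MeshInfo values; everything is assumed to be in bytes.
    return {
        "vertex_start": lod["VtxBlockOffset"] + submesh["VtxOffset"],
        "vertex_count": submesh["VtxCount"],
        "vertex_stride": stride,
        "face_index_start": lod["IdxBlockOffset"] + submesh["IdxOffset"],
        "face_index_count": submesh["IdxCount"],  # 2 bytes each, 3 per triangle
    }

# With the example values above (block offsets are placeholders):
print(hex2obj_params({"VtxBlockOffset": 0, "IdxBlockOffset": 0},
                     {"VtxOffset": 89564, "VtxCount": 3185,
                      "IdxOffset": 208482, "IdxCount": 5184}))
```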

26 minutes ago, Hampter49 said:

So, let me get this right.

This will only export meshes, textures, and materials. It won't export armature/rigging/skeleton (because I'm like 90% sure OBJ doesn't support it). [...]

It seems a bit of a pain doing it manually for multiple meshes.  I haven't seen a script yet, but it should be possible if it's not the most complicated format.  Maybe upload a few samples of meshinfo and corresponding mesh files and it might be possible to do a script.

 


10 minutes ago, DKDave said:

It seems a bit of a pain doing it manually for multiple meshes.  I haven't seen a script yet, but it should be possible if it's not the most complicated format.  Maybe upload a few samples of meshinfo and corresponding mesh files and it might be possible to do a script.

 

Best thing would be to automate the entirety of this. I can't provide game files for obvious reasons.


8 hours ago, GameBreaker said:

Are you testing the unpack against the entire game's files as they natively sit in the directory? That is, putting an * in the file-selection screen to select everything to run the QuickBMS script on and letting it rip. There are IDs with one file or another missing; if a file is missing, that ID should just be skipped over in the script, but I'm not sure how to add that, which is why I'm letting you know.
I am unsure why certain files aren't needed (probably content that isn't ready for the game yet), but if you could add a part to the script that skips over IDs that don't have all 3 files, that would make going through the ones that do a lot easier.

This is the main issue. When trying to extract from the main data folder, there doesn't seem to be a check that lets the script properly parse *everything* in its normal place; it only works once you move files out of the game folder and sort them. That seems backwards; it would be much easier to keep things consistent by designing the script to work in place.


1 hour ago, DKDave said:

It seems a bit of a pain doing it manually for multiple meshes.  I haven't seen a script yet, but it should be possible if it's not the most complicated format.  Maybe upload a few samples of meshinfo and corresponding mesh files and it might be possible to do a script.

 

It's fine to put up a few samples for analysis.  Otherwise nobody else might look at it.


Sharing some of Unordinal's work (credit to him) for Vermintide 2, which uses some of the same file types in an older engine version: his VT2Lib repository, a playlist on how the Stingray engine code works, and his write-up on how the VT2 bundle file format works.

https://github.com/Unordinal/VT2Lib

---------------

# VT2 Bundle File Format <!-- omit from toc -->

> Accurate as of VT2 update `2023-10-19`.


 

The below info applies to bundles and the related `.patch_###` patch bundles. `.stream` files (henceforth referred to as asset streams) hold raw data and are referenced by their similarly-named bundle files. The game streams in the data from asset streams dynamically when needed; they usually contain data such as sound files.

 

Any code in this document is pseudo-C#, with value sizes indicated in parentheses. When a value is referred to as a hash value, it is specifically a Murmur64A 64-bit hash value unless otherwise specified. VT2 occasionally uses a 32-bit version of this, which is just the 64-bit hash bit-shifted 32 bits to the right and then cast to a `uint` (`u32`).
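For reference, here is a Python sketch of the standard MurmurHash64A routine the document refers to, along with the 32-bit truncation it describes. The seed value of 0 is an assumption, not something stated above.

```python
def murmur64a(data: bytes, seed: int = 0) -> int:
    """Reference-style MurmurHash64A, masked to 64-bit arithmetic."""
    m = 0xC6A4A7935BD1E995
    r = 47
    mask = 0xFFFFFFFFFFFFFFFF
    h = (seed ^ (len(data) * m)) & mask
    full = len(data) // 8
    for i in range(full):
        k = int.from_bytes(data[i * 8:(i + 1) * 8], "little")
        k = (k * m) & mask
        k ^= k >> r
        k = (k * m) & mask
        h ^= k
        h = (h * m) & mask
    tail = data[full * 8:]
    if tail:
        h ^= int.from_bytes(tail, "little")
        h = (h * m) & mask
    h ^= h >> r
    h = (h * m) & mask
    h ^= h >> r
    return h

def murmur32(data: bytes) -> int:
    return murmur64a(data) >> 32  # the truncated u32 form described above
```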

 

## Table of Contents <!-- omit from toc -->

- [Compressed Bundle Format](#compressed-bundle-format)

  - [Known Bundle Versions](#known-bundle-versions)

  - [Compressed Bundle File Pseudo-Structure](#compressed-bundle-file-pseudo-structure)

- [Uncompressed Bundle Format](#uncompressed-bundle-format)

  - [Bundle Header](#bundle-header)

  - [Reading Resource Metadata](#reading-resource-metadata)

    - [Known Resource Flag Types](#known-resource-flag-types)

  - [Reading Resources](#reading-resources)

 

## Compressed Bundle Format

A compressed VT2 bundle consists of a header that is a version (`u32`) and the bundle's uncompressed size (`u64`). This is followed by a number of compressed chunks. Each chunk is prefixed with the compressed size (`u32`) of the chunk. A value of `65536` (`0x10000`, 64KiB) for the chunk size indicates that the chunk is uncompressed. Each chunk, when decompressed, is exactly `65536` bytes long, with zero-byte padding at the end of the decompressed file to extend the uncompressed size of the last chunk to `65536` bytes.

 

> **Note:** The bundle compressor chunks the file up into `65536` byte segments and then compresses them. If the size of the compressed chunk is bigger than the uncompressed one then the chunk is written into the file uncompressed.

 

The compression used for bundles is currently [Zstandard](https://en.wikipedia.org/wiki/Zstd) (`Zstd`). Before the `2023-10-19` update, bundles used [Zlib](https://en.wikipedia.org/wiki/Zlib) compression instead. The `Zstd` compression used requires a compression dictionary to decompress; this can be found in a file named `compression.dictionary`, located in the same folder as the bundle files.

 

To read a bundle file, simply read the header and then read the length-prefixed chunks until end of stream. Decompress each chunk using an appropriate compression library. The magic header size of both the `Zlib` and `Zstd` compression formats is `u16`.

 

### Known Bundle Versions

 

| Version            | Value        | Notes
|--------------------|--------------|-
| Vermintide 1       | `0xF0000004` |
| Vermintide 2       | `0xF0000005` | Resource update flags added.
| Vermintide 2.X     | `0xF0000006` | Resource total size for each resource included in the bundle's resource meta list.
| Vermintide 2.XZstd | `0xF0000007` | `Zstd` compression introduced for the chunks within compressed bundles, replacing the previous `Zlib` compression.

 

### Compressed Bundle File Pseudo-Structure

```csharp

uint version;

ulong uncompressedSize;

 

foreach (chunk in file) // Read chunks until EOS

{

    uint chunkSize;

    byte[] chunkData[chunkSize]; // Contains compression header. Uncompressed data if chunkSize == 0x10000

}

```

 

The `Zlib` magic header for bundle versions < VT2XZstd is `0x789C` (`u16`).

The `Zstd` magic header for bundle versions >= VT2XZstd is `0x28B5` (`u16`).
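As a hedged illustration of the reading procedure described above, here is a Python sketch that walks the chunks and decompresses them, using the third-party `zstandard` package and the standard-library `zlib` module; the `compression.dictionary` file name comes from the text above.

```python
import struct
import zlib
import zstandard

VT2X_ZSTD = 0xF0000007  # bundle versions >= this use Zstd chunks

def decompress_bundle(bundle_path, dict_path="compression.dictionary"):
    raw = open(bundle_path, "rb").read()
    version, uncompressed_size = struct.unpack_from("<IQ", raw, 0)
    zdict = zstandard.ZstdCompressionDict(open(dict_path, "rb").read())
    zctx = zstandard.ZstdDecompressor(dict_data=zdict)
    out = bytearray()
    pos = 12
    while pos < len(raw):
        (chunk_size,) = struct.unpack_from("<I", raw, pos)
        pos += 4
        chunk = raw[pos:pos + chunk_size]
        pos += chunk_size
        if chunk_size == 0x10000:
            out += chunk                            # stored uncompressed
        elif version >= VT2X_ZSTD:
            out += zctx.decompress(chunk, max_output_size=0x10000)
        else:
            out += zlib.decompress(chunk)
    return bytes(out[:uncompressed_size])           # drop the zero padding of the last chunk
```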

 

## Uncompressed Bundle Format

All bundles in VT2 are compressed unless manually decompressed. This is the format of the bundles after decompression.

 

The initial data is the same as a compressed bundle, with a version and uncompressed bundle size. Then comes the actual data: a resource count (`u32`) followed by 32 hash values (`u64`) representing IDString64 properties for the bundle. It's unclear what exactly these do, but they may have something to do with file-level resource overriding; the ones in the current game version are language prefixes such as `.fr` and `.es`, which makes that seem plausible, though exactly how they're used is unknown.

 

### Bundle Header

```csharp

uint version;

ulong fileSize;

uint entryCount;

ulong[] properties[32]; // 0 = ignore

```

 

After that comes an array of metadata about each resource. The array has a length equal to the previously-written `entryCount`. Each ResourceMeta structure consists of two hash values (`u64`) representing the resource type and the resource name, followed by the resource update flag, a value (`u32`) indicating the resource's existence and whether it was updated in some way. This value is mostly here to support "overriding" files in previous bundles with patch bundles, but it can also indicate the actual location of the resource's data. After the update flag comes the total resource size, a value (`u32`) that is the size of all of the file variants in this resource combined.

 

### Reading Resource Metadata

```csharp

for (int i = 0; i < resourceCount; i++)

{

    ulong typeHash;

    ulong nameHash;

    uint updateFlag; // Only in bundle versions >= VT2

    uint totalSize; // Only in bundle versions >= VT2X

}

```

 

#### Known Resource Flag Types

These are the known update flags for resources. They may not be completely correct as to their purpose, but this is what I believe them to be so far.

| Flag Type     | Value  | Notes
|---------------|--------|-
| None          | `0x00` | Added to the bundle.
| Deleted       | `0x01` | Deleted from the bundle.
| Moved         | `0x02` | Moved to a different path.
| CommonPackage | `0x03` | Resource is located in the common package.

 

After the resource metadata structures comes another array, with the same length (the resource count) as the previous one. This array holds the actual resource structures: two hashes (`u64`) that represent the type and name, a value (`u32`) indicating the number of variants and a value (`u32`) representing an offset into the bundle file's asset `.stream` file where any stream data this resource has can be found.

 

Now come two arrays, both with a length equal to the variant count that was read earlier.

 

The first array contains meta about the resource variant: a value (`u32`) indicating the language of the variant (note: this may have something to do with the bundle properties values), a value (`u32`) representing the size of the variant data within the bundle file, and a value (`u32`) representing the size of the data within the bundle file's asset `.stream` data. (TODO: ensure that this and the stream offset are correct.)

 

The second array simply contains the raw data of each variant, with each chunk of data having a size equal to the variant size that was read earlier.

 

### Reading Resources

 

```csharp

for (int i = 0; i < resourceCount; i++)

{

    // These two hashes should equal their respective resource metadata hashes.

    ulong typeHash;

    ulong nameHash;

    uint variantCount;

    uint streamOffset;

 

    for (int j = 0; j < variantCount; j++)

    {

        uint variantLanguage;

        uint variantSize;

        uint variantStreamSize;

    }

 

    for (int j = 0; j < variantCount; j++)

        byte[] variantData[variantSize];

}

```

Edited by GameBreaker

On 2/21/2024 at 4:41 PM, Hampter49 said:

So, let me get this right.

This will only export meshes, textures, and materials. It won't export armature/rigging/skeleton (because I'm like 90% sure OBJ doesn't support it). [...]

 

Any progress on this? I'm also bumbling through the Hex2Obj part (this is my first time doing anything like this) and doing my best to understand what it's actually doing. I've managed to get QuickBMS / 010 to do their thing, I just need to wrangle with Hex2Obj...

 

edit: So I've read through the tutorial and have been poking through the 010 readout for 0x4A2D0C5CD862E9EA.meshinfo, same as you. Curious to see how this develops. In the tutorial document, the author starts his analysis by identifying the face indices: "The face indices are very easy to find since parts of the indices block look like a scrambled alphabet." That said, in 0x4A2D0C5CD862E9EA.meshinfo (just to keep in line with your writeup from above), there are many sections that could fit that description. My untrained eye is not picking it up xD

Edited by indy_

Hey all, new here. Signed up just to reply after seeing other people attempting to do the same thing, and I don't know if someone has made more progress than I did. Anyway, I've been working on a small tool that can take .meshinfo and .mesh files and turn them into .obj models. I don't have all the information, but I've made significant progress compared to 5 hours ago. Attached is an example export.

Obviously it's not quite right yet, but here's what I figured out:

  • LOD Info entries are slightly different from what's already been shown above:
    • +0x160 is a 32-bit unsigned integer holding the total number of vertices.
    • +0x188 is a 32-bit unsigned integer holding the total number of indices.
    • The maximum number of vertices and indices is 65535, as the stride for indices is only 2 bytes (16-bit unsigned integer).
  • Mesh Info entries:
    • +0x20 appears to be a bitfield, not the LOD id. I have no clear idea how to map Mesh to LOD yet.
    • +0x84 (vertex offset) is relative to the matching LOD. Which I have no idea how to map properly yet.
    • +0x8C (index offset) is relative to the matching LOD.
  • There are three different vertex strides
    • 20 bytes, which appears to be [X, Y, Z], [U, V]
    • 28 bytes. Not sure what this adds yet.
    • 36 bytes. May be adding surface normals, but can't extract these yet without figuring out LOD assignment.
  • Meshes are incredibly small, most of them aren't even bigger than a single world unit.

I don't yet know how to decode indices back into faces. There's probably one or more structures still hiding in meshinfo files that may help me figure that out. Or I might have missed something incredibly simple.


 

Edit: Whoops, I forgot to simply increase the face indices by 1 for .obj export. Now it works.


Edit2: Some more things

  • I don't quite understand how the calculation for idxBlockSize works. It's in bytes, and seemingly always double the index count (which would line up with the 2-byte indices).
  • The .mesh file has triple the number of models in it compared to what the .meshinfo suggests. What are the other two copies?
  • It appears as if the vertex data uses half-floats (16-bit IEEE 754) for some attributes. Texture coordinate data seems to be the prime candidate.
    • There are a few bytes I have not decoded yet.
    • I haven't gotten Blender to even consider importing the texture UVs either; no idea what's going on there.

 

Edit3: It appears as if the conversion to .mesh causes data to be triplicated, resulting in three exactly identical copies of the data following one another. Since none of the offsets point beyond the end of the first section, I think I can ignore the duplicates.


Update 4: I have figured out how to map a mesh to a LOD (screenshot attached).

I've managed to generate .obj models from the 28-byte stride, while being completely unsuccessful with the 20- and 36-byte-stride ones. Those seem to use an even more compressed format, and I have no idea what it is; no matter what, they end up weirdly distorted. Plus, there are still a few unknown fields in the 28-byte stride that seemingly never change.

To match the occasion, I have adjusted the 010 template:

//------------------------------------------------
//--- 010 Editor v14.0 Binary Template
//
//      File: 
//   Authors: 
//   Version: 
//   Purpose: 
//  Category: 
// File Mask: 
//  ID Bytes: 
//   History: 
//------------------------------------------------
OutputPaneClear();LittleEndian();

local uint32 i,j,k,l,m,n,o;

struct HEADER {
    byte Null[40];
    uint32 Unknown_0,Unknown_1,Unknown_2,Unknown_3,
    Unknown_4,Unknown_5,Unknown_6[4],Unknown_7,
    Unknown_8,Unknown_9,LODInfoOffset,Unknown_11,
    MeshMainInfoOffset,Unknown_13,Unknown_14,HashTableOffset;
}Header;

FSeek(Header.LODInfoOffset);
struct LODTABLE {
    uint32 LODCount;
    uint32 Offsets[LODCount];
    struct {
        for (i=0; i < LODCount; i++){
            FSeek(startof(LODTable)+Offsets[i]);
            struct LODINFO {
                uint32 __unk00[17];
                uint32 __unique00;
                uint32 __unk03[70];
                uint32 VtxCount, VtxStride;
                uint32 __unk01[8];
                uint32 IdxCount;
                uint32 __unk02[5];
                uint32 VtxBlockOffset,VtxBlockSize,IdxBlockOffset,IdxBlockSize;
            }LODInfo;    
        }    
    } LODs;
}LODTable;

FSeek(Header.MeshMainInfoOffset);
struct MESHMAININFO {
    uint32 MeshCount;
    uint32 Offsets[MeshCount];
    uint32 MeshCRC[MeshCount];
    uint32 Null;
    struct {
    for (j=0; j < MeshMainInfo.MeshCount; j++){
        FSeek(startof(MeshMainInfo.Offsets[0])+MeshMainInfo.Offsets[j]);
        struct MESHINFO {
            uint32 __unk00[8];
            uint32 flags;
            uint32 __unk01[22];
            uint32 lod_hash;
            uint32 __unk02;
            uint32 VtxOffset,VtxCount,IdxOffset,IdxCount;
        }Mesh;
    };
    } Meshes;
}MeshMainInfo;

FSeek(Header.HashTableOffset);
struct HashTable {
    uint32 Count;
    uint32 LODHashes[Count];
    uint32 __unk00[Count];
    uint32 __unk01[Count];
}HashInfo;
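Tying the updated template and the thread's findings together, here is a heavily hedged Python sketch of an .obj export. None of the following is confirmed: that the LOD block offsets index into the dumped .mesh file, that the per-submesh VtxOffset/IdxOffset are byte offsets relative to the LOD's vertex/index blocks, that MESHINFO.lod_hash matches the LOD's __unique00 field, that positions are three floats at the start of each vertex, or that indices form a 16-bit triangle list local to the submesh. Only positions are written, since the rest of the vertex layout is still unknown.

```python
import struct

def u32s(buf, off, n):
    return struct.unpack_from("<%dI" % n, buf, off)

def export_obj(meshinfo_path, mesh_path, out_path, lod_index=0):
    info = open(meshinfo_path, "rb").read()
    mesh = open(mesh_path, "rb").read()

    hdr = u32s(info, 40, 19)                   # 40 null bytes, then 19 uint32s (see HEADER above)
    lod_table = hdr[13]                        # LODInfoOffset
    mesh_main = hdr[15]                        # MeshMainInfoOffset

    (lod_count,) = u32s(info, lod_table, 1)
    lod_offsets = u32s(info, lod_table + 4, lod_count)
    lod = lod_table + lod_offsets[lod_index]   # LOD offsets are relative to the table start
    lod_id = u32s(info, lod + 17 * 4, 1)[0]    # __unique00 -- assumed to be what lod_hash points at
    vtx_stride = u32s(info, lod + 89 * 4, 1)[0]
    vtx_blk, _vsz, idx_blk, _isz = u32s(info, lod + 104 * 4, 4)

    (mesh_count,) = u32s(info, mesh_main, 1)
    mesh_offsets = u32s(info, mesh_main + 4, mesh_count)
    base = mesh_main + 4                       # startof(Offsets[0]) in the template

    with open(out_path, "w") as obj:
        written = 0
        for k, mo in enumerate(mesh_offsets):
            m = base + mo
            lod_hash = u32s(info, m + 31 * 4, 1)[0]
            if lod_hash != lod_id:             # assumed mesh-to-LOD mapping
                continue
            vtx_off, v_cnt, idx_off, i_cnt = u32s(info, m + 33 * 4, 4)
            obj.write("o submesh_%d\n" % k)
            for v in range(v_cnt):             # positions assumed to be 3 floats at stride start
                x, y, z = struct.unpack_from("<3f", mesh, vtx_blk + vtx_off + v * vtx_stride)
                obj.write("v %f %f %f\n" % (x, y, z))
            idx = struct.unpack_from("<%dH" % i_cnt, mesh, idx_blk + idx_off)
            for t in range(0, i_cnt - 2, 3):   # triangle list; OBJ indices are 1-based
                obj.write("f %d %d %d\n" % (idx[t] + written + 1,
                                            idx[t + 1] + written + 1,
                                            idx[t + 2] + written + 1))
            written += v_cnt
```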

 

 

Edited by Xaymar
Got it to work. Found more things. Updated 010 script for successful LOD export.

There are also dword indices, but I didn't find any pointer to them. So the easy way could be:

If the index count isn't bigger than 65535, then int16.

If the index count is bigger than 65535, then int32.

Also, the vtx offset is > vtx count * stride,

and the idx offset is > idx count * 2.
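That heuristic as a tiny Python helper; it is a guess keyed off the count, not something stored in the file:

```python
import struct

def read_indices(buf, offset, count):
    # Assumption from the post above: 16-bit indices unless the count
    # exceeds 65535, in which case fall back to 32-bit.
    fmt = ("<%dH" if count <= 0xFFFF else "<%dI") % count
    return struct.unpack_from(fmt, buf, offset)
```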

