Leaderboard
Popular Content
Showing content with the highest reputation since 10/21/2025 in all areas
-
Version 1.0.0
12 downloads
Broken Sword - Shadow of the Templars (1996)
* bs1_1996_clu_export.py
* bs1_1996_clu_import_and_patch.py
Required: Python. When installing, make sure to check "Add python.exe to PATH."
Usage:
* Copy the swordres.rif and text.clu files into the same folder as the .py files.
* Run bs1_1996_clu_export.py: this will extract the texts into Text_exported.txt.
* Translate it, then rename the finished file to Text_translated.txt.
* Run bs1_1996_clu_import_and_patch.py: this will insert the translated texts back into text.clu and modify swordres.rif.
* The new files will be created with the _new suffix.
Tested with GOG (2.0.0.8) and the Steam Reforged Free DLC.
FEARka
3 points
-
I am updating the Noesis script from this post to handle more versions of the rsf 3D model format. I will release an update to the script soon. Meanwhile, as a sample, I want to show some extracted models that we were not able to extract with the old script. In the top-left corner is the Orange Bowl stadium from NCAA 08. In the top-right corner is the hologram (a fictitious one) stadium from Madden 13. The bottom-left corner shows the LA Memorial Coliseum from Madden 12 or 13, and finally in the bottom-right corner is the Louisiana Tech stadium from NCAA 12. All of these rsf files come from the PS3 versions of the games, with data in big endian. The script can handle little-endian data too, for example rsf files coming from PS Vita games. I am almost 100% sure that the updated script will be able to handle rsf files from NCAA 08 to NCAA 14 and Madden 07 to Madden 17 with no issues.
2 points
-
I have released an early version of the tool that can do just meshes with their material names/skeleton:
2 points
-
Here is my analysis:
Header: 24 bytes
[ Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding ]
Buckets: 24-528 bytes, depending on the allocated bucket count
TableEntries: EntryCount * [ 8-byte Hash (or id?), Int32 RelativeOffset (formula: text_start = current_entry_offset + 8 + value), Int32 TextLength ]
Values: ValueCount * [ Byte[ValueLength] Data ]
Null values have zero length and no hash. I can successfully unpack and repack, and the game loads the new text normally.
1 point
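To make the layout above concrete, here is a minimal parser sketch based on that analysis. It is not the poster's tool: the bucket block size varies (24-528 bytes), so the table offset is passed in as an assumption, and the field names simply mirror the analysis.

```python
import struct

def read_text_entries(path, table_offset):
    """Parse the table described above. table_offset must point past the
    variable-size bucket block, which this sketch does not try to compute."""
    with open(path, "rb") as f:
        data = f.read()

    # Header: Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding
    entry_count, value_count, timestamp, _pad = struct.unpack_from("<qqii", data, 0)

    entries = []
    pos = table_offset
    for _ in range(entry_count):
        # 8-byte hash/id, Int32 RelativeOffset, Int32 TextLength
        entry_hash, rel_offset, text_len = struct.unpack_from("<Qii", data, pos)
        if text_len > 0:
            # text_start = current_entry_offset + 8 + RelativeOffset (formula above)
            start = pos + 8 + rel_offset
            text = data[start:start + text_len]
        else:
            text = b""  # null values have zero length and no hash
        entries.append((entry_hash, text))
        pos += 16  # 8-byte hash + two Int32 fields
    return entries
```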
-
I found the solution. Use the pak.py python script (i uploaded) in the same directory as the .data file This will extract the UE4 .pak file from the .data file Since the version is 4.27, use the QuickBMS 4.27 Unreal Engine Script to extract all data! Then you can use UModel to extract models/audio etc! # Unreal Engine 4 - Unreal Tournament 4 (*WindowsNoEditor.pak) (script 0.4.27e) # script for QuickBMS http://quickbms.aluigi.org math NO_TAIL_INFO = 0 # set it to 1 for archives with corrupt/missing tail information (extract without index) math VERSION = 3 # set it to 3 if NO_TAIL_INFO = 1 for most of modern games quickbmsver "0.12" callfunction QUICKBMS_4GB_CHECK 1 # set your AES_KEY here as umodel hex ("0x1122...") or C string ("\x11\x22...") # don't change AES_KEY_IS_SET, it will be handled automatically set AES_KEY binary "" math TOC_FILE = 0 math ALTERNATIVE_MODE = 0 math AES_KEY_IS_SET = 0 math BASE_PATH_INCLUDED = 1 math DIR_FLAG = 1 math NAME_FROM_ARRAY = 0 math SKIP_COUNT = 0 get ARCHIVE_NAME basename get ARCHIVE_PATH FILEPATH math CHUNK_OFFSET_ABSOLUTE = -1 # default, enabled # 1 = HIT math WORKAROUND = 0 if NO_TAIL_INFO != 0 get OFFSET asize math ALTERNATIVE_MODE = 1 else goto -0xcc # version 11 (4.26-4.27) savepos MAGIC_OFF get MAGIC long get VERSION long endian guess VERSION get OFFSET longlong get SIZE longlong getdstring HASH 20 xmath SIZE "MAGIC_OFF - OFFSET - 1" get FSIZE asize savepos CUR_POS if CUR_POS = FSIZE string COMP1 = "" else get CHECK byte if CHECK > 1 goto -1 0 SEEK_CUR endif getdstring COMP1 32 getdstring COMP2 32 string COMP1 l COMP1 string COMP2 l COMP2 endif if VERSION >= 3 goto MAGIC_OFF goto -1 0 SEEK_CUR get ENCRYPTED byte if ENCRYPTED != 0 callfunction SET_AES_KEY 1 log MEMORY_FILE5 OFFSET SIZE encryption "" "" else log MEMORY_FILE5 OFFSET SIZE endif math TOC_FILE5 = -5 endif goto 0 callfunction GET_BASE_PATH 1 endif get FILES long TOC_FILE5 getdstring DUMMY 12 TOC_FILE5 get HASHES_OFFSET longlong TOC_FILE5 math HASHES_OFFSET - OFFSET get HASHES_SIZE longlong TOC_FILE5 getdstring DUMMY 24 TOC_FILE5 get NAMES_OFFSET longlong TOC_FILE5 math NAMES_OFFSET - OFFSET get NAMES_SIZE longlong TOC_FILE5 getdstring DUMMY 24 TOC_FILE5 savepos BASE_INDEX_OFF TOC_FILE5 goto NAMES_OFFSET TOC_FILE5 math CHUNK_SIZE = 0x10000 # just in case... 
for i = 0 < FILES callfunction GET_NAME_AND_OFFSET 1 if NAME = "" continue NEXT0 endif savepos TMP_OFF TOC_FILE get OFFSET longlong TOC_FILE get ZSIZE longlong TOC_FILE get SIZE longlong TOC_FILE get ZIP long TOC_FILE getdstring HASH 20 TOC_FILE math CHUNKS = 0 math ENCRYPTED = 0 if VERSION >= 3 if ZIP != 0 get CHUNKS long TOC_FILE for x = 0 < CHUNKS get CHUNK_OFFSET longlong TOC_FILE get CHUNK_END_OFFSET longlong TOC_FILE putarray 0 x CHUNK_OFFSET putarray 1 x CHUNK_END_OFFSET next x endif get ENCRYPTED byte TOC_FILE get CHUNK_SIZE long TOC_FILE endif #if ALTERNATIVE_MODE != 0 savepos TMP_OFF TOC_FILE math OFFSET + TMP_OFF #endif #comtype copy callfunction COMPRESSION_TYPE 1 if CHUNKS > 0 log NAME 0 0 append math TMP_SIZE = SIZE if CHUNK_OFFSET_ABSOLUTE < 0 && OFFSET != 0 getarray CHUNK_OFFSET 0 0 if CHUNK_OFFSET u< OFFSET math CHUNK_OFFSET_ABSOLUTE = 0 else math CHUNK_OFFSET_ABSOLUTE = 1 endif endif for x = 0 < CHUNKS getarray CHUNK_OFFSET 0 x getarray CHUNK_END_OFFSET 1 x math CHUNK_ZSIZE = CHUNK_END_OFFSET math CHUNK_ZSIZE - CHUNK_OFFSET math CHUNK_XSIZE = CHUNK_ZSIZE if ENCRYPTED != 0 callfunction SET_AES_KEY 1 math CHUNK_XSIZE x 16 endif if TMP_SIZE u< CHUNK_SIZE math CHUNK_SIZE = TMP_SIZE endif math CHUNK_OFFSET = OFFSET if ZIP == 0 log NAME CHUNK_OFFSET CHUNK_SIZE 0 CHUNK_XSIZE else clog NAME CHUNK_OFFSET CHUNK_ZSIZE CHUNK_SIZE 0 CHUNK_XSIZE endif math TMP_SIZE - CHUNK_SIZE math OFFSET + CHUNK_XSIZE next x append else # the file offset points to an entry containing # the "same" OFFSET ZSIZE SIZE ZIP HASH ZERO fields, # just an additional backup... so let's skip them savepos BASE_OFF TOC_FILE math BASE_OFF - TMP_OFF math OFFSET + BASE_OFF math XSIZE = ZSIZE if ENCRYPTED != 0 callfunction SET_AES_KEY 1 math XSIZE x 16 endif if ZIP == 0 math BLOCK = 0x40000000 xmath FSIZE "OFFSET + ZSIZE" log NAME 0 0 append for OFFSET = OFFSET < FSIZE xmath DIFF "FSIZE - OFFSET" if DIFF < BLOCK math XSIZE = DIFF if ENCRYPTED != 0 math XSIZE x 16 endif log NAME OFFSET DIFF 0 XSIZE else log NAME OFFSET BLOCK endif math OFFSET + BLOCK next append else clog NAME OFFSET ZSIZE SIZE 0 XSIZE endif endif encryption "" "" if ALTERNATIVE_MODE != 0 if CHUNKS == 0 math OFFSET + XSIZE endif goto OFFSET get TMP1 longlong get CHECK byte if TMP1 == 0 && CHECK != 0 goto OFFSET continue NEXT1 else goto OFFSET endif xmath CHECK "0x800 - (OFFSET % 0x800)" if CHECK <= 16 padding 0x800 endif savepos OFFSET get TMP1 longlong get TMP2 longlong if TMP2 == 0 padding 0x800 else goto OFFSET endif label NEXT1 endif label NEXT0 next i print "\nEntries ignored: %SKIP_COUNT%" for i = 0 < SKIP_COUNT getarray NAME 7 i print "Ignored entry: %NAME%" next i startfunction SET_AES_KEY_ASK math AES_KEY_IS_SET = 1 print "The archive is encrypted, you need to provide the key" if AES_KEY == "" set KEY unknown "???" 
else set KEY binary AES_KEY endif if KEY == "" math AES_KEY_IS_SET = -1 set AES_KEY string "No key provided, encryption disabled" elif KEY strncmp "0x" string KEY << 2 string AES_KEY h KEY else set AES_KEY binary KEY endif print "KEY: %AES_KEY%" endfunction startfunction SET_AES_KEY if AES_KEY_IS_SET == 0 callfunction SET_AES_KEY_ASK 1 endif if AES_KEY_IS_SET > 0 encryption aes AES_KEY "" 0 32 endif endfunction startfunction GET_BASE_PATH get NAMESZ long TOC_FILE5 getdstring BASE_PATH NAMESZ TOC_FILE5 if NAMESZ != 0x0A && NAMESZ < 0xFF string BASE_PATH | "../../../" math BASE_PATH_INCLUDED = 0 endif endfunction startfunction CHECK_UNICODE if NAMESZ >= 0 getdstring RESULT NAMESZ TOC_FILE5 else math NAMESZ n NAMESZ math NAMESZ * 2 getdstring RESULT NAMESZ TOC_FILE5 set RESULT unicode RESULT endif endfunction startfunction GET_NAME_AND_OFFSET if NAME_FROM_ARRAY = 1 if CURR_NAME < DIR_FILES getarray NAME 5 CURR_NAME getarray OFFSET 6 CURR_NAME goto OFFSET math CURR_NAME + 1 if CURR_NAME = DIR_FILES math NAME_FROM_ARRAY = 0 endif endif else if DIR_FLAG = 1 get DIR_COUNT long TOC_FILE5 math DIR_FLAG = 0 endif if DIR_COUNT = 0 math DIR_FLAG = 1 callfunction GET_NAME_AND_OFFSET 1 else math DIR_COUNT - 1 get NAMESZ signed_long TOC_FILE5 callfunction CHECK_UNICODE 1 string DIR_NAME = RESULT get DIR_FILES long TOC_FILE5 if DIR_FILES = 0 callfunction GET_NAME_AND_OFFSET 1 else for y = 0 < DIR_FILES get NAMESZ signed_long TOC_FILE5 callfunction CHECK_UNICODE 1 string NAME = RESULT string NAME p "%s%s" DIR_NAME NAME if BASE_PATH_INCLUDED == 0 string NAME p "%s%s" BASE_PATH NAME endif putarray 5 y NAME get OFFSET long TOC_FILE5 savepos TMP_INDEX_OFF TOC_FILE5 if OFFSET != 0x80000000 && OFFSET != 0x7FFFFFFF xmath INDEX_OFF "BASE_INDEX_OFF + OFFSET" goto INDEX_OFF TOC_FILE5 get FLAGS long TOC_FILE5 xmath HAS_SIZE "FLAGS & 0x3F" xmath IS_64 "FLAGS >> 28" if HAS_SIZE = 0x3F get CHUNK_SIZE long TOC_FILE5 endif if IS_64 = 0xE get OFFSET long TOC_FILE5 else get OFFSET longlong TOC_FILE5 endif else putarray 7 SKIP_COUNT NAME math SKIP_COUNT + 1 string NAME = "" putarray 5 y NAME endif putarray 6 y OFFSET goto TMP_INDEX_OFF TOC_FILE5 next y math NAME_FROM_ARRAY = 1 math CURR_NAME = 0 callfunction GET_NAME_AND_OFFSET 1 endif endif endif endfunction startfunction COMPRESSION_TYPE if COMP1 = "" comtype zlib endif if ZIP = 1 && COMP1 = "zlib" comtype zlib elif ZIP = 1 && COMP1 = "zstd" comtype zstd elif ZIP = 1 && COMP1 = "oodle" comtype oodle elif ZIP = 1 && COMP1 = "lz4" comtype lz4 elif ZIP = 1 && COMP1 = "gzip" comtype gzip elif ZIP = 2 && COMP2 = "zlib" comtype zlib elif ZIP = 2 && COMP2 = "zstd" comtype zstd elif ZIP = 2 && COMP2 = "oodle" comtype oodle elif ZIP = 2 && COMP2 = "lz4" comtype lz4 elif ZIP = 2 && COMP2 = "gzip" comtype gzip elif ZIP = 3 || ZIP = 4 || ZIP = 0x10 # 3 - Faith of Danschant, 4 - Days Gone, 10 - Ashen comtype oodle if WORKAROUND == 2 comtype lz4 endif endif endfunction startfunction QUICKBMS_4GB_CHECK math TMP64 = 0x10000000 math TMP64 * 16 if TMP64 == 0 print "You must use quickbms_4gb_files.exe with this script!" cleanexit endif endfunction pak.py1 point
-
Version 1.1
470 downloads
Tools for Battlefield 6 beta. Currently supports dumping the game, export models/maps. Usage is similar to previous tools for frostbite engine. toc_bf6.exe - dump tool Change .ini file parameters: - game path - dump path - selection to dump "ebx", "res", "chunks" or "all" Then drop any .toc file onto .exe to dump assets. Or run from command line with 1 parameter - toc file name. Fb_bf6_mesh.exe - model tool Takes .MeshSet as parameter. ske_soldier_3p.ebx - main universal skeleton for soldiers. Must be in the same folder. If you need another skeleton, use its name as 2nd parameter. Or rename it to ske_soldier_3p.ebx. Tool will try to find chunks automatically. If not, it gives error message with chunk name. Map export 1. Create database Run fb_maps_bf6_db.exe tool once, it will scan whole dump for meshsets and blueprints, so later maps can be converted fast, without the need to go into whole tree of assets. This will take a few minutes. After that, 2 files will be created: bp.db & meshnames.txt, which need to stay in the same folder with EXE for main tool to work. 2. Export maps Use fb_maps_bf6.exe (main map tool) to convert maps. Drop any EBX on it, use in command line with 1 parameter, or create a batch. 3. Terrain export Main terrain data is in .TerrainStreamingTree files for each level. For some levels, these files are small, which means the actual data is in chunks. Sometimes data is in the file itself, in this case it may be big, about 50mb in size. Drop .TerrainStreamingTree on fb_terrain_bf6.exe or use command line.1 point -
Animation file from FGO Arcade. It uses the same engine as various Project DIVA titles, but the animation files are formatted in a different way (.mot). Tool: https://github.com/h-kidd/noesis-project-diva (it works with FGO Arcade's model files and with .mot files from Miracle Girls Festival and Project DIVA, but not with FGO Arcade's .mot files; you can edit the tool's source code to try to make it work with this game's .mot files). Sample file is in the attachment.
mot_svt_0001.zip
1 point
-
In the ..var01.st2 csv data is contained: edit: and xml data: <!-- ______________________________________________________________________________ Copyright 2004 The Collective, Inc. DISMEMBERMENT DEFINITION Character: Clone Trooper Author: Baback Elmieh Date: 01/07/2004 ______________________________________________________________________________ --> <!-- HEAD --> <DismemberablePart Name="Head" Hitpoints="25"> <!-- The Materials section is a list of materials in the original mesh that are to be turned off when the part is dismembered --> <!-- <Materials> <Material Name="headSG"/> </Materials> --> <!-- ReactionProcessing defines the chunks and particles to be spawned when a reaction dismemberment is processed for the character the definition requires a Bone from which a chunk should be spawned and the name of the chunkmesh. The ChunkMesh definition in turn can have several values set such as GravityScale and UseGinFile. GravityScale greater than 1.0 pulls a chunk down faster, UseGinFile will look for a bounding box with the same name as the chunkmesh in the damage mesh's gin file, if the bound is found, it is used instead of the default rendering bound which can help artists orientate chunks so that they land on their correct side --> <ReactionProcessing> <!-- particles --> <Particles> <!-- spark particle from the joint --> <Param Name="BoneEmission" Value="neck_g"> <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/> </Param> </Particles> <!-- chunks --> <!-- <Chunk Typename="TSingleChunk"> <Param Name="Bone" Value="neck_g"> <Param Name="ChunkMesh" Value="head"/> <Param Name="GravityScale" Value="1.4"/> <Param Name="RandomVelocityScale" Value="0.2"/> </Param> </Chunk> --> </ReactionProcessing> <!-- The Capsules section provides a list of capsules that should affect the hitpoint of the part and should be disabled once the chunk has been dismembered --> <Capsules> <Capsule Name="Dneck_g"/> </Capsules> </DismemberablePart> <!-- LEFT SHOULDER --> <DismemberablePart Name="Left Shoulder" Hitpoints="25"> <Materials> <Material Name="Shoulder_LSG"/> </Materials> <ReactionProcessing> <!-- particles --> <Particles> <!-- spark particle from the joint --> <Param Name="BoneEmission" Value="shoulder_L_g"> <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/> </Param> </Particles> <!-- chunks --> <Chunk Typename="TSingleChunk"> <Param Name="Bone" Value="shoulder_L_g"> <Param Name="ChunkMesh" Value="Shoulder_L"/> <Param Name="GravityScale" Value="1.8"/> <Param Name="RandomVelocityScale" Value="0.1"/> </Param> </Chunk> </ReactionProcessing> <Capsules> <Capsule Name="Dshoulder_L_g"/> </Capsules> </DismemberablePart> <!-- RIGHT ELBOW --> <DismemberablePart Name="Right Shoulder" Hitpoints="25"> <Materials> <Material Name="Elbow_RSG"/> </Materials> <ReactionProcessing> <!-- particles --> <Particles> <!-- spark particle from the joint --> <Param Name="BoneEmission" Value="shoulder_R_g"> <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/> </Param> </Particles> <!-- chunks --> <Chunk Typename="TSingleChunk"> <Param Name="Bone" Value="shoulder_R_g"> <Param Name="ChunkMesh" Value="Elbow_R"/> <Param Name="GravityScale" Value="1.8"/> <Param Name="RandomVelocityScale" Value="0.1"/> </Param> </Chunk> </ReactionProcessing> <Capsules> <Capsule Name="Dshoulder_R_g"/> </Capsules> </DismemberablePart> </DismembermentDefinition>1 point
-
Hello, I have managed to get the game files and loaded them into AssetStudio to view them. I found Texture2Ds and Sprites, but some of the assets are missing. For example, there are literally no audio/voice files at all. Then I noticed that AssetStudio doesn't recognize the assets inside a folder called "ondemand"; there are about 2k assets there and I think they are encrypted/compressed. Here is one example of the encrypted assets: Is there a way to decrypt/decompress this type of file? I think those are the remaining assets. If anyone can help me I would really appreciate it.
5db8fd68-da55-9c4a-c71f-84af76d61103.7z
1 point
-
Yeah, I'm working on BHD, but mostly focused on the JO/DFX2 engine, which is slightly newer and a different format. I'll post here when/if I get BHD usable.
1 point
-
Yes! I'll have to create a tool to merge and split the images, so I can merge them, edit them, and later split them again to insert them back.
1 point
-
I made the decompressor and compressor, but in C++. I still need to test the compressor though, because it compresses better than the original. Test it in game and show the results if it works.
MACROSS_PS1_TOOL.zip
1 point
-
Edit - just tested it, and MRT no. 4 is UV; you were right in saying the 4th one is the UV maps, as a rule.
1 point
-
Hi experts. I'm trying to read the MT2 images from Indiana Jones and the Emperor's Tomb (PS2), but I'm stuck trying to piece it all together. The images in the PC version are 32-bit RGBA, so that's easy, and gives us a comparison. For the same image in the PS2 version, it seems to be broken down like this...
64 bytes - basic image data (filename, width, height, etc.)
width*height/2 - 4-bit pixel values (with 4-bit PS2 swizzling)
width*height/8 - color values
This is what the pixel block looks like, as 4-bit values in grayscale: The color values in the last block look to be something similar to RGBA5551, so even though the length of this block is width*height/8, as they are 16-bit colors, there are actually only width*height/16 colors. This is what the color block looks like when read as RGBA5551 color values: ... you can see the image in that "color block". I know it's not quite the right colors, but I think it might just need some color striping applied to it, or it might not be exactly RGBA5551. As the width/height increases, so does the size of the color block, so it's not a plain palette; it's proportional to the image size. For example, for a 128x128 image, the color block is 2048 bytes. For a 256x256 image, the color block is 8192 bytes. I'm struggling to work out how to join the "pixel" block and the "color" block together so that we end up with a usable image that looks similar to the PC image. I'm assuming maybe we need to generate some in-between colors like in a DDS image, or otherwise apply some kind of "intensity" to the colors, or something like that. I've never seen anything like this before - is anyone able to assist in understanding this please? I have attached a ZIP with the PC image, the PS2 image file, and PNGs for both the "pixel" block and the "color" block, so you can clearly see there is a correlation between the 2 blocks. Thanks for your help!
vinehead.zip
1 point
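As a starting point for experimenting, here is a small sketch that just splits an MT2 file into the three blocks described above and decodes the last block as RGBA5551. It follows the poster's own reading of the layout, so the header size, bit layout and channel order are all assumptions; it does not attempt the unswizzling or the pixel/colour combination that is still unknown.

```python
import struct

def split_mt2_blocks(path, width, height, header_size=64):
    # Sizes taken from the description above; the header layout is not parsed here.
    with open(path, "rb") as f:
        header = f.read(header_size)
        pixel_block = f.read(width * height // 2)   # 4-bit values, PS2-swizzled
        color_block = f.read(width * height // 8)   # 16-bit values, roughly RGBA5551
    return header, pixel_block, color_block

def decode_rgba5551(block):
    """Interpret 16-bit little-endian values as RGBA5551 (an assumption)."""
    colors = []
    for (v,) in struct.iter_unpack("<H", block):
        r = (v & 0x1F) << 3
        g = ((v >> 5) & 0x1F) << 3
        b = ((v >> 10) & 0x1F) << 3
        a = 255 if (v >> 15) else 0
        colors.append((r, g, b, a))
    return colors
```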
-
Could anybody share mot, tex_db.bin and a model .bin file of a character?
1 point
-
Well, use my old C++ tool, it should work now. I tried to rewrite it in Python to practice Python syntax, and maybe I did something wrong.
ZstdMagicExtractor.zip
ZstdMagicExtractor-release version.zip
1 point
-
Try this tool; I made some adjustments to read your file.
zstd decompressor.zip
1 point
-
It's not just 1 block of data; there are multiple compressed ZSTD blocks in your sample file that have to be joined together - e.g. at 0, 0x129b0, 0x31dd0, etc. It looks as though each block is preceded by the compressed size and another value, except the first block, which looks to have a compressed size of 0x129a0. You might have cut that bit off in your sample. Each block seems to decompress to 0x40000 bytes except for the last one, which is shorter. I guess the header might have some useful info.
1 point
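A rough way to try this without fully decoding the per-block headers is to locate each ZSTD frame by its magic bytes and decompress the frames one by one, concatenating the output. This is only a sketch of that idea using the python-zstandard package, not a parser for the container format, and it assumes that a fresh decompressobj() stops at the end of a single frame.

```python
import zstandard  # pip install zstandard

ZSTD_MAGIC = b"\x28\xb5\x2f\xfd"  # little-endian magic at the start of every zstd frame

def join_zstd_blocks(in_path, out_path):
    data = open(in_path, "rb").read()
    out = bytearray()
    pos = data.find(ZSTD_MAGIC)
    while pos != -1:
        # A fresh decompression object stops after one frame, so the next
        # block's header bytes that follow the frame are simply ignored.
        dobj = zstandard.ZstdDecompressor().decompressobj()
        out += dobj.decompress(data[pos:])
        pos = data.find(ZSTD_MAGIC, pos + 4)
    open(out_path, "wb").write(bytes(out))
    return len(out)
```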
-
I've just released a new version of ImageHeat 🙂
https://github.com/bartlomiejduda/ImageHeat/releases/tag/v0.31.2
Changelog:
- Added new pixel formats: ALPHA4, ALPHA4_16X, ALPHA8, ALPHA8_16X, RGBA6666, RGBX6666, BGRT5551, BGRT8888, PAL8_TZAR, BGRA5551, BGRA5551_TZAR, BGRA8888_TZAR, BGRA4444_LEAPSTER
- Added support for LZ4, Emergency RLE, Neversoft RLE, Tzar RLE, Leapster RLE, Reversed TGA RLE
- Fixed issue with X360 swizzling
- Fixed issue with PS Vita/Morton swizzling for 4-bpp images
- Added support for palette value scaling (1x, 2x, 4x, 8x, 16x)
- Added dropdown for palette scaling in the "Palette Parameters" box
- Added funding info
1 point
-
Hey all, I also recently got interested in modding the original QP Shooting. I'm currently working on a command-line tool that so far allows extracting and repacking the LAG assets, with decoding and encoding of dialogue/system files also planned. However, after reviewing this thread it seems I wrongly assumed that the graphics were red-blue-swapped A16B16G16R16 DDS surfaces rather than a special Luna/LAG image format... so that's probably another thing I need to fix up (although the assets can be modified fine with an editor that supports that DDS format once the header is written, so maybe it is just a slightly tweaked version of DDS). I'll post the GitHub link here when I polish it and finish it up : )
1 point
-
@ikskoks Thank you for this script, I was also able to use it successfully. I had a question about the outputted `.kiw` files: there is bytecode in each file that corresponds to in-game instructions like character dialogue expressions, choices, background images and music cues. What is the best way to reverse engineer the bytecode in these files? I am not sure which game engine made this, but are there any existing parsers or scripts for `.kiw` or similar formats that you know of?
1 point
-
Because the fmlb and sound files don't exist anymore; before the game shut down, those files were dynamic content. Some files like that are available in the beta version APK and OBB, but not all of them.
1 point
-
Hello, you really should put more effort into your request. People who could help don't have all the time in the world to unpack a vfs, search for the sggr in question, figure out which samples have the "lod problem", etc., etc. WHY not simply upload the samples in question plus a description of what EXACTLY you've done so far to get UVs? Your post here is not very insightful, imho.
1 point
-
import struct
import os
import sys
from PIL import Image

# Constants for offsets
FILE_COUNT_OFFSET = 0x28
FILENAME_TABLE_OFFSET = 0x2C
DATA_INDEX_OFFSET = 0x178C

# A SINGLE, CORRECT LOOKUP TABLE FOR IMAGE DIMENSIONS BASED ON FILE SIZE.
# All files are grayscale (1 byte/pixel).
SIZE_TO_DIMENSIONS = {
    262144: (512, 512),
    524288: (1024, 512),
    1048576: (1024, 1024)
    # Add more entries here if new sizes appear
}

def extract_grayscale_images(file_path, output_dir):
    """
    Extracts all image files from a given .dat archive,
    assuming they are raw grayscale pixel data.
    """
    with open(file_path, 'rb') as f:
        # Read the total number of files in the archive
        f.seek(FILE_COUNT_OFFSET)
        file_count = struct.unpack('<I', f.read(4))[0]

        # Read the filename table
        f.seek(FILENAME_TABLE_OFFSET)
        filenames = []
        for _ in range(file_count):
            try:
                name_len = struct.unpack('<I', f.read(4))[0]
                filenames.append(f.read(name_len).decode('ascii', errors='replace'))
            except (struct.error, IndexError):
                # Stop if the file ends unexpectedly
                break

        # Move to the data index table
        f.seek(DATA_INDEX_OFFSET)

        print(f"Starting extraction of {len(filenames)} files (all as grayscale)...\n")
        success_count, skipped_count = 0, 0

        for i, filename in enumerate(filenames):
            try:
                # Read the offset and size for the current file
                entry_data = f.read(8)
                if len(entry_data) < 8:
                    break  # Reached end of index
                offset, size = struct.unpack('<II', entry_data)

                # Check if we know the dimensions for this file size
                if size not in SIZE_TO_DIMENSIONS:
                    print(f"[{i+1}/{file_count}] {filename} -> UNKNOWN SIZE ({size} B). Skipping.")
                    skipped_count += 1
                    continue

                width, height = SIZE_TO_DIMENSIONS[size]

                # Read the pixel data
                current_pos = f.tell()  # Save current position in the index
                f.seek(offset)
                pixel_data = f.read(size)
                f.seek(current_pos)  # Return to the index

                # Prepare the output path, preserving directory structure
                safe_name = filename.replace('.dds', '.png').replace('\\', os.path.sep).strip(os.path.sep)
                out_path = os.path.join(output_dir, safe_name)
                os.makedirs(os.path.dirname(out_path), exist_ok=True)

                # ALWAYS convert as 'L' (grayscale)
                img = Image.frombytes('L', (width, height), pixel_data)
                img.save(out_path, 'PNG')

                print(f"[{i+1}/{file_count}] {filename} -> {out_path} ({width}x{height}, Grayscale)")
                success_count += 1
            except Exception as e:
                print(f"[{i+1}/{file_count}] {filename} -> CRITICAL ERROR: {e}")
                skipped_count += 1

    print(f"\nFinished! Converted: {success_count}, Skipped: {skipped_count}.")

if __name__ == "__main__":
    if len(sys.argv) < 2:
        print(f"Usage: python {os.path.basename(__file__)} <file.dat>")
        sys.exit(1)

    input_file = sys.argv[1]
    # Create a more standard output directory name
    output_dir = os.path.splitext(input_file)[0] + "_extracted_images"

    print(f"Input file: {input_file}")
    print(f"Output directory: {output_dir}\n")

    extract_grayscale_images(input_file, output_dir)

This script unpacks the textures. The file names are given with the .dds extension, but these are not DDS files. There are two types of files: those with [e] in the name build correctly, and those without [e] come out strange. That's all I can help with.
Script usage: python <scriptname>.py <path to file>
e.g., python unpack.py sky.t000 - if it's in the same directory as the script.
1 point
-
It seems like at 0x24 is the vert count (0x4F8 = 1272), then at 0x40 is a pointer to the vertices (0x70780). After that there is a line of zeros and then another block of vertices, maybe another face? (I don't know why there are 2 faces.) But the vert count includes that line of zeros. I tried ignoring that line and then including it, but it's the same result. The only difference is that large triangle that goes to the floor. So in both cases it's a mess, lol. Oh, and probably after the pointer to vertices there are pointers to Normals (0x44), UVs (0x48) and vertex colors (0x4C). I don't understand, this is so weird...
1 point
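For quickly checking these guesses against other samples, a tiny dump script like the one below can help. The offsets and field meanings are exactly the assumptions from the post (vert count at 0x24, pointers at 0x40-0x4C), nothing more.

```python
import struct
import sys

def dump_header_guess(path):
    data = open(path, "rb").read()
    vert_count = struct.unpack_from("<I", data, 0x24)[0]       # 0x4F8 = 1272 in this sample
    vtx, nrm, uv, col = struct.unpack_from("<4I", data, 0x40)  # assumed pointer block
    print(f"verts={vert_count}  vertices@{vtx:#x}  normals?@{nrm:#x}  uvs?@{uv:#x}  colors?@{col:#x}")

if __name__ == "__main__":
    dump_header_guess(sys.argv[1])
```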
-
Oh man, I'm interested in this. I hack the N64 AKI Wrestling games, and wondered if there was anything for Def Jam Vendetta. Where can I find this Byte Map tool?
1 point
-
I updated my plugin: fmt_DS2_PS3_geo.py
*(The UVs file must be located with the model, either in the "MeshVolatile" subfolder or next to the model.)
1 point
-
(Just my old plugin that was made from example files from another thread.)
fmt_kn5.py
*(doesn't support encryption)
1 point
-
Version 0.0.2
16 downloads
An addon for Blender 4.3.0 (also tested with 4.4.3) to import and export the .msh, .bn (.bbx goes together) and .ani files for RF Online. The entity (R3E) and map (BSP) formats are import only. Import operations work with drag and drop. There is code for exporting the BSP format inside the addon code but it is deactivated due to being incomplete. It only reaches so far as actually exporting walkable map geometry (with the BSP structure also built) and baking+exporting the light maps. Unfortunately, Blender proved to not be very suitable for the task of actually being a complete map editor for RF Online, mostly due to complexity issues with the .SPT particle format and other desirable features that would be hard to implement into it, such as mob spawn areas and portals. The R3M materials are also quite hard to simulate, since the original engine rendered the same mesh multiple times for each texture layer they had. It is possible to reactivate the feature by manually uncommenting the three commented lines in the bsp.py's menu_func_export, register and unregister functions. Expect no support for this feature, as the more proper solution would be writing a proper dedicated software. Current Features: MSH (Mesh) Import: Imports .msh static meshes (Standard and MESH08 formats). Automatically attempts to find and assign textures by looking for DDS files referenced in the mesh or by searching .RFS archives in expected relative paths (../Tex/). MSH (Mesh) Export: Exports selected Blender mesh(es) to .msh format (Standard or MESH08). Handles vertex data, UVs, weights, and bone assignments. The export ignores any collection with the name "bone shapes". BN (Skeleton) Import: Imports .bn skeleton files. Reads bone hierarchy and rest poses. Automatically looks for the corresponding .bbx file (must be same name, same folder) to get the proper skeleton name. Creates Blender Armature objects. Also imports custom bone shape geometry if defined in the BN file and creates mesh objects for them, assigning them as custom shapes in Blender. BN (Skeleton) Export: Exports a selected Blender Armature to .bn format. Calculates and exports the corresponding .bbx file with skeleton name and bounding box. Exports custom bone shape geometry if assigned. ANI (Animation) Import: Imports .ani animation files. Applies animations to compatible Armatures and/or Objects based on names found in the ANI file. Creates Blender Actions. Option to target selected objects or objects within a collection matching the ANI's base name. ANI (Animation) Export: Exports Blender Actions to .ani format. Bakes complex animations (constraints, drivers, NLA) before export. Options to export the active action, actions from selected objects, actions from the active collection, or all scene actions. BSP (Map) Import: Imports .bsp map geometry. Reads associated .r3m (materials), .r3t (textures), and .ebp (entities, collision) files (must be same base name, same folder). Locates entity assets by parsing .rpk archives found in ../Entity/ relative to the BSP's directory. Instantiates map geometry, materials (replicating many R3M effects), and R3E entities. Includes an option to import and display LDR lightmaps from Lgt.r3t files. 
There is also an option for creating a visualization of the actual BSP structure of the map by creating boxes with the nodes' dimensions and leaves with the appropriate geometry, however this will most certainly make the Blender scene run very slow (this option is not necessary to see the actual map at all if that's what you want). R3E (Entity) Import: Imports .r3e files together with their associated .r3m and .r3t files. Also imports animations, if present. Installation: Download the repository as a .zip file. Or simply download the embed file here. In Blender, go to Edit > Preferences > Add-ons. Click Install... and select the downloaded .zip file. Enable the "RF Online importer/exporter" addon by checking the box next to it. Dependencies (only necessary if you want to manually try the BSP export option) DDS Export (.bsp): Exporting BSPs requires ImageMagick to be installed and accessible in your system's PATH. The addon uses it to convert textures to DDS format. Download from: https://imagemagick.org/script/download.php Important: During installation, ensure you check the option to "Install legacy utilities (e.g., convert)" as the addon uses the magick convert command. How to Use: Import: Find the RF Online importers under File > Import > ... (MSH, BN Skeleton, ANI, BSP, R3E). Export: Find the RF Online exporters under File > Export > ... (MSH, BN Skeleton, ANI). Operator Options: Each operator has options. Pay attention to options like: MSH Export: Mesh Format to Export (Standard/MESH08), Collection Type to Export. ANI Import: Apply to Selected Objects, Ignore Not Found Objects. ANI Export: Action(s) to Export. BN Export: Export only selected. Debug options are available for troubleshooting. If turned on, open Blender's console to see the messages. Expected File Structure & Naming Conventions The addon relies on specific file names and relative folder locations to find associated assets: BSP Import (map.bsp): Needs map.r3m, map.r3t, mapLgt.r3t (optional), map.ebp in the same folder. Needs entity RPK archives (e.g., entity.rpk, monster.rpk) located in ../Entity/ relative to the map.bsp folder. The addon parses these RPKs to find the .r3e, .r3m, .r3t, etc., files for map entities. MSH Import (mesh.msh): Will look for texture paths defined within the MSH. If not found directly, it attempts to find textures in .rfs archives located in ../Tex/ relative to the .msh file's folder. BN Import (skeleton.bn): Needs skeleton.bbx in the same folder to read the proper skeleton name and overall bounding box. Export Naming:MSH Export: Selected Objects: Uses the filename you provide in the export dialog (e.g., my_export.msh). Active Collection / All Collections: Uses the collection name as the base filename in the selected directory (e.g., exporting a collection named "Props" to D:/Exports/ results in D:/Exports/Props.msh). Any collection named "bone shapes" is ignored and not exported when present. This is done to prevent the exportation of bone shapes as new .msh files. BN Export: Similar to MSH Export (uses selected armature name or collection name). Writes both .bn and .bbx files (e.g., skeleton.bn, skeleton.bbx). ANI Export: Uses the Blender Action name as the filename in the selected directory (e.g., an action named "Walk_Cycle" exports as Walk_Cycle.ani). Current Limitations / Disclaimer: BSP Export is DISABLED: While the addon includes the code for that, the operator to export a full .bsp map (including geometry, materials, entities, and baked lightmaps) is currently disabled in this release. 
BSP export is extremely complex, and this feature is incomplete. Performance: Importing very large maps or exporting complex scenes may take time due to Python processing. You can see the import progress if you've opened Blender's console before importing a map. R3M Effects: While many material effects are replicated using shader nodes, perfect 1:1 visual parity with the original D3D8 fixed-function pipeline can be challenging. The MSH exporter does not export effects currently. Download Link: https://github.com/Cardboard-box-a/cbb-rf-online-addon (download the repo as a zip), or the file embedded here. Bug Reports/Suggestions: [The GitHub Issues page might be more suitable for keeping track of possible issues.] Overall, the import part of the addon expects that you are using it to import files from a real game client, with the original folder structure. Meshes, for example, can be imported without their associated textures if the original folder structure is not present. The .MSH exporter automatically splits meshes that have more than 65k vertices, which has been verified with the .msh importer itself, but feedback from actual in-game use is welcome. Uploaded in this post is a zip containing ImHex patterns for some of the file formats I've worked on. Hopefully this addon will prove useful for creating custom content for such an old game, or at least satisfy your curiosity about what the game looks like behind the curtain.
Patterns.zip
1 point
-
Why don't you just connect the vertices in a 3D editor? The plugin opens the models; that's all you need. The rest you can work on yourself.
1 point
-
Bumping this - if anyone would be an absolute unit and solve the animations, it would be greatly appreciated! 🙃
1 point
-
My plugin for vfs works with your file.
EDIT: and I made a preview plugin for *.sggr: fmt_sggr.py
(*.pvr is an image; use PVRTexTool)
1 point
-
The WAVE files just use Xbox ADPCM (not that obscure), and you can play and convert them with Foobar2000 + vgmstream (note: some files don't contain audio). You don't really need to do anything else.
1 point
-
Maybe you should open a new thread and ask in there, not here, because this thread is for discussing the motion file.
1 point
-
https://github.com/h-kidd/noesis-project-diva
AFAIK this uses the same (or a closely related, Virtua Fighter 5-based) engine as other arcade games such as Project DIVA Arcade or Fate/Grand Order, so I guess the animation format would be relevant (and I hope this is helpful for REing).
1 point
-
Hello everyone! So I'm back, and I'm taking some time to explain how I got things working. Who knows, it may give some insights? (I'll try to update this post later to make a cleaner tutorial. For now I'm attaching the .rar with the updated .exe and the extra files to get things running the easy way; I'll update my git a bit later to create a fork of the original source code.) So... let's begin!
___
➡️ The Easy Path:
1. Download MK11PackageExtractor.rar.
2. Extract the contents into your MK11 folder (from Steam or wherever; it's just the base: DriveLetter:\yourpathtothegame\Mortal Kombat 11\).
3. Drag the .xxx files you want out of the assets folder - copy them, it's better so your game can still run XD - and drop them at the same level as MK11PackageExtractor.exe. Click decompress.bat and you can check the extraction in the created "output" folder under "NameOfExtractedPackage". (It's a loop that fetches your .xxx files.) Careful: it only works for XXX files. The source code implements the feature for PSF too, but I don't know why PSF isn't working. It doesn't matter, though, because PSF files are not compressed like .XXX, so Ravioli Game Tools can do the job without problems.
4. In "NameOfExtractedPackage" you may find a ".upk" at its base. There, you can use Ravioli Game Extractor to check the information within and perform a clean extraction of the files. Most of the time the sound format is .wem, so unless you use vgmstream in foobar2000, you'll have to convert the files to listen to them.
____
➡️ The Dev Path (long-term solution): I haven't pushed the code to GitHub yet, so forget the part about my updated code for now; here I'm sharing what you need in order to compile it yourself.
To get the code working, get your dev tool (I'm using VS Code - it's free).
✅ 1. Download MinGW-w64 (I think the latest version) to get the compiler working.
✅ 2. Since the old IntelliSense extension is deprecated, use the latest one available (at the time of writing, it's the "C/C++" extension).
✅ 3. Clone the original git repo (the one that belongs to thethiny), or mine if you want the modifications used to make the code work.
✅ 4. Three changes:
↪️ You'll need to add library includes in some headers: cstdint for most, fstream for a few.
↪️ In MK11file.h, I added the XXX hex code to handle XXX files: PFS = 0x0008u, XXX = 0x0040u, OODLE = 0x0100u.
↪️ In extract.cpp, modify the algorithm so the compression flag for .xxx files can ALSO be applied while using Oodle. Why? Because otherwise .xxx files can't be checked with Oodle compression; in other words, it won't work and you'll get an error. With so little documentation I had to figure things out by myself - it all comes down to a hex code for .xxx files not being taken into account, which is strange when you think about it, because the original program should check the .xxx files 😄
⏹️ 5. I added a quick text log to MK11file.cpp to compare file information in ::validate_header.
✅ 6. In tasks.json you'll need to set up the compiler path so the file can be built. I'm really not an expert in this part, so I'm just focusing on producing MK11PackageExtractor.exe (rename the output file if you like). This is the part where most people will prefer to just take my tasks.json if they want something working quickly.
Here is maybe the most important part:
"tasks": [
    {
        "type": "shell",
        "label": "C/C++: g++.exe build active file",
        "command": "Driver:\\pathtomingw64\\msys64\\ucrt64\\bin\\g++.exe", // replace Driver:\\pathtomingw64 with your MinGW-w64 folder
        "args": [
            "-g",
            "${workspaceFolder}/src/extract.cpp",
            "${workspaceFolder}/src/implementations/*.cpp",
            "-o",
            "MK11PackageExtractor.exe",
            "-std=c++11",
            "-I",
            "${workspaceFolder}/src/headers"
        ],
        "options": {
            "cwd": "${workspaceFolder}"
        },
        "problemMatcher": [
            "$gcc"
        ],
        "group": "build"
    },
✅ 7. Use g++, not gcc (there are compilation errors with gcc).
✅ 8. For Visual Studio Code, if you want a faster compile: Ctrl + Shift + B while targeting main.cpp.
Once again, I'm a total newb in C++ - I'm not even a senior developer, I just know how to tinker with some stuff and that's it. 😄 Oh, and I forgot: the dev didn't ship the .dll files required for the .exe to work with .xxx. I had to fetch them, but you're lucky - they're part of the .rar, so you don't have to do what I did 🙂 Happy extracting! 😄
PS: I'm not a native English speaker, forgive my clumsy way of explaining the technical stuff.
🔲 PS2: Edit, because I like to manage several projects at the same time: pushed a mod on Nexus for Mass Effect (I think I'd rather stick to fewer sites than too many, and Nexus sits a bit above GitHub for this), made new music extracts + edits, and I'm creating fly-camera scripts... I still haven't pushed the code modifications to GitHub for MK11 😅 I'll make a fork of the original code, then put up all my modifications + a clean release when possible!
MK11PackageExtractor.rar
1 point
-
This thread is about the audio extraction tools for the legacy Dead Space trilogy (Dead Space, Dead Space 2, Dead Space 3). All of the tools were downloaded from Xentax years back, so credit goes to the original makers of the tools. I just want to preserve them in a single place. I don't recall from memory anymore what all of these file formats were, so I'm probably not much help with the usage. I'm just pasting links to the tools I had uploaded to my Mediafire account in 2018. However, what I do remember is that some of the tools that supposedly worked with two games were acting up a bit, so just in case I made separate versions for each game. Dead Space 1 definitely has its own file formats and tools that don't work on Dead Space 2 and 3, and vice versa. I believe the SBK unpacker works for all of the games, but I'm not 100% sure. The Exa unpacker was for Dead Space 2 specifically and EALayer for Dead Space 3, but I'm not sure if they could be modified so that a single tool works with both games. To be honest, if I were more knowledgeable, I'd just make one megatool with a proper UI that can open and extract from all of the games, since having these billion exe files is frustrating.
===============================
Universal .STR and DS2-3 BigFile formats
RickVisceral's BigViewer (note that the BigFile extractor has its own UI and the STR file opener is included in the same folder; I'm pretty sure the STR file tool works with DS1, since I don't have any separate tool for that file format with a DS1 label).
https://www.mediafire.com/file/vmgh564ita25wqz/RickVisceral110423.zip/file
Dead Space (2008)
.SNU to .WAV
https://www.mediafire.com/file/tnisaj3elv77ajy/ds1_.snu_to_wav.rar/file
XAS decode
https://www.mediafire.com/file/tt61elv4u4sr0ca/ds1_xas_decode.rar/file
Dead Space 2 (2011)
SBK Unpacker
https://www.mediafire.com/file/fdr8f6y5mpxf9gt/SBK_unpacker_DS1-3.rar/file
EXA to MP3
https://www.mediafire.com/file/240allqyd7a6eck/DS2_EXA_to_mp3.zip/file
Dead Space 3 (2013)
EALayer (same as EXA to MP3 if I recall, but for Dead Space 3 only)
https://www.mediafire.com/file/gg10lwpe0i6blla/ds3_ealayer.rar/file
If I recall, some audio files can't be extracted for some reason; I think it was because they use console audio formats. For one, I recall that the Dead Space 3 audiostreams folder was missing quite a few music files when I was uploading the whole soundtrack to YouTube some 6 years back, so I had to resort to recording in-game. Also, the NPC chatter sounds come in multiple languages, so if you want the English version, you need to pick the right one from the numbered stack. So don't be alarmed if you think you're missing some final audio files.
Here are all of the 3D model tools:
Edit: Here are all the tools linked to this forum instead of Mediafire:
Gibbed Visceral viewer (DS2-3 archive unpacker, DS1-3 .STR unpacker)
Visceral Viewer DS2-3 Parsed.rar
Here is the uncompiled version of the Visceral viewer. I don't see the Dead Space 3 file list, so the version I originally received from Xentax, which I have attached, is much more up to date.
https://github.com/gibbed/Gibbed.Visceral
DS2-3 EALayer (exa / snu to MP3)
ds2 ealayer.rar
ds3 ealayer.rar
DS1 SNU to WAV
ds1 towav snu.rar
DS1-3 SBK unpack (the DS2 version works with DS1 as I recall)
ds2-3 sbk unpack.rar
DS1 XAS decode (exa to wav or mp3)
ds1 xas_decode.rar
Credits go to all the original authors of these tools; I am merely reuploading them for preservation purposes and take credit only for that.
1 point