Leaderboard
Popular Content
Showing content with the highest reputation since 10/26/2025 in all areas
-
Version 1.0.0
17 downloads
Broken Sword - Shadow of the Templars (1996)
* bs1_1996_clu_export.py
* bs1_1996_clu_import_and_patch.py
Required: Python. When installing, make sure to check “Add python.exe to PATH.”
Usage:
* Copy the swordres.rif and text.clu files into the same folder as the .py files.
* Run bs1_1996_clu_export.py: this will extract the texts into Text_exported.txt.
* Translate it, then rename the finished file to Text_translated.txt.
* Run bs1_1996_clu_import_and_patch.py: this will insert the translated texts back into text.clu and modify swordres.rif.
* The new files will be created with the _new suffix.
Tested with GoG (2.0.0.8) and Steam Reforged Free DLC.
FEARka
3 points
-
Thanks to some info from here, I made a tool for unpacking and packing localized map files, if someone is interested in it. https://github.com/dest1yo/wwm_utils
2 points
-
Version 1.1
499 downloads
Tools for the Battlefield 6 beta. Currently supports dumping the game and exporting models/maps. Usage is similar to previous tools for the Frostbite engine.

toc_bf6.exe - dump tool
Change the .ini file parameters:
- game path
- dump path
- selection to dump "ebx", "res", "chunks" or "all"
Then drop any .toc file onto the .exe to dump assets, or run it from the command line with one parameter - the toc file name.

Fb_bf6_mesh.exe - model tool
Takes a .MeshSet as parameter. ske_soldier_3p.ebx is the main universal skeleton for soldiers and must be in the same folder. If you need another skeleton, use its name as a 2nd parameter, or rename it to ske_soldier_3p.ebx. The tool will try to find chunks automatically; if it can't, it gives an error message with the chunk name.

Map export
1. Create database
Run the fb_maps_bf6_db.exe tool once; it will scan the whole dump for meshsets and blueprints, so later maps can be converted fast, without the need to go into the whole tree of assets. This will take a few minutes. After that, 2 files will be created, bp.db & meshnames.txt, which need to stay in the same folder as the EXE for the main tool to work.
2. Export maps
Use fb_maps_bf6.exe (the main map tool) to convert maps. Drop any EBX on it, use it in the command line with one parameter, or create a batch.
3. Terrain export
The main terrain data is in the .TerrainStreamingTree files for each level. For some levels these files are small, which means the actual data is in chunks. Sometimes the data is in the file itself; in that case it may be big, about 50 MB in size. Drop a .TerrainStreamingTree on fb_terrain_bf6.exe or use the command line.
2 points
-
The game got an update, and they hard-coded the new text in the .mpk lua scripts, because some words have many different meanings depending on the context. With packet sniffing, I observed that the game downloads some .pak files from easebar.com and puts them in the .mpk file. These files are encrypted.
1 point
-
I made a Blender addon to import models, textures and animations for Dolphin Wave and other games that use the same engine. It can import lzs and lza files as-is; you don't need to decrypt or decompress the files. https://github.com/Al-Hydra/blenderBUM
1 point
-
Here's my analysis:
Header: 24 bytes: [
  Int64 EntryCount
  Int64 ValueCount
  Int32 Timestamp
  Int32 Padding
]
Buckets: [24-528] bytes, based on the allocated buckets
TableEntries: EntryCount * [
  8 bytes Hash (or id?)
  Int32 RelativeOffset (formula: text_start = current_entry_offset + 8 + value)
  Int32 TextLength
]
Values: ValueCount * [
  Byte[ValueLength] Data
]
Null values have zero length and no hash. I successfully unpacked and repacked it, and the game loads the new text normally.
1 point
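For reference, here is a minimal Python sketch of reading that layout, assuming little-endian fields; the bucket table size varies per file (24-528 bytes), so it is passed in as a parameter here, and the hash interpretation is an assumption:

import struct

def read_entries(path, bucket_bytes):
    # Header: Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding
    with open(path, "rb") as f:
        data = f.read()
    entry_count, value_count, timestamp, _pad = struct.unpack_from("<qqii", data, 0)
    pos = 24 + bucket_bytes                  # skip the header and the bucket table
    entries = []
    for _ in range(entry_count):
        hash_or_id, rel_off, text_len = struct.unpack_from("<Qii", data, pos)
        text_start = pos + 8 + rel_off       # formula: current_entry_offset + 8 + value
        entries.append((hash_or_id, data[text_start:text_start + text_len]))
        pos += 16                            # 8-byte hash + two Int32 fields
    return entries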
-
When I get home, I will compile the decompressor/compressor unpack and pack tool - it's an all-in-one tool.

#include <algorithm>
#include <cstdint>
#include <vector>

std::vector<uint8_t> compressLZSSBlock(const std::vector<uint8_t>& input) {
    const int MIN_MATCH = 3;   // minimum length for a match to become a pair
    const int MAX_MATCH = 17;  // (0xF + 2)
    const int DICT_SIZE = 4096;
    const size_t n = input.size();

    // Dictionary identical to the decompressor's
    std::vector<uint8_t> dict_buf(DICT_SIZE, 0);
    size_t dict_index = 1; // same initial index as the decompressor

    size_t producedBytes = 0; // how many bytes have been "generated" so far (logical output)

    std::vector<uint32_t> flagWords;
    uint32_t curFlag = 0;
    int bitsUsed = 0;

    auto pushFlagBit = [&](bool isLiteral) {
        if (bitsUsed == 32) {
            flagWords.push_back(curFlag);
            curFlag = 0;
            bitsUsed = 0;
        }
        if (isLiteral) {
            // bit 1 = literal (same meaning as in the decompressor)
            curFlag |= (1u << (31 - bitsUsed));
        }
        ++bitsUsed;
    };

    std::vector<uint8_t> literals;
    std::vector<uint8_t> pairs;
    literals.reserve(n);
    pairs.reserve(n / 2 + 16);

    size_t pos = 0;
    while (pos < n) {
        size_t bestLen = 0;
        uint16_t bestOffset = 0;

        if (producedBytes > 0) {
            // maximum possible length for this match (must not run past the end of the input)
            const size_t maxMatchGlobal = std::min(static_cast<size_t>(MAX_MATCH), n - pos);

            // walk every possible dictionary offset
            for (int off = 1; off < DICT_SIZE; ++off) {
                if (dict_buf[off] != input[pos]) continue;

                // --- DYNAMIC SIMULATION OF THE DECOMPRESSOR FOR THIS OFFSET ---
                uint8_t candidateBytes[MAX_MATCH];
                size_t candidateLen = 0;

                for (size_t l = 0; l < maxMatchGlobal; ++l) {
                    const int src_index = (off + static_cast<int>(l)) & 0x0FFF;

                    // value at src_index, taking into account that this very block
                    // may overwrite dictionary positions (overlap)
                    uint8_t b = dict_buf[src_index];

                    // If src_index equals one of the write indices of this SAME pair
                    // (dict_index + j), use the already "generated" byte candidateBytes[j]
                    for (size_t j = 0; j < l; ++j) {
                        const int dest_index = (static_cast<int>(dict_index) + static_cast<int>(j)) & 0x0FFF;
                        if (dest_index == src_index) {
                            b = candidateBytes[j];
                            break;
                        }
                    }

                    if (b != input[pos + l]) {
                        // doesn't match the input, stop here
                        break;
                    }
                    candidateBytes[l] = b;
                    ++candidateLen;
                }

                if (candidateLen >= static_cast<size_t>(MIN_MATCH) && candidateLen > bestLen) {
                    bestLen = candidateLen;
                    bestOffset = static_cast<uint16_t>(off);
                    if (bestLen == static_cast<size_t>(MAX_MATCH)) break; // can't do any better
                }
            }
        }

        if (bestLen >= static_cast<size_t>(MIN_MATCH)) {
            // --- ENCODE AS AN (offset, length) PAIR ---
            pushFlagBit(false); // 0 = pair
            uint16_t lengthField = static_cast<uint16_t>(bestLen - 2); // 1..15
            uint16_t pairVal = static_cast<uint16_t>((bestOffset << 4) | (lengthField & 0x0F));
            pairs.push_back(static_cast<uint8_t>(pairVal & 0xFF));
            pairs.push_back(static_cast<uint8_t>((pairVal >> 8) & 0xFF));

            // Update the dictionary exactly like the DECOMPRESSOR does:
            // for (i = 0; i < length; ++i) {
            //     b = dict[(offset + i) & 0xFFF];
            //     out.push_back(b);
            //     dict[dict_index] = b;
            //     dict_index = (dict_index + 1) & 0xFFF;
            // }
            for (size_t i = 0; i < bestLen; ++i) {
                int src_index = (bestOffset + static_cast<uint16_t>(i)) & 0x0FFF;
                uint8_t b = dict_buf[src_index];
                dict_buf[dict_index] = b;
                dict_index = (dict_index + 1) & 0x0FFF;
            }

            pos += bestLen;
            producedBytes += bestLen;
        } else {
            // --- PLAIN LITERAL ---
            pushFlagBit(true); // 1 = literal
            uint8_t literal = input[pos];
            literals.push_back(literal);
            dict_buf[dict_index] = literal;
            dict_index = (dict_index + 1) & 0x0FFF;
            ++pos;
            ++producedBytes;
        }
    }

    // Terminator pair (offset == 0)
    pushFlagBit(false);
    pairs.push_back(0);
    pairs.push_back(0);

    // Flush the last flag word
    if (bitsUsed > 0) {
        flagWords.push_back(curFlag);
    }

    // Assemble the final block: [u32 off_literals][u32 off_pairs][flags...][literals...][pairs...]
    const size_t off_literals = 8 + flagWords.size() * 4;
    const size_t off_pairs = off_literals + literals.size();
    const size_t totalSize = off_pairs + pairs.size();

    std::vector<uint8_t> block(totalSize);
    auto write_u32_le = [&](size_t at, uint32_t v) {
        block[at + 0] = static_cast<uint8_t>(v & 0xFF);
        block[at + 1] = static_cast<uint8_t>((v >> 8) & 0xFF);
        block[at + 2] = static_cast<uint8_t>((v >> 16) & 0xFF);
        block[at + 3] = static_cast<uint8_t>((v >> 24) & 0xFF);
    };

    write_u32_le(0, static_cast<uint32_t>(off_literals));
    write_u32_le(4, static_cast<uint32_t>(off_pairs));

    size_t p = 8;
    for (uint32_t w : flagWords) {
        block[p + 0] = static_cast<uint8_t>(w & 0xFF);
        block[p + 1] = static_cast<uint8_t>((w >> 8) & 0xFF);
        block[p + 2] = static_cast<uint8_t>((w >> 16) & 0xFF);
        block[p + 3] = static_cast<uint8_t>((w >> 24) & 0xFF);
        p += 4;
    }

    std::copy(literals.begin(), literals.end(), block.begin() + off_literals);
    std::copy(pairs.begin(), pairs.end(), block.begin() + off_pairs);
    return block;
}

@morrigan my compressor, try it, and let me know the results.
1 point
-
Yea, I'm working on BHD but mostly focused on the JO/DFX2 engine, which is slightly newer and a different format. I'll post here when/if I get BHD usable.
1 point
-
.ilv.txth:
codec = PSX
channels = 2
sample_rate = 44100
interleave = 0x4000
num_samples = data_size
1 point
-
You can either use this QuickBMS script to extract the msv audio files out of the rp2:

get UNK long
get FILES long
goto 0x20
for i = 0 < FILES
    getdstring NAME 7
    getdstring DUMMY 25
    get OFFSET long
    get SIZE long
    get DUMMY2 long
    string NAME + ".msv"
    log NAME OFFSET SIZE
next i

Or you can use this txth file to play the audio out of the rp2 directly (needs vgmstream + an audio player like foobar2000):

subsong_count = @0x04
subsong_spacing = 0x2c
base_offset = 0x20
name_offset = 0x00
subfile_offset = @0x20
subfile_size = @0x24
subfile_extension = msv

Save the text above as ".rp2.txth" and put it in the same directory as the rp2 file. Also, if you're using foobar2000, make sure to check "Enable unknown exts" on the vgmstream preferences page.
1 point
-
Well, I did a little research on Flash Cookies (SOL files) and put it all together in an article on the RE Wiki: https://rewiki.miraheze.org/wiki/Flash_Cookie_SOL I saw the notes on your github, and you were slightly wrong with some fields, so you can compare them with my article on the wiki and make some corrections in your tool. The most important thing to understand is that the SOL file is an Adobe format, and the payload (data block) follows the AMF file format documented by Adobe: https://web.archive.org/web/20220122035930/https://www.adobe.com/content/dam/acom/en/devnet/pdf/amf-file-format-spec.pdf So anything after the data block header is a payload section that needs to be properly deserialized by your tool. There are many tools that handle this serialization properly, like minerva, SOL Editor, Adobe AIR SDK, JPEXS Free Flash Decompiler, etc. Some code for serialization is available on the JPEXS github page: https://github.com/jindrapetrik/jpexs-decompiler/tree/master/libsrc/ffdec_lib/src/com/jpexs/decompiler/flash/sol https://github.com/jindrapetrik/jpexs-decompiler/tree/master/libsrc/ffdec_lib/src/com/jpexs/decompiler/flash/amf/amf3 You can test this code by going to Tools > Sol cookie editor in JPEXS Free Flash Decompiler. So you shouldn't ask "what are those three bytes"; you should ask "how can I properly parse AMF3 serialized data" 🙂 There is a lot of information (articles) about this, for example on Wikipedia: https://en.wikipedia.org/wiki/Local_shared_object https://en.wikipedia.org/wiki/Action_Message_Format Good luck. 🙂
1 point
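For anyone who wants to get started without a full library, here is a minimal Python sketch of the common SOL header layout (big-endian magic 0x00BF, a length field, the "TCSO" tag, six reserved bytes, the cookie name, and an AMF version marker); everything after it is the AMF0/AMF3 payload that still needs a proper deserializer:

import struct

def read_sol_header(path):
    with open(path, "rb") as f:
        data = f.read()
    magic, body_len = struct.unpack_from(">HI", data, 0)       # 0x00BF, file size - 6
    assert magic == 0x00BF and data[6:10] == b"TCSO"
    name_len = struct.unpack_from(">H", data, 16)[0]           # after 6 reserved bytes
    name = data[18:18 + name_len].decode("utf-8")
    amf_version = struct.unpack_from(">I", data, 18 + name_len)[0]  # 0 = AMF0, 3 = AMF3
    return name, amf_version, data[22 + name_len:]             # remainder = AMF payload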
-
What I'm trying to do is create a new texture and add an alpha channel while hex editing. The zip file I provided is mostly for demonstration purposes. I'm trying to edit NINJA_FACE_DAMAGE_000, NINJA_FACE_DAMAGE_001, and NINJA_FACE_DAMAGE_002. My biggest problem is that the textures I'm trying to add an alpha channel to are different from the original textures and are swizzled differently, so the alpha channel will be different. I am aware the image data always starts at 592 (0x250), and the palette data offset differs depending on the size of the texture: 64x64 (0x12A0), 128x128 (0x42A0), 256x256 (0x102D0). Anyway, I was able to use ImageHeat to get an alpha channel from the original textures with PAL8 pixels and an RGBA8888 palette and exported it. I was also able to use the ReverseBox Demo 2 to export and import the original textures, just to get familiar with it. These are just a few textures that I'm trying to insert into the game along with the alpha channels. MKD Texture Edit.zip
1 point
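A tiny Python sketch of that layout, using the offsets from this post; treating the pixel block as exactly size*size PAL8 indices is an assumption for illustration:

IMAGE_DATA_OFFSET = 0x250
PALETTE_OFFSET = {64: 0x12A0, 128: 0x42A0, 256: 0x102D0}  # square texture sizes

def split_texture(data: bytes, size: int):
    pixels = data[IMAGE_DATA_OFFSET:IMAGE_DATA_OFFSET + size * size]     # PAL8 indices (assumed length)
    palette = data[PALETTE_OFFSET[size]:PALETTE_OFFSET[size] + 256 * 4]  # RGBA8888, 256 entries
    return pixels, palette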
-
Here's a sample model for one of the enemies in the game. Notice that "*_div.msb" can't be viewed/exported properly for some reason, but the base one did just fine. PSVitaSample.zip
1 point
-
fmt_psaVita_ValkyrieDrive.py Here's an old Noesis plugin to view and export most of the mib, msb and mab files of the PS Vita version of the game.
1 point
-
Edit - just tested it, and no. 4 of the MRTs is UV; you were right in saying the 4th one is the UV map, by the rule.
1 point
-
I have one as an example. I noticed a pattern in this section after running xxd with groups of 12. This is the pinky intermediate joint. It is known that the pinky intermediate joint has one degree of freedom, meaning that movements in the other two DoF should be minimal. The second set of each float is stable. 16-bit floats, likely little endian, meaning that [3f], [00], [ff], etc. are the major bits. Given the ffs, I do not think it is big endian with an offset. I have attached the file in question so you can look yourself.
EDIT: These are signed LE numbers. Circular angular floats (not IEEE 754 standard), so ffff/0000, 3fff/4000, 7fff/8000, bfff/c000 are each 90 degrees apart.
EDIT 2: It could also be a LUT. But then I checked for any tables and couldn't find anything useful.
EDIT 3: I have no clue anymore. These are proprietary obfuscated numbers using some cryptic format, and if anyone knows how to decode them it would be absolutely amazing. SVT_0015_S01_ATK_A01.zip
1 point
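If the circular-angle reading from the first edit is right, decoding is just a turn fraction. This little Python sketch is an assumption based on that edit (0x4000 = 90 degrees, 0x8000 = 180, 0xC000 = 270), not a confirmed decoding:

import struct

def decode_angles(raw: bytes):
    vals = struct.unpack("<%dH" % (len(raw) // 2), raw)  # little-endian 16-bit values
    return [v / 65536.0 * 360.0 for v in vals]           # degrees, wrapping at 360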
-
I tried using the head models and tested two of them (DDP and Rovyal). I was able to get the models to work. From my understanding, all of the heads share the same offsets and parameters, so if you can get one model to construct with no weird faces, you can use the same parameters for the other models. I'm just having an issue with the ears, as you can see in the photo below. If you could share your parameters and offsets, I think it would be very helpful.
Update 1: Got it working; however, the UVs are not working, but you can transfer them using the Copy Vertex Order plugin in Blender.
Update 2: Got the UVs to work for the model using these params, but I still have to go into Blender, UV unwrap, and then copy the UV maps to the main model. Is there a way I can get the UV maps and the model all in one in hex2obj? Sorry if this is a rookie question, since I've never used hex2obj before.
Final update: I just have to save the mesh from the hex2obj settings and the UVs will be added to the obj. Thanks for your help. Hex2Obj
1 point
-
Well, use my old C++ tool; it should work now. I tried to rewrite it in Python to practice Python syntax, and maybe I did something wrong. ZstdMagicExtractor.zip ZstdMagicExtractor-release version.zip
1 point
-
@ikskoks Thank you for this script, I was also able to use it successfully. I had a question about the outputted `.kiw` files: there is bytecode in each file that corresponds to in-game instructions like character dialogue expressions, choices, background images and music cues. What is the best way to reverse engineer the bytecode in these files? I am not sure which game engine made this, but are there any existing parsers or scripts for `.kiw` or similar that you know of?
1 point
-
Because the fmlb and sound files don't exist anymore: before the game shut down, those files were dynamic content. Some files like that are available in the beta version APK and OBB, but not all of them.
1 point
-
I have released an early version of the tool that can do just meshes with their material names/skeleton:
1 point
-
You can use my latest Fmod Bank Tools - https://www.nexusmods.com/rugbyleaguelive3/mods/2 https://github.com/Wouldubeinta/Fmod-Bank-Tools Just PM me for the bank password.
1 point
-
I updated my plugin: fmt_DS2_PS3_geo.py (The UVs file must be located in the same folder as the model, either in the "MeshVolatile" subfolder or next to the model.)
1 point
-
(Just my old plugin that was made from example files in another thread.) fmt_kn5.py (Doesn't support encryption.)
1 point
-
Please use this updated script to repackage the data file. If you have any questions, please let me know, so that other capable people (or you yourself) can continue to process these .pxc files.

# Update the decompression of pxc files (script 0.2)
get FILE_SIZE asize
xmath TOC_PTR "FILE_SIZE - 8"
goto TOC_PTR
get TOC_OFFSET long
goto TOC_OFFSET
get FILE_COUNT long
for i = 0 < FILE_COUNT
    get OFFSET long
    get SIZE long
    get COMP_FLAG byte
    get NAME_LEN short
    getdstring NAME NAME_LEN
    get UNK long
    savepos TOC_ENTRY_POS
    if COMP_FLAG == 0
        goto OFFSET
        getdstring MAGIC 4
        if MAGIC == "PxZP"
            comtype zlib
            get UNCOMP_SIZE long
            get COMP_SIZE long
            savepos DATA_START
            clog NAME DATA_START COMP_SIZE UNCOMP_SIZE
        else
            log NAME OFFSET SIZE
        endif
    else
        goto OFFSET
        get MAGIC long
        get UNCOMP_SIZE long
        get COMP_SIZE long
        savepos COMP_START
        clog NAME COMP_START COMP_SIZE UNCOMP_SIZE
    endif
    goto TOC_ENTRY_POS
next i

pxc.zip
1 point
-
Bumping this; if anyone would be an absolute unit and solve the animations, it would be greatly appreciated! 🙃
1 point
-
Does anyone know more about the mcd format used by EA? Models are packed inside the rsf format, and it is possible to extract them with the rsf.bms script. I managed to make a Noesis script to view the models, but reading the skeletons is hard for me and I don't know how to proceed with them. Vertex weights are stored in a table that starts with the tag TIEW in the mcd, and skeleton data in FRGS in the .skel file. I would appreciate any kind of help with importing the skeletons into Noesis. Preview in Noesis: https://imgur.com/a/o2dQvCl fmt_NHL21_mcd.py rsf.bms.zip nhl21model.zip
1 point
-
My plugin for vfs works with your file. EDIT: and I made a preview plugin for *.sggr: fmt_sggr.py (*.pvr is an image; use PVRTexTool)
1 point
-
It seems the game doesn't accept different zlib levels. Maybe try using levels 0-9: with level 9, the compressed file will be the same as the original! https://drive.google.com/file/d/11rON0JaDswJCQJ-RBF2USKErQRtPbP_I/view?usp=sharing (and maybe a solution post);
1 point
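A minimal Python sketch of that idea, assuming plain zlib streams with no custom framing: decompress the original, recompress at level 9, and verify the round trip:

import zlib

def recompress(comp: bytes) -> bytes:
    raw = zlib.decompress(comp)
    out = zlib.compress(raw, 9)          # level 9, as suggested above
    assert zlib.decompress(out) == raw   # sanity-check the round trip
    return out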
-
Did you ever figure out the animation format? I'd love to get access to the animations for some stuff, but of course, the MOT files are formatted differently 😔
1 point
-
Here's the DeCompressor update; it now works with every file. zlib_DeCompressor.py
1 point
-
Mostly, the PNG files are stored decompressed, so it's fairly easy to edit and reimport them. However, you need to compress your PNG files using the site iLoveIMG. For example, if the original PNG is 10 KB, your PNG must be 10 KB or less, so you will need to compress it on the site. I've attached 3 files:
Two BMS scripts: one script will unpack the data, decompressing all files. The other script will unpack the file without decompressing it (this is the one you should use for reimporting; reimport with -r, not reimport2, in QuickBMS).
A Python script (.py): this program decompresses and compresses zlib files individually. I have set a compression level to reduce the file size even further. Use this if you need to handle compressed files. zlib_DeCompressor.py BMS.ZIP
1 point
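As a sketch of the size constraint described above (names are illustrative, not from the attached scripts): recompress at the highest zlib level and refuse anything still larger than the original slot:

import zlib

def compress_to_fit(raw: bytes, max_size: int) -> bytes:
    out = zlib.compress(raw, 9)          # highest level = smallest output
    if len(out) > max_size:
        raise ValueError("still %d bytes too big; shrink the PNG first" % (len(out) - max_size))
    return out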
-
Maybe I didn't quite understand your goal, but you can't just take files from the game and compile them back into the game. Edit: ah, I read a few posts above; you need to compile Asset Studio, not the game files.
1 point
-
I'm still not sure how to run it; unless I'm doing something wrong, it doesn't seem to be working.
1 point
-
Yeah, I guess so. But I think we need to find something to differentiate between rotation data and transform data, and that's hard to find, so we need some reverse-engineering work.
1 point
-
Has anyone managed to extract the trading card images? I tried using the script for the 3D models, but it just doesn't work.
1 point