Leaderboard
Popular Content
Showing content with the highest reputation since 11/01/2025 in all areas
-
Version 1.0.0
17 downloads
Broken Sword - Shadow of the Templars (1996)
* bs1_1996_clu_export.py
* bs1_1996_clu_import_and_patch.py
Required: Python. When installing, make sure to check “Add python.exe to PATH.”
Usage:
* Copy the swordres.rif and text.clu files into the same folder as the .py files.
* Run bs1_1996_clu_export.py: this will extract the texts into Text_exported.txt.
* Translate it, then rename the finished file to Text_translated.txt.
* Run bs1_1996_clu_import_and_patch.py: this will insert the translated texts back into text.clu and modify swordres.rif.
* The new files will be created with the _new suffix.
Tested with GoG (2.0.0.8) and the Steam Reforged Free DLC.
FEARka
3 points
-
I've just released a new version of ImageHeat 🙂
https://github.com/bartlomiejduda/ImageHeat/releases/tag/v0.39.1
Changelog:
- Added new Nintendo Switch unswizzle modes (2_16 and 4_16)
- Added support for PSP_DXT1/PSP_DXT3/PSP_DXT5/BGR5A3 pixel formats
- Fixed issue with unswizzling 4-bit GameCube/Wii textures
- Added support for hex offsets (thanks to @MrIkso)
- Moved image rendering logic to a new thread (thanks to @MrIkso)
- Added Ukrainian language (thanks to @MrIkso)
- Added support for LZ4 block decompression
- Added Brazilian Portuguese language (thanks to @lobonintendista)
- Fixed ALPHA_16X decoding
- Adjusted GRAY4/GRAY8 naming
- Added support section in the readme file
2 points
-
Thanks to some info from here, I made a tool for unpacking and packing localize map files, in case anyone is interested.
https://github.com/dest1yo/wwm_utils
2 points
-
Version 1.1
515 downloads
Tools for the Battlefield 6 beta. Currently supports dumping the game and exporting models/maps. Usage is similar to previous tools for the Frostbite engine.

toc_bf6.exe - dump tool
Change the .ini file parameters:
- game path
- dump path
- selection to dump "ebx", "res", "chunks" or "all"
Then drop any .toc file onto the .exe to dump assets, or run it from the command line with one parameter - the toc file name.

fb_bf6_mesh.exe - model tool
Takes a .MeshSet as parameter. ske_soldier_3p.ebx is the main universal skeleton for soldiers and must be in the same folder. If you need another skeleton, use its name as the 2nd parameter, or rename it to ske_soldier_3p.ebx. The tool will try to find chunks automatically; if it can't, it gives an error message with the chunk name.

Map export
1. Create a database. Run the fb_maps_bf6_db.exe tool once; it will scan the whole dump for meshsets and blueprints, so later maps can be converted fast, without the need to go through the whole tree of assets. This will take a few minutes. After that, two files will be created: bp.db and meshnames.txt, which need to stay in the same folder as the EXE for the main tool to work.
2. Export maps. Use fb_maps_bf6.exe (the main map tool) to convert maps. Drop any EBX onto it, use the command line with one parameter, or create a batch (see the sketch below).
3. Terrain export. The main terrain data is in the .TerrainStreamingTree files for each level. For some levels these files are small, which means the actual data is in chunks. Sometimes the data is in the file itself, in which case it may be big, about 50 MB in size. Drop a .TerrainStreamingTree onto fb_terrain_bf6.exe or use the command line.
2 points
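If you prefer to script step 2 instead of dropping files by hand, a minimal batch-conversion sketch in Python could look like the following. The folder paths are hypothetical; the only assumption about fb_maps_bf6.exe is what's stated above (it accepts a single EBX path as its one argument).

import subprocess
from pathlib import Path

# Hypothetical locations - point these at your own dump and tool folders.
DUMP_DIR = Path(r"C:\bf6_dump\levels")
MAP_TOOL = Path(r"C:\tools\fb_maps_bf6.exe")  # bp.db and meshnames.txt must sit next to it

# Feed every EBX under the dump folder to the map tool, one file at a time,
# mirroring the "drop any EBX on it / run with 1 parameter" usage described above.
for ebx in sorted(DUMP_DIR.rglob("*.ebx")):
    subprocess.run([str(MAP_TOOL), str(ebx)], check=False)
-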
ImageHeat v0.39.2 (HOTFIX)
https://github.com/bartlomiejduda/ImageHeat/releases/tag/v0.39.2
Changes:
- Fixed hex input
- Changed endianness and pixel format bindings (required for the hex input fix)
1 point
-
So here are the pixel formats for PS4:
PF_DXT5 = 7
PF_DXT1 = 13
PF_BC7U = 22
PF_UNKNOWN = 2 (not sure, but it can be RGBA)
1 point
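A tiny sketch of those IDs as a Python enum, in case anyone wants to drop them into a parser; the comment on PF_UNKNOWN just repeats the guess above.

from enum import IntEnum

# PS4 pixel-format IDs from the post above; PF_UNKNOWN = 2 is only suspected to be plain RGBA.
class PS4PixelFormat(IntEnum):
    PF_UNKNOWN = 2
    PF_DXT5 = 7
    PF_DXT1 = 13
    PF_BC7U = 22

def format_name(value: int) -> str:
    try:
        return PS4PixelFormat(value).name
    except ValueError:
        return f"unrecognized format id {value}"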
-
I remember making a request on your GitHub about it. 👍 Somehow, we were not able to see these textures in ImageHeat, only after extraction and decompression. Anyway, for the Switch textures there seems to be an issue, as h3x3r said above, and I can confirm it too. In the attachment you'll find all the textures from UNIFORM.TEX (including jersey-color) from the Switch version, already decompressed. The stock texture file is among the Switch files in the first post (UNIFORM.TEX). In the screenshot below you can see the parameters for the jersey-color texture. Maybe it will be useful when you have time to check it, to help you fix ImageHeat.
UNIFORM Switch decompressed.zip
1 point
-
The game got an update and they hard-coded new text in the .mpk Lua scripts, because some words have many different meanings depending on the context. With packet sniffing, I observed that the game downloads some .pak files from easebar.com and puts them in the .mpk file. These files are encrypted.
1 point
-
It's Unity, but it seems to have a protection layer, so it can't be opened in Asset Studio. Game Assembly: https://www.mediafire.com/file/3i7kvobi4nacnbh/GameAssembly.zip/file
THO.zip
1 point
-
I made a Blender addon to import models, textures and animations for Dolphin Wave and other games that use the same engine. It can import .lzs and .lza files as-is; you don't need to decrypt or decompress the files.
https://github.com/Al-Hydra/blenderBUM
1 point
-
Hello Ikskoks! Thanks for the solution. I had seen most of your links long ago, but because they say nothing about the "mysterious bytes" after the string "grid", I jumped to the conclusion that SOL files aren't documented enough. Crazy as it sounds, I wouldn't care about where the AMF format bytes are, as long as the script reproduces the SOL file format and works on the Flash game in question, which is what matters, like I did with DS-nitro-files-builder. Regardless, this is clearer now. So, there are SOL editors. My Python project is useless!
1 point
-
The ..var01.st2 file contains CSV data. Edit: it also contains XML data:

<!--
______________________________________________________________________________
Copyright 2004 The Collective, Inc.
DISMEMBERMENT DEFINITION
Character: Clone Trooper
Author: Baback Elmieh
Date: 01/07/2004
______________________________________________________________________________
-->

<!-- HEAD -->
<DismemberablePart Name="Head" Hitpoints="25">
    <!-- The Materials section is a list of materials in the original mesh that are to be turned off when the part is dismembered -->
    <!--
    <Materials>
        <Material Name="headSG"/>
    </Materials>
    -->
    <!-- ReactionProcessing defines the chunks and particles to be spawned when a reaction dismemberment is processed for the character
         the definition requires a Bone from which a chunk should be spawned and the name of the chunkmesh. The ChunkMesh definition in turn
         can have several values set such as GravityScale and UseGinFile. GravityScale greater than 1.0 pulls a chunk down faster, UseGinFile
         will look for a bounding box with the same name as the chunkmesh in the damage mesh's gin file, if the bound is found, it is used
         instead of the default rendering bound which can help artists orientate chunks so that they land on their correct side -->
    <ReactionProcessing>
        <!-- particles -->
        <Particles>
            <!-- spark particle from the joint -->
            <Param Name="BoneEmission" Value="neck_g">
                <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
            </Param>
        </Particles>
        <!-- chunks -->
        <!--
        <Chunk Typename="TSingleChunk">
            <Param Name="Bone" Value="neck_g">
                <Param Name="ChunkMesh" Value="head"/>
                <Param Name="GravityScale" Value="1.4"/>
                <Param Name="RandomVelocityScale" Value="0.2"/>
            </Param>
        </Chunk>
        -->
    </ReactionProcessing>
    <!-- The Capsules section provides a list of capsules that should affect the hitpoint of the part and should be disabled once the chunk has been dismembered -->
    <Capsules>
        <Capsule Name="Dneck_g"/>
    </Capsules>
</DismemberablePart>

<!-- LEFT SHOULDER -->
<DismemberablePart Name="Left Shoulder" Hitpoints="25">
    <Materials>
        <Material Name="Shoulder_LSG"/>
    </Materials>
    <ReactionProcessing>
        <!-- particles -->
        <Particles>
            <!-- spark particle from the joint -->
            <Param Name="BoneEmission" Value="shoulder_L_g">
                <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
            </Param>
        </Particles>
        <!-- chunks -->
        <Chunk Typename="TSingleChunk">
            <Param Name="Bone" Value="shoulder_L_g">
                <Param Name="ChunkMesh" Value="Shoulder_L"/>
                <Param Name="GravityScale" Value="1.8"/>
                <Param Name="RandomVelocityScale" Value="0.1"/>
            </Param>
        </Chunk>
    </ReactionProcessing>
    <Capsules>
        <Capsule Name="Dshoulder_L_g"/>
    </Capsules>
</DismemberablePart>

<!-- RIGHT ELBOW -->
<DismemberablePart Name="Right Shoulder" Hitpoints="25">
    <Materials>
        <Material Name="Elbow_RSG"/>
    </Materials>
    <ReactionProcessing>
        <!-- particles -->
        <Particles>
            <!-- spark particle from the joint -->
            <Param Name="BoneEmission" Value="shoulder_R_g">
                <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
            </Param>
        </Particles>
        <!-- chunks -->
        <Chunk Typename="TSingleChunk">
            <Param Name="Bone" Value="shoulder_R_g">
                <Param Name="ChunkMesh" Value="Elbow_R"/>
                <Param Name="GravityScale" Value="1.8"/>
                <Param Name="RandomVelocityScale" Value="0.1"/>
            </Param>
        </Chunk>
    </ReactionProcessing>
    <Capsules>
        <Capsule Name="Dshoulder_R_g"/>
    </Capsules>
</DismemberablePart>
</DismembermentDefinition>
1 point
-
Yeah, I'm working on BHD, but mostly focused on the JO/DFX2 engine, which is slightly newer and a different format. I'll post here when/if I get BHD usable.
1 point
-
Just found these forums; that's my GitHub in the OP. Happy to help. This may help you too: https://github.com/taylorfinnell/on3diimporter/blob/main/on3diimporter.py
1 point
-
.ilv.txth:
codec = PSX
channels = 2
sample_rate = 44100
interleave = 0x4000
num_samples = data_size
1 point
-
I did the decompressor and compressor, but in C++. I need to test the compressor though, because it is compressing better than the original. Test it in game and show the results, if it works.
MACROSS_PS1_TOOL.zip
1 point
-
fmt_psaVita_ValkyrieDrive.py
Here's an old Noesis plugin to view and export most of the .mib, .msb and .mab files of the PS Vita version of the game.
1 point
-
Rename it to .awb, then use the latest vgmstream; it works well.
1 point
-
They are still .pck files. I can find many Wwise .bnk files in AA462ABBFEC319B665666E14585F97D9_EndfieldBeta with Ravioli Explorer (RavioliGameTools_v2.10.zip, if you need it), and I think QuickBMS also works. By the way, I guess the real .wem audio files are in other .pck files: there are over 5000 .bnk files in AA462ABBFEC319B665666E14585F97D9_EndfieldBeta, which means the .bnk files may not store any actual audio data.
1 point
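If you want to check quickly whether a given .pck actually carries audio rather than just soundbank metadata, a rough probe like the sketch below can help. It relies only on the well-known Wwise signatures (.bnk soundbanks start with "BKHD", .wem streams are RIFF chunks) and is not tied to any Endfield-specific unpacker; the example path is hypothetical.

from pathlib import Path

def probe_pck(path: str) -> None:
    """Count Wwise signatures inside a .pck: BKHD marks embedded soundbanks,
    RIFF marks embedded .wem audio streams."""
    data = Path(path).read_bytes()
    print(f"{path}: {data.count(b'BKHD')} BKHD hits, {data.count(b'RIFF')} RIFF hits")

# Example (hypothetical file name):
# probe_pck("AA462ABBFEC319B665666E14585F97D9_EndfieldBeta.pck")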
-
# Atlas Fallen (Fledge Engine)
# script for QuickBMS http://quickbms.aluigi.org

comtype lz4f
open FDDE "toc"
get DUMMY long
get SOME_CRC long
callfunction GET_INFO 1
get SIZE long
endian big
get SOME_CRC long
callfunction GET_INFO 1
get SOME_CRC long
callfunction GET_INFO 1
callfunction GET_INFO 1
get DUMMY byte # 1
get DUMMY byte # 1
get DUMMY byte # 1
get DUMMY long # 1
get FILES long
string NAME p "%s_%d.dat" NAME 0
open FDSE NAME 1
for i = 0 < FILES
    get SOME_CRC long
    callfunction GET_INFO 1
    get OFFSET longlong
    get ZSIZE long
    get SIZE long
    get DUMMY long
    get DUMMY long
    get DUMMY short
    clog NAME OFFSET ZSIZE SIZE 1
next i

startfunction GET_INFO
    get ZERO long
    get DUMMY long
    savepos TMP
    get NAMESZ long
    if NAMESZ & 0x80000000
        math NAMESZ & 0x7fffffff
    endif
    getdstring NAME NAMESZ
    padding 4 0 TMP
endfunction
1 point
-
I have one as an example. I noticed something in this section: I ran xxd with groups of 12 and noticed a pattern. This is the pinky intermediate joint. It is known that the pinky intermediate joint has one degree of freedom, meaning that movements in the other two DoF should be minimal. The second set of each float is stable. 16-bit floats, likely little-endian, meaning that [3f], [00], [ff], etc. are the major bits. Given the ffs I do not think it is big-endian with an offset. I have attached the file in question so you can look yourself.
EDIT: These are signed LE numbers. Circular angular floats (not IEEE 754 standard), so ffff/0000, 3fff/4000, 7fff/8000, bfff/c000 are each 90 degrees apart.
EDIT 2: It could also be a LUT. But then I checked for any tables and I can't find anything useful.
EDIT 3: I have no clue anymore. These are proprietary obfuscated numbers using some cryptic format, and if anyone knows how to decode them it would be absolutely amazing.
SVT_0015_S01_ATK_A01.zip
1 point
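Based purely on the first EDIT above (90 degrees every 0x4000 steps), these look like classic 16-bit binary angles, where the full 0x10000 range wraps around 360 degrees. Below is a minimal decode sketch under that assumption; everything in it is a guess, including treating the values as plain unsigned 16-bit little-endian.

import struct

def bam16_to_degrees(raw: int) -> float:
    """Interpret a 16-bit value as a binary angle: 0x0000 -> 0, 0x4000 -> 90,
    0x8000 -> 180, 0xC000 -> 270 degrees (the spacing described in the EDIT above)."""
    return (raw & 0xFFFF) * 360.0 / 0x10000

def decode_angles(blob: bytes) -> list[float]:
    """Decode a run of little-endian 16-bit values into degrees."""
    count = len(blob) // 2
    return [bam16_to_degrees(v) for v in struct.unpack(f"<{count}H", blob[:count * 2])]

# Sanity check: 0x4000 should come out as 90 degrees if the guess is right.
assert bam16_to_degrees(0x4000) == 90.0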
-
Hi experts. I'm trying to read the MT2 images from Indiana Jones and the Emperor's Tomb (PS2), but I'm stuck trying to piece it all together. The images in the PC version are 32-bit RGBA, so that's easy, and gives us a comparison. For the same image in the PS2 version, it seems to be broken down like this:
64 bytes - basic image data (filename, width, height, etc.)
width*height/2 bytes - 4-bit pixel values (with 4-bit PS2 swizzling)
width*height/8 bytes - color values
This is what the pixel block looks like as 4-bit values in grayscale (see the attached PNG). The color values in the last block look to be something similar to RGBA5551, so even though the length of this block is width*height/8, as they are 16-bit colors, there are actually only width*height/16 colors. When the color block is read as RGBA5551 color values (see the attached PNG), you can see the image in that "color block". I know it's not quite the right colors, but I think it might just need some color striping applied to it, or it might not be exactly RGBA5551. As the width/height increase, so does the size of the color block, so it's not a plain palette; it's proportional to the image size. For example, for a 128x128 image, the color block is 2048 bytes; for a 256x256 image, the color block is 8192 bytes. I'm struggling to work out how to join the "pixel" block and the "color" block together so that we end up with a usable image that looks similar to the PC image. I'm assuming maybe we need to generate some in-between colors like in a DDS image, or otherwise apply some kind of "intensity" to the colors or something like that. I've never seen anything like this before - is anyone able to assist in understanding this, please? I have attached a ZIP with the PC image, the PS2 image file, and PNGs for both the "pixel" block and the "color" block, so you can clearly see there is a correlation between the two blocks. Thanks for your help!
vinehead.zip
1 point
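For anyone wanting to poke at that color block, here is a minimal RGBA5551 decode sketch. The bit order and little-endian byte order are assumptions (the PS2's native layout may differ, e.g. the position of the alpha bit), so treat it as a starting point rather than an answer to the pixel/color combination question.

def decode_rgba5551(raw: bytes) -> list[tuple[int, int, int, int]]:
    """Turn 16-bit little-endian RGBA5551 values into (r, g, b, a) tuples, 0-255 per channel.
    Assumes R in the low 5 bits and the alpha flag in the top bit."""
    pixels = []
    for i in range(0, len(raw) - 1, 2):
        v = raw[i] | (raw[i + 1] << 8)
        r = (v & 0x1F) << 3
        g = ((v >> 5) & 0x1F) << 3
        b = ((v >> 10) & 0x1F) << 3
        a = 255 if (v >> 15) & 1 else 0
        pixels.append((r, g, b, a))
    return pixels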
-
Well, use my old C++ tool; it should work now. I tried to rewrite it in Python to practice Python syntax, and maybe I did something wrong.
ZstdMagicExtractor.zip ZstdMagicExtractor-release version.zip
1 point
-
It's not just one block of data; there are multiple compressed ZSTD blocks in your sample file that have to be joined together, e.g. at 0, 0x129b0, 0x31dd0, etc. It looks as though each block is preceded by the compressed size and another value, except the first block, which looks to have a compressed size of 0x129a0. You might have cut that bit off in your sample. Each block seems to decompress to 0x40000 bytes except for the last one, which is shorter. I guess the header might have some useful info.
1 point
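A rough sketch of stitching such a file back together, assuming the guess above holds: every block is preceded by two 32-bit little-endian values (the compressed size plus one unknown field) and expands to 0x40000 bytes except the last. The header layout and field widths are assumptions; it uses the python-zstandard package.

import struct
import zstandard  # pip install zstandard

def join_zstd_blocks(path: str, out_path: str) -> None:
    """Walk the file block by block and concatenate the decompressed data."""
    data = open(path, "rb").read()
    dctx = zstandard.ZstdDecompressor()
    pos = 0
    with open(out_path, "wb") as out:
        while pos + 8 <= len(data):
            comp_size, _unknown = struct.unpack_from("<II", data, pos)
            pos += 8
            frame = data[pos:pos + comp_size]
            # Every block should expand to 0x40000 bytes except the last, shorter one.
            out.write(dctx.decompress(frame, max_output_size=0x40000))
            pos += comp_size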
-
Hey all, I also recently got interested in modding the original QP Shooting. I'm currently working on a command-line tool that so far allows extracting and repacking the LAG assets, with decoding and encoding of dialogue/system files also planned. However, after reviewing this thread it seems I wrongly assumed that the graphics were red/blue-swapped A16B16G16R16 DDS surfaces rather than a special Luna/LAG image format... so that's probably another thing I need to fix up (although the assets can be modified fine with an editor that supports that DDS format once the header is written, so maybe it is just a slightly tweaked version of DDS). I'll post the GitHub link here when I polish and finish it up : )
1 point
-
Because the fmlb and sound files don't exist anymore: before the game shut down, those files were dynamic content. Some files like that are available in the beta version APK and OBB, but not all of them.
1 point
-
Please use this updated script to repack the data file. If you have any questions, please let me know, so that you or other capable people can continue to process these .pxc files yourselves.

# Update the decompression of pxc file (script 0.2)
get FILE_SIZE asize
xmath TOC_PTR "FILE_SIZE - 8"
goto TOC_PTR
get TOC_OFFSET long
goto TOC_OFFSET
get FILE_COUNT long
for i = 0 < FILE_COUNT
    get OFFSET long
    get SIZE long
    get COMP_FLAG byte
    get NAME_LEN short
    getdstring NAME NAME_LEN
    get UNK long
    savepos TOC_ENTRY_POS
    if COMP_FLAG == 0
        goto OFFSET
        getdstring MAGIC 4
        if MAGIC == "PxZP"
            comtype zlib
            get UNCOMP_SIZE long
            get COMP_SIZE long
            savepos DATA_START
            clog NAME DATA_START COMP_SIZE UNCOMP_SIZE
        else
            log NAME OFFSET SIZE
        endif
    else
        goto OFFSET
        get MAGIC long
        get UNCOMP_SIZE long
        get COMP_SIZE long
        savepos COMP_START
        clog NAME COMP_START COMP_SIZE UNCOMP_SIZE
    endif
    goto TOC_ENTRY_POS
next i

pxc.zip
1 point
-
Version 0.0.2
17 downloads
An addon for Blender 4.3.0 (also tested with 4.4.3) to import and export the .msh, .bn (.bbx goes together) and .ani files for RF Online. The entity (R3E) and map (BSP) formats are import only. Import operations work with drag and drop.

There is code for exporting the BSP format inside the addon, but it is deactivated due to being incomplete. It only reaches so far as exporting walkable map geometry (with the BSP structure also built) and baking+exporting the light maps. Unfortunately, Blender proved not to be very suitable for the task of being a complete map editor for RF Online, mostly due to complexity issues with the .SPT particle format and other desirable features that would be hard to implement into it, such as mob spawn areas and portals. The R3M materials are also quite hard to simulate, since the original engine rendered the same mesh multiple times for each texture layer it had. It is possible to reactivate the feature by manually uncommenting the three commented lines in bsp.py's menu_func_export, register and unregister functions. Expect no support for this feature, as the more proper solution would be writing dedicated software.

Current Features:
- MSH (Mesh) Import: Imports .msh static meshes (Standard and MESH08 formats). Automatically attempts to find and assign textures by looking for DDS files referenced in the mesh or by searching .RFS archives in expected relative paths (../Tex/).
- MSH (Mesh) Export: Exports selected Blender mesh(es) to .msh format (Standard or MESH08). Handles vertex data, UVs, weights, and bone assignments. The export ignores any collection with the name "bone shapes".
- BN (Skeleton) Import: Imports .bn skeleton files. Reads bone hierarchy and rest poses. Automatically looks for the corresponding .bbx file (must be same name, same folder) to get the proper skeleton name. Creates Blender Armature objects. Also imports custom bone shape geometry if defined in the BN file and creates mesh objects for them, assigning them as custom shapes in Blender.
- BN (Skeleton) Export: Exports a selected Blender Armature to .bn format. Calculates and exports the corresponding .bbx file with skeleton name and bounding box. Exports custom bone shape geometry if assigned.
- ANI (Animation) Import: Imports .ani animation files. Applies animations to compatible Armatures and/or Objects based on names found in the ANI file. Creates Blender Actions. Option to target selected objects or objects within a collection matching the ANI's base name.
- ANI (Animation) Export: Exports Blender Actions to .ani format. Bakes complex animations (constraints, drivers, NLA) before export. Options to export the active action, actions from selected objects, actions from the active collection, or all scene actions.
- BSP (Map) Import: Imports .bsp map geometry. Reads associated .r3m (materials), .r3t (textures), and .ebp (entities, collision) files (must be same base name, same folder). Locates entity assets by parsing .rpk archives found in ../Entity/ relative to the BSP's directory. Instantiates map geometry, materials (replicating many R3M effects), and R3E entities. Includes an option to import and display LDR lightmaps from Lgt.r3t files. There is also an option for creating a visualization of the actual BSP structure of the map by creating boxes with the nodes' dimensions and leaves with the appropriate geometry; however, this will most certainly make the Blender scene run very slowly (this option is not necessary to see the actual map, if that's what you want).
- R3E (Entity) Import: Imports .r3e files together with their associated .r3m and .r3t files. Also imports animations, if present.

Installation:
- Download the repository as a .zip file, or simply download the embedded file here.
- In Blender, go to Edit > Preferences > Add-ons.
- Click Install... and select the downloaded .zip file.
- Enable the "RF Online importer/exporter" addon by checking the box next to it.

Dependencies (only necessary if you want to manually try the BSP export option):
- DDS Export (.bsp): Exporting BSPs requires ImageMagick to be installed and accessible in your system's PATH. The addon uses it to convert textures to DDS format. Download from: https://imagemagick.org/script/download.php
- Important: During installation, ensure you check the option to "Install legacy utilities (e.g., convert)", as the addon uses the magick convert command.

How to Use:
- Import: Find the RF Online importers under File > Import > ... (MSH, BN Skeleton, ANI, BSP, R3E).
- Export: Find the RF Online exporters under File > Export > ... (MSH, BN Skeleton, ANI).
- Operator Options: Each operator has options. Pay attention to options like:
  - MSH Export: Mesh Format to Export (Standard/MESH08), Collection Type to Export.
  - ANI Import: Apply to Selected Objects, Ignore Not Found Objects.
  - ANI Export: Action(s) to Export.
  - BN Export: Export only selected.
- Debug options are available for troubleshooting. If turned on, open Blender's console to see the messages.

Expected File Structure & Naming Conventions:
The addon relies on specific file names and relative folder locations to find associated assets:
- BSP Import (map.bsp): Needs map.r3m, map.r3t, mapLgt.r3t (optional), map.ebp in the same folder. Needs entity RPK archives (e.g., entity.rpk, monster.rpk) located in ../Entity/ relative to the map.bsp folder. The addon parses these RPKs to find the .r3e, .r3m, .r3t, etc., files for map entities.
- MSH Import (mesh.msh): Will look for texture paths defined within the MSH. If not found directly, it attempts to find textures in .rfs archives located in ../Tex/ relative to the .msh file's folder.
- BN Import (skeleton.bn): Needs skeleton.bbx in the same folder to read the proper skeleton name and overall bounding box.
- Export Naming:
  - MSH Export: Selected Objects: uses the filename you provide in the export dialog (e.g., my_export.msh). Active Collection / All Collections: uses the collection name as the base filename in the selected directory (e.g., exporting a collection named "Props" to D:/Exports/ results in D:/Exports/Props.msh). Any collection named "bone shapes" is ignored and not exported when present; this is done to prevent the exportation of bone shapes as new .msh files.
  - BN Export: Similar to MSH Export (uses the selected armature name or collection name). Writes both .bn and .bbx files (e.g., skeleton.bn, skeleton.bbx).
  - ANI Export: Uses the Blender Action name as the filename in the selected directory (e.g., an action named "Walk_Cycle" exports as Walk_Cycle.ani).

Current Limitations / Disclaimer:
- BSP Export is DISABLED: While the addon includes the code for it, the operator to export a full .bsp map (including geometry, materials, entities, and baked lightmaps) is currently disabled in this release. BSP export is extremely complex, and this feature is incomplete.
- Performance: Importing very large maps or exporting complex scenes may take time due to Python processing. You can see the importing progress if you've opened Blender's console before importing a map.
- R3M Effects: While many material effects are replicated using shader nodes, perfect 1:1 visual parity with the original D3D8 fixed-function pipeline can be challenging. The MSH exporter does not export effects currently.

Download Link: https://github.com/Cardboard-box-a/cbb-rf-online-addon (download the repo as a zip), or the file embedded here.
Bug Reports/Suggestions: [The GitHub's Issues page might be more suitable for keeping track of possible issues.]

Overall, the import part of the addon expects that you are using it to import files from a real game client, with the original folder structure. Meshes, for example, can be imported without their associated textures if the original folder structure is not present. The .MSH exporter automatically splits meshes that have more than 65k vertices, which has been tested by the .msh importer itself, but actual experience in the game is welcome. Uploaded in this post itself is a zip containing ImHex patterns for some of the file formats I've worked on. Hopefully this addon will prove useful for creating custom content for such an old game, or at least to satisfy the curiosity of what the game looks like behind the curtains.
Patterns.zip
1 point
-
Bumping this; if anyone would be an absolute unit and solve the animations, it would be greatly appreciated! 🙃
1 point
-
Did you ever figure out the animation format? I'd love to get access to the animations for some stuff, but of course the MOT files are formatted differently 😔
1 point
-
Use my plugin for Noesis, arc_zlib_plzp_lang_vfs.py (which I mentioned earlier). It recursively unpacks all files; at the output you will get *.png, *.wav, *.pm3, *.vram, *.text, *.pvr, etc. You can also find a link to the plugin for 3D models (*.vram) above in the same topic. (*.pvr files can be opened in PVRTexTool.)
1 point
-
I've written a Noesis script to make things easier for you: tex_CarsPST.py. Put the script in <NoesisDirectory>\plugins\python. Some of the PST textures you sent have more than one mip level, but due to how Noesis works I decided to only handle the highest one (or probably I just don't know a proper way to do it; anyway, they were ordered from low to high). Also, I assume every PST texture is a palettized 8-bit image?
1 point
-
zlib_DeCompressor.py
Here's the DeCompressor update; it now works with every file.
1 point
-
Mostly, the PNG files are stored decompressed, so it's fairly easy to edit and reimport them. However, you need to compress your PNG files using the site iLoveIMG. For example, if the original PNG is 10 KB, your PNG must be 10 KB or less, so you will need to compress it on the site. I've attached 3 files:
- Two BMS scripts: one script will unpack the data, decompressing all files. The other script will unpack the files without decompressing them (this is the one you should use for reimporting; reimport with -r, not reimport2, in QuickBMS).
- A Python script (.py): this program decompresses and compresses zlib files individually. I have set a compression level to reduce the file size even further. Use this if you need to handle compressed files (a generic sketch of the idea follows below).
zlib_DeCompressor.py BMS.ZIP
1 point
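For reference, the core of a "decompress, then recompress at a higher level" step like the one described for the attached Python script can be done with the standard zlib module. This is only a generic sketch, not the attached zlib_DeCompressor.py itself, and the example file names are hypothetical.

import zlib
from pathlib import Path

def recompress(src: str, dst: str, level: int = 9) -> None:
    """Inflate a zlib-compressed blob and deflate it again at the given level
    (level 9 keeps the result as small as plain zlib can manage)."""
    raw = zlib.decompress(Path(src).read_bytes())
    Path(dst).write_bytes(zlib.compress(raw, level))

# Example (hypothetical file names):
# recompress("texture_edited.zlib", "texture_repacked.zlib")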
-
For FGO's script, you just need to run this: python FGOArcade-FARC.py "your farc file path". For the FarcPack tool, run it in the shell to see the CLI commands.
1 point
-
Yeah, I guess so. But I think we need to find something to differentiate between rotation data and transform data, and it's hard to find that, so some reverse engineering work is needed.
1 point
-
For FGO Arcade, you can use this to extract the .farc file: https://github.com/Silvris/RandomScriptsAndTemplates/blob/main/FGO Arcade/FGOArcade-FARC.py
For KanColle Arcade, you can use the FarcPack tool to extract it: https://github.com/blueskythlikesclouds/MikuMikuLibrary/releases/download/v2.2.0/FarcPack.7z
The trading card images are in ./rom/trading_card.
1 point
-
Has anyone managed to extract the trading card images? I tried using the script for the 3D models, but it just doesn't work.
1 point