
Leaderboard

Popular Content

Showing content with the highest reputation since 10/25/2025 in all areas

  1. I'm trying my best to make it load somehow
    4 points
  2. Actually, the LZSS provided above is wrong for these files. I reverse-engineered the algorithm. Try the tool and see if the image comes out right. TenchuWoH_DeCompressor.zip
    3 points
  3. Version 1.0.0

    17 downloads

    Broken Sword - Shadow of the Templars (1996)
    * bs1_1996_clu_export.py
    * bs1_1996_clu_import_and_patch.py
    Required: Python. When installing, make sure to check “Add python.exe to PATH.”
    Usage:
    * Copy the swordres.rif and text.clu files into the same folder as the .py files.
    * Run bs1_1996_clu_export.py: this will extract the texts into Text_exported.txt.
    * Translate it, then rename the finished file to Text_translated.txt.
    * Run bs1_1996_clu_import_and_patch.py: this will insert the translated texts back into text.clu and modify swordres.rif.
    * The new files will be created with the _new suffix.
    Tested with GoG (2.0.0.8) and Steam Reforged Free DLC.
    FEARka
    3 points
  4. Thanks to some info from here, I made a tool for unpacking and packing localized map files, in case anyone is interested: https://github.com/dest1yo/wwm_utils
    2 points
  5. Version 1.1

    498 downloads

    Tools for the Battlefield 6 beta. Currently supports dumping the game and exporting models/maps. Usage is similar to previous tools for the Frostbite engine.
    toc_bf6.exe - dump tool
    Change the .ini file parameters:
    - game path
    - dump path
    - selection to dump "ebx", "res", "chunks" or "all"
    Then drop any .toc file onto the .exe to dump assets, or run it from the command line with one parameter: the toc file name.
    Fb_bf6_mesh.exe - model tool
    Takes a .MeshSet as parameter. ske_soldier_3p.ebx is the main universal skeleton for soldiers and must be in the same folder. If you need another skeleton, pass its name as the 2nd parameter, or rename it to ske_soldier_3p.ebx. The tool will try to find chunks automatically; if it can't, it gives an error message with the chunk name.
    Map export
    1. Create the database. Run the fb_maps_bf6_db.exe tool once; it will scan the whole dump for meshsets and blueprints, so that maps can later be converted quickly, without walking the whole tree of assets. This will take a few minutes. After that, 2 files will be created, bp.db & meshnames.txt, which need to stay in the same folder as the EXE for the main tool to work.
    2. Export maps. Use fb_maps_bf6.exe (the main map tool) to convert maps. Drop any EBX on it, run it from the command line with one parameter, or create a batch (see the sketch below).
    3. Terrain export. The main terrain data is in the .TerrainStreamingTree files for each level. For some levels these files are small, which means the actual data is in chunks. Sometimes the data is in the file itself, in which case it may be big, about 50 MB in size. Drop a .TerrainStreamingTree on fb_terrain_bf6.exe or use the command line.
    2 points
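    A minimal batch-driver sketch for map export step 2, assuming fb_maps_bf6.exe accepts a single EBX path per invocation as described above (the "dump" folder name is a placeholder of mine):

    # hypothetical batch conversion driver for fb_maps_bf6.exe
    import pathlib
    import subprocess

    for ebx in pathlib.Path("dump").rglob("*.ebx"):  # "dump" is a placeholder path
        subprocess.run(["fb_maps_bf6.exe", str(ebx)], check=False)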
  6. models.zip Here are the map FBX files I got.
    2 points
  7. Let's just say I'm remaking a certain game. But in VR.
    2 points
  8. The game has updated, and they hard-coded new text in the .mpk Lua scripts, because some words have many different meanings depending on the context. With packet sniffing, I observed that the game downloads some .pak files from easebar.com and puts them in the .mpk file. These files are encrypted.
    1 point
  9. Tutorial which describes how to export/import Horizon Forbidden West game assets using id-daemons HFW import tool.
    Required programs/tools:
    - id-daemons HFW import tool
    - Blender 3.6
    - ASCII Import-Export Blender Add-on (HFW)
    HFW Export-Import Tutorial.pdf
    1 point
  10. I made a Blender addon to import models, textures and animations for Dolphin Wave and other games that use the same engine. It can import lzs and lza files as-is; you don't need to decrypt or decompress the files. https://github.com/Al-Hydra/blenderBUM
    1 point
  11. Here's my analysis:
    Header: 24 bytes: [ Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding ]
    Buckets: 24-528 bytes, based on the allocated buckets
    TableEntries: EntryCount * [ 8 bytes Hash (or id?), Int32 RelativeOffset (formula: text_start = current_entry_offset + 8 + value), Int32 TextLength ]
    Values: ValueCount * [ Byte[ValueLength] Data ]
    Null values have zero length and no hash. I successfully unpacked and packed, and the game loads the new text normally. (A parser sketch follows below.)
    1 point
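    A minimal Python parser sketch for that layout; the bucket region size varies, so it is passed in here, and the field names are my own:

    import struct

    def parse_table(data: bytes, buckets_size: int):
        # Header: Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding
        entry_count, value_count, timestamp, _pad = struct.unpack_from("<qqii", data, 0)
        pos = 24 + buckets_size          # buckets (24-528 bytes) sit between header and entries
        entries = []
        for _ in range(entry_count):
            hash_, rel_off, text_len = struct.unpack_from("<Qii", data, pos)
            text_start = pos + 8 + rel_off   # formula quoted in the analysis above
            entries.append((hash_, data[text_start:text_start + text_len]))
            pos += 16
        return timestamp, entries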
  12. Thanks! 2001_800xcsled.zip 1999_440xcrsled.zip
    1 point
  13. Made some adjustments! Now it's working.
    1 point
  14. When I get home, I will compile the decompressor/compressor unpack and pack tool; it's all one tool. (A Python sketch of the matching decompressor follows below.)

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> compressLZSSBlock(const std::vector<uint8_t>& input) {
        const int MIN_MATCH = 3;   // minimum length worth encoding as a pair
        const int MAX_MATCH = 17;  // (0xF + 2)
        const int DICT_SIZE = 4096;
        const size_t n = input.size();

        // Dictionary identical to the decompressor's
        std::vector<uint8_t> dict_buf(DICT_SIZE, 0);
        size_t dict_index = 1;     // same starting index as the decompressor
        size_t producedBytes = 0;  // how many bytes have already been "generated" (logical output)

        std::vector<uint32_t> flagWords;
        uint32_t curFlag = 0;
        int bitsUsed = 0;

        auto pushFlagBit = [&](bool isLiteral) {
            if (bitsUsed == 32) {
                flagWords.push_back(curFlag);
                curFlag = 0;
                bitsUsed = 0;
            }
            if (isLiteral) {
                // bit 1 = literal (same meaning as in the decompressor)
                curFlag |= (1u << (31 - bitsUsed));
            }
            ++bitsUsed;
        };

        std::vector<uint8_t> literals;
        std::vector<uint8_t> pairs;
        literals.reserve(n);
        pairs.reserve(n / 2 + 16);

        size_t pos = 0;
        while (pos < n) {
            size_t bestLen = 0;
            uint16_t bestOffset = 0;

            if (producedBytes > 0) {
                // maximum possible length for this match (must not run past the end of the input)
                const size_t maxMatchGlobal = std::min(static_cast<size_t>(MAX_MATCH), n - pos);

                // walk every possible dictionary offset
                for (int off = 1; off < DICT_SIZE; ++off) {
                    if (dict_buf[off] != input[pos]) continue;

                    // --- DYNAMIC SIMULATION OF THE DECOMPRESSOR FOR THIS OFFSET ---
                    uint8_t candidateBytes[MAX_MATCH];
                    size_t candidateLen = 0;

                    for (size_t l = 0; l < maxMatchGlobal; ++l) {
                        const int src_index = (off + static_cast<int>(l)) & 0x0FFF;

                        // value at src_index, taking into account that this very block
                        // may overwrite dictionary positions (overlap)
                        uint8_t b = dict_buf[src_index];

                        // If src_index equals a write index of this SAME pair
                        // (dict_index + j), use the already "generated" byte candidateBytes[j]
                        for (size_t j = 0; j < l; ++j) {
                            const int dest_index = (static_cast<int>(dict_index) + static_cast<int>(j)) & 0x0FFF;
                            if (dest_index == src_index) {
                                b = candidateBytes[j];
                                break;
                            }
                        }

                        if (b != input[pos + l]) {
                            // doesn't match the input, stop here
                            break;
                        }
                        candidateBytes[l] = b;
                        ++candidateLen;
                    }

                    if (candidateLen >= static_cast<size_t>(MIN_MATCH) && candidateLen > bestLen) {
                        bestLen = candidateLen;
                        bestOffset = static_cast<uint16_t>(off);
                        if (bestLen == static_cast<size_t>(MAX_MATCH)) break; // cannot do better
                    }
                }
            }

            if (bestLen >= static_cast<size_t>(MIN_MATCH)) {
                // --- ENCODE AS AN (offset, length) PAIR ---
                pushFlagBit(false); // 0 = pair
                uint16_t lengthField = static_cast<uint16_t>(bestLen - 2); // 1..15
                uint16_t pairVal = static_cast<uint16_t>((bestOffset << 4) | (lengthField & 0x0F));
                pairs.push_back(static_cast<uint8_t>(pairVal & 0xFF));
                pairs.push_back(static_cast<uint8_t>((pairVal >> 8) & 0xFF));

                // Update the dictionary exactly like the DECOMPRESSOR does:
                // for (i = 0; i < length; ++i) {
                //     b = dict[(offset + i) & 0xFFF];
                //     out.push_back(b);
                //     dict[dict_index] = b;
                //     dict_index = (dict_index + 1) & 0xFFF;
                // }
                for (size_t i = 0; i < bestLen; ++i) {
                    int src_index = (bestOffset + static_cast<uint16_t>(i)) & 0x0FFF;
                    uint8_t b = dict_buf[src_index];
                    dict_buf[dict_index] = b;
                    dict_index = (dict_index + 1) & 0x0FFF;
                }
                pos += bestLen;
                producedBytes += bestLen;
            } else {
                // --- PLAIN LITERAL ---
                pushFlagBit(true); // 1 = literal
                uint8_t literal = input[pos];
                literals.push_back(literal);
                dict_buf[dict_index] = literal;
                dict_index = (dict_index + 1) & 0x0FFF;
                ++pos;
                ++producedBytes;
            }
        }

        // Terminator pair (offset == 0)
        pushFlagBit(false);
        pairs.push_back(0);
        pairs.push_back(0);

        // Flush the last flag word
        if (bitsUsed > 0) {
            flagWords.push_back(curFlag);
        }

        // Assemble the final block: [u32 off_literals][u32 off_pairs][flags...][literals...][pairs...]
        const size_t off_literals = 8 + flagWords.size() * 4;
        const size_t off_pairs = off_literals + literals.size();
        const size_t totalSize = off_pairs + pairs.size();

        std::vector<uint8_t> block(totalSize);
        auto write_u32_le = [&](size_t at, uint32_t v) {
            block[at + 0] = static_cast<uint8_t>(v & 0xFF);
            block[at + 1] = static_cast<uint8_t>((v >> 8) & 0xFF);
            block[at + 2] = static_cast<uint8_t>((v >> 16) & 0xFF);
            block[at + 3] = static_cast<uint8_t>((v >> 24) & 0xFF);
        };
        write_u32_le(0, static_cast<uint32_t>(off_literals));
        write_u32_le(4, static_cast<uint32_t>(off_pairs));

        size_t p = 8;
        for (uint32_t w : flagWords) {
            block[p + 0] = static_cast<uint8_t>(w & 0xFF);
            block[p + 1] = static_cast<uint8_t>((w >> 8) & 0xFF);
            block[p + 2] = static_cast<uint8_t>((w >> 16) & 0xFF);
            block[p + 3] = static_cast<uint8_t>((w >> 24) & 0xFF);
            p += 4;
        }
        std::copy(literals.begin(), literals.end(), block.begin() + off_literals);
        std::copy(pairs.begin(), pairs.end(), block.begin() + off_pairs);

        return block;
    }

    @morrigan my compressor - try it, and let me know the results.
    1 point
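    A Python sketch of the matching decompressor, reconstructed from the comments in the compressor above (my reading of the format, not the original game code):

    import struct

    def decompress_lzss_block(block: bytes) -> bytes:
        # Block layout: [u32 off_literals][u32 off_pairs][flags...][literals...][pairs...]
        off_literals, off_pairs = struct.unpack_from("<II", block, 0)
        lit_pos, pair_pos, flag_pos = off_literals, off_pairs, 8
        dict_buf = bytearray(4096)
        dict_index = 1                               # same starting index as above
        out = bytearray()
        flag_word, bits_left = 0, 0
        while True:
            if bits_left == 0:
                flag_word = struct.unpack_from("<I", block, flag_pos)[0]
                flag_pos += 4
                bits_left = 32
            is_literal = (flag_word & 0x80000000) != 0   # MSB first, 1 = literal
            flag_word = (flag_word << 1) & 0xFFFFFFFF
            bits_left -= 1
            if is_literal:
                b = block[lit_pos]; lit_pos += 1
                out.append(b)
                dict_buf[dict_index] = b
                dict_index = (dict_index + 1) & 0xFFF
            else:
                val = struct.unpack_from("<H", block, pair_pos)[0]
                pair_pos += 2
                if val == 0:                         # terminator pair
                    return bytes(out)
                offset, length = val >> 4, (val & 0xF) + 2
                for i in range(length):
                    b = dict_buf[(offset + i) & 0xFFF]
                    out.append(b)
                    dict_buf[dict_index] = b
                    dict_index = (dict_index + 1) & 0xFFF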
  15. In the ..var01.st2, CSV data is contained. Edit: and XML data:

    <!--
    ______________________________________________________________________________
    Copyright 2004 The Collective, Inc.

    DISMEMBERMENT DEFINITION
    Character: Clone Trooper
    Author: Baback Elmieh
    Date: 01/07/2004
    ______________________________________________________________________________
    -->

    <!-- HEAD -->
    <DismemberablePart Name="Head" Hitpoints="25">
        <!-- The Materials section is a list of materials in the original mesh that are
             to be turned off when the part is dismembered -->
        <!--
        <Materials>
            <Material Name="headSG"/>
        </Materials>
        -->
        <!-- ReactionProcessing defines the chunks and particles to be spawned when a
             reaction dismemberment is processed for the character. The definition requires
             a Bone from which a chunk should be spawned and the name of the chunkmesh.
             The ChunkMesh definition in turn can have several values set, such as
             GravityScale and UseGinFile. GravityScale greater than 1.0 pulls a chunk down
             faster; UseGinFile will look for a bounding box with the same name as the
             chunkmesh in the damage mesh's gin file. If the bound is found, it is used
             instead of the default rendering bound, which can help artists orientate
             chunks so that they land on their correct side -->
        <ReactionProcessing>
            <!-- particles -->
            <Particles>
                <!-- spark particle from the joint -->
                <Param Name="BoneEmission" Value="neck_g">
                    <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
                </Param>
            </Particles>
            <!-- chunks -->
            <!--
            <Chunk Typename="TSingleChunk">
                <Param Name="Bone" Value="neck_g">
                    <Param Name="ChunkMesh" Value="head"/>
                    <Param Name="GravityScale" Value="1.4"/>
                    <Param Name="RandomVelocityScale" Value="0.2"/>
                </Param>
            </Chunk>
            -->
        </ReactionProcessing>
        <!-- The Capsules section provides a list of capsules that should affect the
             hitpoints of the part and should be disabled once the chunk has been
             dismembered -->
        <Capsules>
            <Capsule Name="Dneck_g"/>
        </Capsules>
    </DismemberablePart>

    <!-- LEFT SHOULDER -->
    <DismemberablePart Name="Left Shoulder" Hitpoints="25">
        <Materials>
            <Material Name="Shoulder_LSG"/>
        </Materials>
        <ReactionProcessing>
            <!-- particles -->
            <Particles>
                <!-- spark particle from the joint -->
                <Param Name="BoneEmission" Value="shoulder_L_g">
                    <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
                </Param>
            </Particles>
            <!-- chunks -->
            <Chunk Typename="TSingleChunk">
                <Param Name="Bone" Value="shoulder_L_g">
                    <Param Name="ChunkMesh" Value="Shoulder_L"/>
                    <Param Name="GravityScale" Value="1.8"/>
                    <Param Name="RandomVelocityScale" Value="0.1"/>
                </Param>
            </Chunk>
        </ReactionProcessing>
        <Capsules>
            <Capsule Name="Dshoulder_L_g"/>
        </Capsules>
    </DismemberablePart>

    <!-- RIGHT ELBOW -->
    <DismemberablePart Name="Right Shoulder" Hitpoints="25">
        <Materials>
            <Material Name="Elbow_RSG"/>
        </Materials>
        <ReactionProcessing>
            <!-- particles -->
            <Particles>
                <!-- spark particle from the joint -->
                <Param Name="BoneEmission" Value="shoulder_R_g">
                    <Param Name="ParticleID" Value="IDS_FX_LIGHTSABER_BODY_IMPACT"/>
                </Param>
            </Particles>
            <!-- chunks -->
            <Chunk Typename="TSingleChunk">
                <Param Name="Bone" Value="shoulder_R_g">
                    <Param Name="ChunkMesh" Value="Elbow_R"/>
                    <Param Name="GravityScale" Value="1.8"/>
                    <Param Name="RandomVelocityScale" Value="0.1"/>
                </Param>
            </Chunk>
        </ReactionProcessing>
        <Capsules>
            <Capsule Name="Dshoulder_R_g"/>
        </Capsules>
    </DismemberablePart>
    </DismembermentDefinition>
    1 point
  16. .ilv.txth:
    codec = PSX
    channels = 2
    sample_rate = 44100
    interleave = 0x4000
    num_samples = data_size
    1 point
  17. You can either use this QuickBMS script to extract the msv audio files out of the rp2:
    get UNK long
    get FILES long
    goto 0x20
    for i = 0 < FILES
        getdstring NAME 7
        getdstring DUMMY 25
        get OFFSET long
        get SIZE long
        get DUMMY2 long
        string NAME + ".msv"
        log NAME OFFSET SIZE
    next i
    Or you can use this txth file to play the audio out of the rp2 directly (needs vgmstream + an audio player like foobar2000):
    subsong_count = @0x04
    subsong_spacing = 0x2c
    base_offset = 0x20
    name_offset = 0x00
    subfile_offset = @0x20
    subfile_size = @0x24
    subfile_extension = msv
    Save the text above as ".rp2.txth" and put it in the same directory as the rp2 file. Also, if you're using foobar2000, make sure to check "Enable unknown exts" on the vgmstream preferences page.
    1 point
  18. Hi Shak-otay, yes it's me, I never imagined you would remember me 😄. I found this texture in the file.
    1 point
  19. Bumping this again because I really don't want this thread to quietly die; it seems the edits to my message are not enough to constitute a bump. Every single 16-bit float format I've tried does not work, indicating this is some proprietary cursed format. Maybe a LUT, maybe encrypted, maybe something else, which probably explains why the .mot files still have not been decrypted all these years. I do suspect what certain bits mean, but I am really unsure. I have the model .bin and some other examples of .mot in hand as well, so if you would like me to send them, I will gladly do so. Just note that I do need these files decrypted for a project, so the faster this gets done, the better. What I do know is that this is little endian, Z-Y-X order. I have no clue about anything else. Help is much appreciated 🙏 (I am not sure what y'all want, but I am interested in a way to export the .mot to .csv with a Frame # column. For my application, that is enough.)
    1 point
  20. I did the decompressor and compressor, but in C++. The compressor still needs testing, though, because it compresses better than the original. Test it in game and show the results, if it works. MACROSS_PS1_TOOL.zip
    1 point
  21. What I'm trying to do is create a new texture and add an alpha channel while hex editing. The zip file I provided is mostly for demonstration purposes. I'm trying to edit NINJA_FACE_DAMAGE_000, NINJA_FACE_DAMAGE_001, and NINJA_FACE_DAMAGE_002. My biggest problem is that the textures I'm trying to add an alpha channel to are different from the original textures and are swizzled differently, thus the alpha channel will be different. I am aware the image data always starts at 592 (0x250) and the palette data offset differs depending on the size of the texture: 64x64 (0x12A0), 128x128 (0x42A0), 256x256 (0x102D0) (see the sketch below). Anyway, I was able to use ImageHeat to get an alpha channel from the original textures with PAL8 pixels and an RGBA8888 palette and exported it. I was also able to use the ReverseBox Demo 2 to export and import the original textures, just to get familiar with it. These are just a few of the textures I'm trying to insert into the game, along with the alpha channels. MKD Texture Edit.zip
    1 point
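    A small sketch of those offsets, assuming PAL8 pixel data (one byte per pixel) and a 256-entry RGBA8888 palette as described above; the helper name and the palette entry count are my assumptions:

    # hypothetical splitter based on the offsets quoted above
    PALETTE_OFFSETS = {64: 0x12A0, 128: 0x42A0, 256: 0x102D0}

    def split_texture(data: bytes, size: int):
        pixels = data[0x250:0x250 + size * size]       # PAL8: one byte per pixel
        pal_off = PALETTE_OFFSETS[size]
        palette = data[pal_off:pal_off + 256 * 4]      # assumed 256 RGBA8888 entries
        return pixels, palette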
  22. Here's a sample model for one of the enemies in the game. Notice that "*_div.msb" files can't be viewed/exported properly for some reason, but the base one works just fine. PSVitaSample.zip
    1 point
  23. fmt_psaVita_ValkyrieDrive.py Here's an old Noesis plugin to view and export most of the mib, msb and mab files of the PS Vita version of the game.
    1 point
  24. Edit - just tested it, and MRT no. 4 is the UV; you were right in saying the 4th one is the UV maps, as per the rule.
    1 point
  25. # Atlas Fallen (Fledge Engine)
    # script for QuickBMS http://quickbms.aluigi.org
    comtype lz4f
    open FDDE "toc"
    get DUMMY long
    get SOME_CRC long
    callfunction GET_INFO 1
    get SIZE long
    endian big
    get SOME_CRC long
    callfunction GET_INFO 1
    get SOME_CRC long
    callfunction GET_INFO 1
    callfunction GET_INFO 1
    get DUMMY byte # 1
    get DUMMY byte # 1
    get DUMMY byte # 1
    get DUMMY long # 1
    get FILES long
    string NAME p "%s_%d.dat" NAME 0
    open FDSE NAME 1
    for i = 0 < FILES
        get SOME_CRC long
        callfunction GET_INFO 1
        get OFFSET longlong
        get ZSIZE long
        get SIZE long
        get DUMMY long
        get DUMMY long
        get DUMMY short
        clog NAME OFFSET ZSIZE SIZE 1
    next i
    startfunction GET_INFO
        get ZERO long
        get DUMMY long
        savepos TMP
        get NAMESZ long
        if NAMESZ & 0x80000000
            math NAMESZ & 0x7fffffff
        endif
        getdstring NAME NAMESZ
        padding 4 0 TMP
    endfunction
    1 point
  26. I have one as an example. I ran xxd with groups of 12 on this section and noticed a pattern. This is the pinky intermediate joint. It is known that the pinky intermediate joint has one degree of freedom, meaning that movements in the other two DoF should be minimal. The second set of each float is stable. 16-bit floats, likely little endian, meaning that [3f], [00], [ff], etc. are the major bits. Given the ffs, I do not think it is big endian with an offset. I have attached the file in question so you can look yourself. EDIT: These are signed LE numbers. Circular angular floats (not IEEE 754 standard), so ffff/0000, 3fff/4000, 7fff/8000, bfff/c000 are each 90 degrees apart (see the conversion sketch below). EDIT 2: It could also be a LUT, but I checked for tables and couldn't find any useful ones. EDIT 3: I have no clue anymore. These are proprietary obfuscated numbers using some cryptic format, and if anyone knows how to decode them it would be absolutely amazing. SVT_0015_S01_ATK_A01.zip
    1 point
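    If the quarter-turn spacing above holds, the raw 16-bit value may simply be a fraction of a full turn; a hedged conversion sketch (this mapping is an assumption, not a confirmed decoding):

    # assumption: 0x0000 -> 0, 0x4000 -> 90, 0x8000 -> 180, 0xC000 -> 270 degrees
    def angle_deg(raw: int) -> float:
        return (raw & 0xFFFF) * 360.0 / 0x10000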
  27. I've been doing a Noesis script for the beta of Once Human. Still got a few things to do on it, but it should work for most of the models so far: Edit: Read the notes at the start of the script regarding the various files needed. once_human_mesh.zip
    1 point
  28. I know I've already posted my progress with materialising Mike Tyson, but I thought I'd show these as well, since I managed to convert the normals from green to purple, which works better for me personally and for Unity mods.
    1 point
  29. I've just released a new version of ImageHeat 🙂
    https://github.com/bartlomiejduda/ImageHeat/releases/tag/v0.31.2
    Changelog:
    - Added new pixel formats: ALPHA4, ALPHA4_16X, ALPHA8, ALPHA8_16X, RGBA6666, RGBX6666, BGRT5551, BGRT8888, PAL8_TZAR, BGRA5551, BGRA5551_TZAR, BGRA8888_TZAR, BGRA4444_LEAPSTER
    - Added support for LZ4, Emergency RLE, Neversoft RLE, Tzar RLE, Leapster RLE, Reversed TGA RLE
    - Fixed issue with X360 swizzling
    - Fixed issue with PS Vita/Morton swizzling for 4-bpp images
    - Added support for palette value scaling (1x, 2x, 4x, 8x, 16x)
    - Added drop-down for palette scaling in the "Palette Parameters" box
    - Added funding info
    1 point
  30. @ikskoks Thank you for this script; I was also able to use it successfully. I had a question about the outputted `.kiw` files: there is bytecode in each file that corresponds to in-game instructions like character dialogue expressions, choices, background images and music cues. What is the best way to reverse engineer the bytecode in these files? I am not sure which game engine made this, but are there any existing parsers or scripts for `.kiw` or similar formats that you know of?
    1 point
  31. My script for another game should work with these GSB files: https://github.com/DKDave/Scripts/blob/master/QuickBMS/GameCube/Legend_Of_Spyro_New_Beginning_(GameCube)_GSB.bms
    1 point
  32. Sorry, my bad, now I see. The list looks similar to my old one.
    1 point
  33. The fmlb and sound files don't exist anymore, because before the game shut down those files were dynamic content. Some files like that are available in the beta version APK and OBB, but not all of them.
    1 point
  34. I am updating the Noesis script from this post to handle more versions of the 3D model rsf format. Soon I will release an update to the script. Meanwhile, as a sample, I want to show some extracted models that we were not able to extract with the old script. In the top left corner we find the Orange Bowl stadium from NCAA 08. In the top right corner we see the hologram (a fictitious one) stadium from Madden 13. The lower left corner shows the LA Memorial Coliseum from Madden 12 or 13, and finally in the lower right corner we find the Louisiana Tech stadium from NCAA 12. All these rsf files come from the PS3 versions of the games, with data in big endian. The script can handle data in little endian too, for example rsf files coming from PS Vita games. I am almost 100% sure that the updated script will be able to handle rsf files from NCAA 08 to NCAA 14 and Madden 07 to Madden 17 with no issues.
    1 point
  35. You can use my latest Fmod Bank Tools - https://www.nexusmods.com/rugbyleaguelive3/mods/2 https://github.com/Wouldubeinta/Fmod-Bank-Tools Just PM me for the bank password.
    1 point
  36. Why don't you just connect the vertices in a 3D editor? The plugin opens the models; that's all you need. The rest you can work on yourself.
    1 point
  37. 3 downloads

    This is an attachment from ZenHAX posted by petventh in the topic: JX3 HD Remake ??3 (.dat files)
    1 point
  38. Bumping this. If anyone would be an absolute unit and solve the animations, it would be greatly appreciated! 🙃
    1 point
  39. Does anyone know more about the mcd format used by EA? Models are packed inside the rsf format, and it is possible to extract them with the rsf.bms script. I managed to make a Noesis script to view the models, but reading the skeletons is hard for me and I don't know how to proceed with them. Vertex weights are stored in a table that starts with the tag TIEW in the mcd file, and skeleton data starts with FRGS in the .skel file (a small tag-scan sketch follows below). I would appreciate any kind of help importing the skeletons into Noesis. Preview in Noesis: https://imgur.com/a/o2dQvCl fmt_NHL21_mcd.py rsf.bms.zip nhl21model.zip
    1 point
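    A trivial starting-point sketch for locating those tagged tables, assuming the tags appear as plain ASCII bytes in the files (the helper is hypothetical):

    def find_tag(data: bytes, tag: bytes) -> int:
        # returns the offset of the first occurrence of e.g. b"TIEW" or b"FRGS"
        off = data.find(tag)
        if off < 0:
            raise ValueError(f"tag {tag!r} not found")
        return off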
  40. My plugin for vfs works with your file. EDIT: and I made a preview plugin for *.sggr: fmt_sggr.py (*.pvr is an image, use PVRTexTool)
    1 point
  41. Seems the game doesn't accept different zlib levels. Maybe try using levels 0-9. Use level 9 and the compressed file will be the same as the original! https://drive.google.com/file/d/11rON0JaDswJCQJ-RBF2USKErQRtPbP_I/view?usp=sharing and maybe the solution post;
    1 point
  42. Did you ever figure out the animations format? I'd love to get access to the animations for some stuff but of course, the MOT files are formatted differently 😔
    1 point
  43. I've written a Noesis script to make things easier for you: tex_CarsPST.py. Put the script in <NoesisDirectory>\plugins\python. Some of the PST textures you sent have more than one mip level, but due to how Noesis works I decided to only handle the highest one (or probably I just don't know a proper way to do it; anyway, they were ordered from low to high). Also, I assume every PST texture is a palettized 8-bit image?
    1 point
  44. Mostly, the PNG files are stored uncompressed, so it's fairly easy to edit and reimport them. However, you need to compress your PNG files using the site iLoveIMG: for example, if the original PNG is 10 KB, your PNG must be 10 KB or less, so you will need to compress it on the site. I've attached 3 files. Two BMS scripts: one script will unpack the data, decompressing all files; the other script will unpack the file without decompressing it (this is the one you should use for reimporting; reimport with -r, not reimport 2, in QuickBMS). A Python script (.py): this program decompresses and compresses zlib files individually. I have set a compression level to reduce the file size even further (a minimal sketch of the idea follows below). Use this if you need to handle compressed files. zlib_DeCompressor.py BMS.ZIP
    1 point
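    A minimal sketch of the described zlib round trip (the file paths and helper names are mine; level 9 gives the smallest standard zlib output, matching the advice above):

    import zlib

    def decompress_file(src: str, dst: str) -> None:
        with open(src, "rb") as f:
            raw = f.read()
        with open(dst, "wb") as f:
            f.write(zlib.decompress(raw))

    def compress_file(src: str, dst: str, level: int = 9) -> None:
        with open(src, "rb") as f:
            raw = f.read()
        with open(dst, "wb") as f:
            f.write(zlib.compress(raw, level))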
  45. Maybe I didn’t quite understand your goal, but you can’t just take files from the game and compile them back into the game.) Edit: ah, I read a few posts above, you need to compile Asset Studio, not the game files.
    1 point
  46. For FGO's script, you just need to run this: python FGOArcade-FARC.py "your farc file path". For the farcpack tool, run it in the shell to see the CLI commands.
    1 point
  47. For FGO Arcade, you can use this to extract the farc file: https://github.com/Silvris/RandomScriptsAndTemplates/blob/main/FGO Arcade/FGOArcade-FARC.py For KanColle Arcade, you can use the farcpack tools to extract it: https://github.com/blueskythlikesclouds/MikuMikuLibrary/releases/download/v2.2.0/FarcPack.7z The trading card images are in ./rom/trading_card
    1 point
  48. Guys, just close this post; it makes no sense. The forum is new, and in the 3D category there are 260 posts, 4 of which have the same title "please help .......3dunity". Please search the forum first to see if an answer exists before making a post. It's easier if we collect all the info for each type of file in one place; honestly, the old XentaX had guys who would redirect to the main post if someone went on like this for every little thing. And I see the OP made 3 posts in a row, 5 minutes from each other, for almost the same thing. This is obviously going nowhere.
    1 point