
Leaderboard

Popular Content

Showing content with the highest reputation since 11/20/2025 in Posts

  1. This looks correct. The palette is RGB5551, but is swizzled with the same method as the image (not the standard PS2 palette shift). Patrick should be able to confirm if that can be added to ImageHeat as a palette swizzle option.
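     For reference, the "standard PS2 palette shift" being ruled out here is the usual CSM1 CLUT block swap. A minimal Python sketch of that standard swap, shown only for contrast (this is not the method this game uses):
     # Standard PS2 CSM1 CLUT unswizzle, shown only for contrast with the image-style
     # swizzle described above. For a 256-entry palette, entries come in groups of 32
     # with the two middle 8-entry blocks swapped.
     def unswizzle_ps2_palette(entries):
         out = list(entries)
         for base in range(0, len(entries), 32):
             out[base + 8:base + 16] = entries[base + 16:base + 24]
             out[base + 16:base + 24] = entries[base + 8:base + 16]
         return out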
    3 points
  2. I don't know if there are more models in that unpacked folder; you'll need to check that yourself, so examine each file there. Just remember that characters use shorts in the vertex buffer. I think I saw another file with floats, but maybe that file is not a character, or maybe it is but with floats, I really don't know, lol. Here is the script if you want to test it: fmt_black_ps2_prototype_DB.py
    3 points
  3. You'd be better off using ImageHeat, as it has more options for swizzling, etc. For these, the image is PS2 swizzled, but I can't work out the palette. It doesn't seem to use any of the standard PS2 palette formats; maybe it's a different swizzling method. This is from "t_compact_rockangelz_closed_00000003":
    2 points
  4. The "repeated offsets" pointing to new blocks indicate that the main BIGFILE.CAT is acting as a master container that holds smaller, self-contained archives inside it. The Master Index: Points to large chunks of data (e.g., "Level 1 Data", "Level 2 Data"). The "New Block": When you go to that offset, you find a new header (signature 01 00 01 00). The Inner Index: This new header has its own list of files. Because this block is treated as a standalone file by the game engine once loaded, its offsets start at 0 (relative to the start of that block), not relative to the start of the whole disc. [ MASTER CAT (BIGFILE) ] |-- Header |-- Index Entry 1: Offset 1000 -> Points to "Level 1 Block" |-- Index Entry 2: Offset 5000 -> Points to "Level 2 Block" | |... [Data at Offset 1000] ... | +-> [ NESTED CAT (Level 1) ] |-- Header (starts at Master Offset 1000) |-- Index Entry A: Offset 10 (Absolute: 1010) |-- Index Entry B: Offset 50 (Absolute: 1050) |-- Data... Why did developers do this? (The Logic) This approach was necessary due to the hardware limitations of the PlayStation 1 (PS1): RAM Constraints: The PS1 has only 2MB of RAM. It cannot keep a massive table of thousands of file offsets in memory at all times. Modular Loading: The game loads the "Master Index" to find the location of the current level's data. It then streams that specific "Block" (Nested CAT) into memory. Relative Addressing: Once the "Block" is loaded into a specific memory address, the game engine reads the inner offsets. Since these offsets are relative to the start of the block (0), the engine can easily calculate memory pointers without needing to know where the block was originally located on the CD.
    2 points
  5. 2 points
  6. I've just released a new version of ImageHeat 🙂 https://github.com/bartlomiejduda/ImageHeat/releases/tag/v0.39.1
     Changelog:
     - Added new Nintendo Switch unswizzle modes (2_16 and 4_16)
     - Added support for PSP_DXT1/PSP_DXT3/PSP_DXT5/BGR5A3 pixel formats
     - Fixed issue with unswizzling 4-bit GameCube/Wii textures
     - Added support for hex offsets (thanks to @MrIkso )
     - Moved image rendering logic to a new thread (thanks to @MrIkso )
     - Added Ukrainian language (thanks to @MrIkso )
     - Added support for LZ4 block decompression
     - Added Brazilian Portuguese language (thanks to @lobonintendista )
     - Fixed ALPHA_16X decoding
     - Adjusted GRAY4/GRAY8 naming
     - Added support section in readme file
    2 points
  7. The textures are compressed with ZSTD - it's just that type 0 means the whole file is not compressed. There doesn't seem to be any encryption once decompressed - it looks something like the ETC format:
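     A minimal sketch of that check, using the zstandard package (where the type value lives in the header is an assumption here):
     import zstandard

     def load_texture_payload(raw: bytes, comp_type: int) -> bytes:
         # comp_type would come from the texture header; 0 means stored as-is.
         if comp_type == 0:
             return raw
         # If the ZSTD frame doesn't embed its content size, pass max_output_size instead.
         return zstandard.ZstdDecompressor().decompress(raw)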
    2 points
  8. Thanks to some of the info from here, I made a tool for unpacking and packing the localize map files, if anyone is interested. https://github.com/dest1yo/wwm_utils
    2 points
  9. It's been a while since this topic was opened and I have found a way to deal with this:
     - Step 1: From the .farc files, use either the tool mentioned in the first post of this thread, or download QuickBMS and use the virtua_fighter_5 bms script I included in the zip file below to extract them into bin files.
     - Step 2: Download Noesis and install the noesis-project-diva plugin (https://github.com/h-kidd/noesis-project-diva/tree/main , or in the included zip file) in order to view and extract the textures/models and use them in Blender or a 3D modeling software of your choice.
     KancolleArcade.zip
    2 points
  10. The game has a debug mode, maybe this will help in some way.
    1 point
  11. Sometimes there's a 2nd UV channel. Do you mean that? (In another case one could also duplicate special vertices to get the same count as the UVs, but I forgot how to do that... iterate through the faces, somehow. edit: found it, answer from Daniol Dan: "thus each vertex has its own single UV coord".) Not sure whether this will apply to your problem, though.
    1 point
  12. Today I am gonna show you how to reverse engineer any binary 3D model. Turns out this is not that hard, and it's actually one of the coolest things in reverse engineering! (Uncompressed and unencrypted models, obviously.)
      +==== TUTORIAL SECTION ====+
      INTRODUCTION
      But how do all those models store their 3D data? Well, the answer is simple, there is no magic here: all 3D models are just made up of *vertices*, *faces*, *vertex UV coordinates* and *vertex normals*. They definitely must be somewhere in your file (this place is called a buffer) and there is absolutely no extra magic to it.
      This is how the vertices look:
      v 1.0 4.0 3.0   <= X, Y, Z coordinates (usually from 0.01 to 1000)
      v 2.0 3.0 4.0   <= Point values, so usually floats
      v 6.0 2.0 3.0   <= Usually stable; values don't vary too much between min and max
      This is how faces look:
      f 1 2 3         <= Takes the previous vertices and makes a triangle out of them
      This is how UV vertex coords look:
      vt 0.2 0.3      <= 2D coordinate of the first vertex (usually from 0.1 to 1.0)
      vt 0.5 0.2      <= Point values, so usually floats
      vt 0.3 0.1      <= Usually stable; values don't vary too much between min and max
      This is how vertex normals look [not that important, actually]:
      vn 0.745 0.845 0.360  <= X, Y, Z coordinates (usually from 0.01 to 1)
      vn 0.320 0.625 0.270  <= Point values, so usually floats
      vn 0.430 0.320 0.390  <= Usually stable; values don't vary much between min and max
      The result is a simple triangle that has its own UV map too. This is how the simplest 3D model format, OBJ, stores its data. We can say that all binary models store their 3D data the same way as OBJ; there is just one more thing to it. Binary formats have only two ways of storing their 3D data (aside from faces): a separate way and a structured (interleaved) way. Here is how it looks:
      Separate way:
      vertex_buffer = [
        v1 1.0 4.0 3.0   <= X, Y, Z coordinates (usually from 0.01 to 1000)
        v2 2.0 3.0 4.0   <= Point values, so usually floats, i.e. "v2 x, y, z"
        v3 6.0 2.0 3.0   <= Usually stable; values don't vary too much
        ...
      ]
      face_buffer = [
        f1 1 2 3         <= Takes the previous vertices and makes a triangle, i.e. "f1 v1, v2, v3"
        ...
      ]
      uv_coords_buffer = [
        vt1 0.2 0.3      <= 2D coordinate of the first vertex (usually from 0.1 to 1.0)
        vt2 0.5 0.2      <= Point values, so usually floats, i.e. "vt2 u, v"
        vt3 0.3 0.1      <= Usually stable; values don't vary too much
        ...
      ]
      vertex_normals_buffer = [
        vn1 0.745 0.845 0.360  <= X, Y, Z coordinates (usually from 0.01 to 1)
        vn2 0.320 0.625 0.270  <= Point values, so usually floats, i.e. "vn2 x, y, z"
        vn3 0.450 0.310 0.390  <= Usually stable; values don't vary much
        ...
      ]
      Structured way:
      buffer = [
        {v1 1.0 4.0 3.0, vt1 0.2 0.3, vn1 0.745 0.845 0.360}
        {v2 2.0 3.0 4.0, vt2 0.5 0.2, vn2 0.320 0.625 0.270}
        {v3 6.0 2.0 3.0, vt3 0.3 0.1, vn3 0.450 0.310 0.390}
        ...
      ]
      BINARY DATA
      The data in any file can be viewed as binary, no matter whether it was readable, unreadable or even empty before; viewing it in binary will immediately reveal everything. And while binary files are all just bytes, the way we read them changes everything! To view your binary file you must dump hex from it or load it into a hex viewer. Example file:
      Address   HEX bytes                                        ASCII
      0012BFC0  48 53 68 61 70 65 5F 31 37 00 00 00 00 00 01 00  HShape_17.......  <= First line contains ASCII strings
      0012BFD0  00 00 0A 00 00 00 22 00 00 10 00 00 00 00 0C 00  ......".........  <= Second line does not contain ASCII strings
      0012BFE0  00 00 61 32 76 2E 6F 62 6A 43 6F 6F 72 64 01 00  ..a2v.objCoord..  <= Third line contains ASCII strings
      0012BFF0  00 00 FF FF FF FF 02 00 00 00 47 04 00 00 82 56  ..........G....V  <= Fourth line contains an interesting "00 00 FF FF FF FF" buffer mark
      0012C000  F9 40 39 94 59 43 76 26 13 41 BB 61 FB 40 5A A4  .@9.YCv&.A.a.@Z.  <= Fifth line starts containing the actual float vertex coordinates, but looks random as ASCII
      0012C010  5B 43 95 B7 00 41 8F 70 CB 40 C1 4A 5B 43 31 08  [C...A.p.@.J[C1.  <= Sixth line: actual float vertex coordinates, still random-looking as ASCII
      0012C020  12 41 8A 8E C9 40 E7 5B 59 43 E8 82 1D 41 90 A0  .A...@.[YC...A..  <= Seventh line: more float vertex coordinates
      0012C030  62 40 21 90 58 43 05 DD 1C 41 BC B3 78 40 D7 63  b@!.XC...A..x@.c  <= Eighth line: more float vertex coordinates
      But what are those floats, shorts and ASCII? Bits are the smallest units of computer data; each is either 0 or 1. A byte, however, is 8 bits combined, which can actually start representing data: values ranging from 0 to 255, where 0 also counts (so 256 combinations, shown in hex as 00-FF). Here is one byte for example: 10110111 (bit weights 128 64 32 16 8 4 2 1). Combining bytes together we can make multiple data types. These are the common data types:
      Byte/Char         => 1 byte, unsigned/signed (8 bits)    | Example: 48 <= "H"
      Word/Short        => 2 bytes, unsigned/signed (16 bits)  | Example: 48 53 <= "HS"
      Dword/Int         => 4 bytes, unsigned/signed (32 bits)  | Example: 48 53 68 61 <= "HSha"
      ULONG32/Long      => 4 bytes, unsigned/signed (32 bits)  | Example: 48 53 68 61 <= "HSha"
      ULONG64/Long Long => 8 bytes, unsigned/signed (64 bits)  | Example: 48 53 68 61 70 65 5F 31 <= "HShape_1"
      float             => 4 bytes, for representing floating point values (32 bits)
      double            => 8 bytes, for representing more precise floating point values (64 bits)
      String            => a sequence/array of characters terminated by the null character | Example: 48 53 68 61 70 65 5F 31 37 00 <= "HShape_17"
      Big-endian vs little-endian: reading a 4-byte value in big-endian reads the bytes left-to-right, 48 53 68 61, whereas little-endian reads them in reverse order, right-to-left, 61 68 53 48. Big-endian was mainly used on the PS3, Xbox 360 and Wii, while little-endian is mainly used on Windows, PS4, Xbox One and Nintendo Switch.
      TRYING TO REVERSE THE BINARY 3D FORMAT
      But how do we actually apply this info to reverse engineering a binary 3D file format and even converting it into an OBJ model? Assuming you have an actual decompressed/uncompressed and decrypted/unencrypted binary 3D model file, you can visualize the 3D geometry while analyzing its hex in real time. Model Researcher Ultimate is a program that enables this.
      First, Level 1: start with a vertex count of 500, type float, and carefully try different offsets while printing the values and rendering them too, until you see a continuous, very stable output without insanely big or small values (from 0.001 to 1000).
      If nothing works, try a different endianness, then a different type (unlikely). If the mesh appears but random stray vertices appear too, that means the data is structured (interleaved) and you need to try a different padding, or sometimes even pad the integers.
      Second, Level 2: start with the UV coordinates, count = exactly as many as there are vertices, type float, and carefully try different offsets while printing the values and rendering them too, until you see a continuous, stable output without insanely big or small values (from 0.0001 to 1.0). If nothing works, try a different type, since you already know the endianness and structure.
      Third, Level 3: start with the faces. They are tightly linked to the vertices, so errors will constantly appear; carefully try different offsets while printing the values, but don't render them, as that will often just throw errors. You need to see whole integer values (no floating point) that are very stable, without extreme values. If nothing works, try a different type or even a different index format.
      Fourth, Level 4: [To be honest I didn't know what to write here; normals are pretty useless anyway. You can just flip and recalculate them very easily in programs like Blender in a few clicks, so it's not worth your brainstorming!]
      Practical steps. Here is what BAD data looks like (random, disoriented pattern; extremely small and extremely big values occur):
      v -0.0000 -0.0000 -184538016.0000
      v -0.0000 15.7924 -158665664.0000
      v -0.0000 90990377942005974930976407552.0000 -17551224.0000
      v -0.0000 -3386287.2500 -115467744.0000
      v -0.0000 15397417210601645679040601784320.0000 -22963316.0000
      v -0.0000 15397417210601645679040601784320.0000 -22963316.0000
      vt 0.0000 1785889664.0000
      vt 0.0000 140283808776479363868647227392.0000
      vt 0.0000 10997215558668704718782464.0000
      vt 0.0000 -516472.2188
      vt 0.0000 -0.0000
      vt 0.0000 0.0000
      f 57856 10240 3073
      f 3073 64769 57856
      f 31744 64768 3072
      f 57857 64768 58112
      f 57856 58112 58368
      f 58112 59136 58368
      Here is what GOOD data looks like (strong, continuous, repeating pattern; values are all very similar):
      v -0.0733 0.0012 1.6030
      v -0.0735 -0.0118 1.6023
      v -0.0776 -0.0146 1.5900
      v -0.0718 -0.0247 1.6005
      v -0.0784 0.0009 1.5913
      v -0.0784 0.0009 1.5913
      vt 0.0008 0.6221
      vt 0.0316 0.6229
      vt 0.0344 0.6543
      vt 0.0628 0.6246
      vt 0.0008 0.6539
      vt 0.9978 0.6533
      f 226 296 268
      f 268 253 226
      f 124 253 268
      f 226 253 227
      f 226 227 228
      f 227 231 228
      Changing the offset (most often), the endianness or the type will instantly give different results, including BAD data drastically turning into GOOD data, so keep that in mind and play with those offsets. There is just one small but very important step left: most of the time those binary files also store values like the vertex count (the UV and normal counts are always the same as the vertex count), the face count, a buffer mark and even the vertex stride (vertex stride = vertex padding + 12, UV stride = UV padding + 8). They are usually at the beginning of the mesh buffer, are pretty easy to find and are always laid out the same way. This time I recommend finding them with a dedicated hex viewer; my recommendation is ImHex, truly the open-source king in terms of ease of use.
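      To tie the tutorial together, here is a minimal Python sketch of reading the two buffer layouts described above; the offset, count, stride and endianness are exactly the unknowns you hunt for with the trial-and-error method:
      import struct

      def read_separate(data, vert_off, count, endian="<"):
          # Separate layout: a tight run of X, Y, Z floats, 12 bytes per vertex.
          return [struct.unpack_from(endian + "3f", data, vert_off + i * 12) for i in range(count)]

      def read_structured(data, vert_off, count, stride, endian="<"):
          # Structured (interleaved) layout: position plus UV (plus normals/padding) per record.
          # The UV is assumed to sit right after the 12 position bytes; adjust once the stride is known.
          verts, uvs = [], []
          for i in range(count):
              base = vert_off + i * stride
              verts.append(struct.unpack_from(endian + "3f", data, base))
              uvs.append(struct.unpack_from(endian + "2f", data, base + 12))
          return verts, uvs

      # Writing the result as OBJ is then just "v x y z" per vertex, "vt u v" per UV,
      # and "f a b c" per triangle (with 1-based indices).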
    1 point
  13. Hello! Out of boredom, I decided to try replacing a texture in a Godot game, in this case Slot or Not, using Python, but failing over and over again exhausted me. My script replaces the data and its MD5 hash and changes the data header. I suspect something doesn't work, or there is other data somewhere. I also think that even though both Godot's and PIL's WebP data are valid, their data processing is different and thus incompatible. Not even replacing them with PNG data works. I suspect there is a checksum for the PCK somewhere. I compared its PCK file with Super Mario 127's. Although the WebP data for CTEX and STEX is the same, the data pointers in SM127 are at the start of the file, after the header; in SoN they are at the bottom. The PCK V1 format is documented in the Xentax wiki, but not V3. It may be a small difference, but this comparison shouldn't be dismissed.
      What I need:
      - If there is a checksum for the PCK, where is it?
      - The bytes leading to the data header, so the Python script can adapt to them
      - If the WebP data is the issue, a better understanding of Godot's WebP data
      Feel free to reply if you find anything!
      Slot or Not (PCK File V3): son PCK File.zip
      Godot Picture Replacer Script (work in progress): Godot Pictures Replacer Script.py
      Note for the creator of Slot or Not: you don't have to turn it into a moddable game. The game needs its own resources to be a good game. I do this just because, and for research purposes.
      Knowledge
      GODOT PCK FILE FORMAT (Little Endian) (from Slot or Not)
      HEADER (112 bytes)
      4 bytes  = Signature ("GDPC")
      4 bytes  = 03 00 00 00 (Engine Version)
      4 bytes  = 04 00 00 00 (Major)
      4 bytes  = 05 00 00 00 (Minor)
      4 bytes  = 00 00 00 00 (Revision)
      4 bytes  = 02 00 00 00
      4 bytes  = 70 00 00 00 (First File Pointer)
      4 bytes  = 00 00 00 00
      4 bytes  = File Count
      4 x 19 bytes = 00s (Reserved)
      DATA HEADER (found at the last part of the file)
      4 bytes  = Length of Path, including the 00s
      X bytes  = Path Name + 00s (if not divisible by 4)
      8 bytes  = Data Pointer - 112
      8 bytes  = File Size
      16 bytes = Data MD5 Checksum
      4 bytes  = 00 00 00 00
      GODOT CTEX FILE FORMAT (Little Endian)
      For WebP format {
        4 bytes  = String "GTS2"
        4 bytes  = Use Alpha or Not (01 = yes; 00 = no)
        4 bytes  = Height
        4 bytes  = Width
        4 bytes  = 00 00 00 0D
        4 bytes  = FF FF FF FF
        4 x 3 bytes = 00s (reserved?)
        4 bytes  = 02 00 00 00
        2 bytes  = Height
        2 bytes  = Width
        4 bytes  = 00s
        4 bytes  = 05 00 00 00
        4 bytes  = Data Size after this byte
        4 bytes  = String "RIFF"
        4 bytes  = WebP Data Size after this byte
        8 bytes  = "WEBPVP8L" / tag for losslessly encoded image data
        4 bytes  = Data Size (or Data Size - 1 if the WebP chunk ends with 00)
        (rest of the file)
        X bytes    = WebP VP8L chunk (in Python/PIL, the save settings are lossless=True, no EXIF)
        0-15 bytes = 00 padding if the CTEX data size isn't divisible by 16
      }
      [NOTE: You can get the WebP data by removing the GTS2 header.]
      For PNG format {
        X bytes    = PNG file data
        0-15 bytes = 00 padding if the CTEX data size isn't divisible by 16
      }
      X bytes = Strings for [Remap], Settings, Path, Texture, Vram
      X bytes = 00s (if the length of the string isn't divisible by 16)
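      To make the header layout above concrete, here is a tentative Python reader for the 112-byte GDPC header; the field meanings follow the poster's notes, so treat them as unverified:
      import struct

      def read_gdpc_header(path):
          with open(path, "rb") as f:
              hdr = f.read(112)
          if hdr[0:4] != b"GDPC":
              raise ValueError("not a Godot PCK")
          (pack_version, ver_major, ver_minor, ver_rev,
           unk_flags, first_file_ptr, unk_zero, file_count) = struct.unpack_from("<8I", hdr, 4)
          # The remaining 76 bytes (4 x 19) are reserved zeros per the notes above.
          return {
              "pack_version": pack_version,        # 3 for Slot or Not
              "engine": (ver_major, ver_minor, ver_rev),
              "first_file_ptr": first_file_ptr,    # 0x70 = 112, right after this header
              "file_count": file_count,
          }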
    1 point
  14. The lvl file is essentially a zip/container file holding a bunch of .rbm/.rba (CAFF) files, and the .rbm/.rba files are basically containers with the data for any given asset. For example, models will include a model (header), scenegraph and textures, and may have additional things like hits/animation data packed inside too, if relevant to the model and/or not shared between a bunch of other models, in which case those are usually packed inside the zpackage rbm files.
    1 point
  15. Hmm, you know there's some hero here who got the trick for PS2? I couldn't bring myself to make use of his dll, but it's my firm decision to tackle this and I'll report the result as soon as I get it working for me. (If it works, there are several dozen PS2 projects I'd need to correct, and I fear the amount of work, somehow.) (My bet is that it has to do with changing the face winding, and I'd like to find that out by myself instead of using other people's dll.)
    1 point
  16. For PS2, level_00, chardata.db it's a matter of assembling sub meshes (and maybe a better face creation algo):
    1 point
  17. For the selected values in rectangles you may start at 0x208, 0x20A or 0x20C - none of the point clouds looks promising.
    1 point
  18. Ok, thanks. You rule that format (besides the anims). I'll check the files tomorrow. (If I can't help maybe someone else can, with all the files provided.) Good night. edt: well, being more the "simple analyzer" I focussed on the skeleton (21 bones?) in the dff file and the gar.rws.dec_be_-15_anim_27.rwanm: I think the 5th column here could be the frame time in msec with translation and rotation values to follow: address 0x1b6: 30486 29945 63488 29628 58856 65209 0 0 51393 54393 57171 30685 30720 63488 29628 58856 65209 0 0 32768 32768 59982 30502 59743 63488 29628 58856 65209 34953 15752 0 0 0 30720 63488 63488 29628 0 0 34953 15624 61841 59913 29289 27424 63488 30459 59522 24 0 34953 15624 22619 21982 18934 30704 58259 63488 29628 48 0 34953 15624 30430 60362 20288 17433 25922 63488 29628 72 0 34953 15624 30136 28721 52832 21347 59344 63488 29628 96 0 34953 15624 48489 62528 16374 29259 25922 63488 29628 120 0 34953 15624 60085 59294 26987 30152 58965 63488 29628 144 0 34953 15624 0 0 62248 29560 24551 63488 29628 168 0 34953 15624 15843 53973 27979 30346 23143 63488 29628 192 0 34953 15624 53430 29725 53792 29292 25922 63488 29628 216 0 34953 15624 26788 24635 22352 30525 58965 63488 29628 240 0 34953 15624 0 0 61788 29946 24551 63488 29628 264 0 34953 15752 22777 25316 22954 30638 23143 63488 29628 288 0 34953 15624 61008 30170 26220 address 0x316: 13303 63488 63488 28399 312 0 34953 15624 0 0 27758 30398 29945 63488 29628 336 0 34953 15624 7522 11482 59715 30545 30720 63488 29628 360 0 34953 15752 32768 32768 59982 30502 59743 63488 29628 384 0 34953 15624 61466 29974 59213 57464 63488 63488 30720 408 0 34953 15624 0 0 28454 30232 29945 63488 29628 432 0 34953 15624 51986 54325 58411 30640 30720 63488 29628 456 0 34953 15752 32768 32768 59982 30502 59743 63488 29628 480 0 34953 15752 59840 61645 28387 29176 63488 30184 63488 528 0 34953 15752 21331 23096 25805 30620 58259 63488 29628 552 0 34953 15752 30629 58637 21582 16964 25922 63488 29628 576 0 34953 15752 30156 28689 53876 51939 59344 63488 29628 600 0 34953 15752 47599 62461 15165 29336 25922 63488 29628 624 0 34953 15752 59092 59464 28203 30010 58965 63488 29628 648 0 34953 15752 0 0 62825 28859 24551 63488 29628 672 0 34953 15752 52057 53845 27997 30341 23143 63488 29628 696 0 34953 15752 55057 29609 55317 address 0x476: 29406 25922 63488 29628 720 0 34953 15752 56171 26016 20052 30608 58965 63488 29628 744 0 34953 15752 0 0 61930 29839 24551 63488 29628 768 0 34953 15752 29497 62179 58598 22637 63488 63488 28399 816 0 34953 15752 0 0 28323 30266 29945 63488 29628 840 0 34953 15752 7817 11470 59999 30499 30720 63488 29628 864 0 34953 15752 29655 61689 27187 25651 63488 63488 30720 912 0 34953 15752 0 0 29009 29954 29945 63488 29628 936 0 34953 15752 52130 54326 59350 30587 30720 63488 29628 960 0 29287 16404 44589 16658 52439 //49164 29287 16404 44589 16658 9 16448 21 0 1 2 I checked 40 blocks with a size of 22 bytes but none of the point clouds resembled an animation curve (although you can get some points in a line sometimes).
    1 point
  19. I am uploading all animations for the character gar here. I should have done that in the previous post but it slipped my mind. So now you have a complete set of animations for one character. gar.rws.dec_be_-15_extracted.rar
    1 point
  20. The armature is with the model. When you import the DFF using DragonFF in Blender it will have the armature with it. The bones are very small; you can use the side menu GUI to find them more easily. Models and skeletons are 100% workable; it is the custom RenderWare animation format that is the problem here. In the files I uploaded in the topic post there should be two rwanm files; those are some of the animations. If you have the game I can give you one of my scripts that will extract everything for you, so you can go through all the files for testing. I tried figuring this out through the elf file. Let me know if you want my script so you can poke at all the animations in the game. There are two different sets of animations per character even if they don't have a weapon. The animations specific to weapons would be much easier to look at since they are smaller. There are mot files that give some additional information, but for the most part they are irrelevant: mot files for this game are more of a listing of animations and positioning for static model pieces, not the main animations. I am uploading the mot file so you can see it is mostly just for static pieces, the txt file for the character abbreviated "gar" that lists all the files for him, and the gmobj database text for it as well. I used a RenderWare tool (forgot the name of it) that lets me look at how the files are structured; sending you a section_tree.txt that was made by that program. I was offered a job to make a modding tool for this game that would include editing animations... As it stands, I don't believe I will be able to finish it, and will just make everything I have for this game public if that happens. Adding two scripts as well. The gar_rws_test.py script does a simple test on the files for the correct compression and decompresses them, so you can get everything, basically. Garou_RWS_extractor is a slightly upgraded version that still needs work and is for extracting the models/animations. On aluigi's website there is a bms script called ougon_kishi_garo; that would be your first step to getting all the files out. Then you will have to use the scripts I posted here. The unfinished one will have errors; the test one will work better. gar_mot.rar gar_files_txt.rar gmobj_DB_Files_txt.rar Section_Tree_txt.rar gar_rws_test.py Garou_RWS_extractor.py
    1 point
  21. Drag and drop .md6mesh files onto the script and it will convert them into OBJ files. md6mesh.py md6mesh.zip
    1 point
  22. I fear there aren't too many who have dealt with PS2 animations in this forum, me included. Do you have skeleton data? I checked the dff mesh; it looks doable, but usually I don't like to work from scratch:
    1 point
  23. There is a lot of overlap; in fact the Kameo alpha uses the exact same CAFF version as the Conker demo. The lvl files are just an archive/container of sorts, while the assets themselves are packaged inside individual "CAFF" containers. For example, the file structure for the LVL container itself is fairly basic (for ImHex):
      struct LVL_ENTRY {
          u32 unk_00;
          u32* address : u32;
      };
      struct LVL_FILE_TABLE {
          u32 count;
          LVL_ENTRY array[count];
      };
      LVL_FILE_TABLE table @ 0x00;
      They're similar in the sense that Conker stores the assets inside a "ZPackage" CAFF container instead of using an external container (.LVL) to store the individual CAFF assets. Unlike earlier games like GBTG, Kameo also stores a lot of data inside pushbuffer commands, including things like the triangles/shaders/shaderparams. But as far as models go they're very similar (as that's what I've spent the most time on); not sure about the other assets, though.
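      The same table in a short Python sketch, just a direct translation of the ImHex pattern above (little-endian assumed):
      import struct

      def read_lvl_table(path):
          # u32 count at offset 0, followed by (unk, address) u32 pairs.
          with open(path, "rb") as f:
              (count,) = struct.unpack("<I", f.read(4))
              entries = []
              for _ in range(count):
                  unk, address = struct.unpack("<II", f.read(8))
                  entries.append({"unk": unk, "address": address})   # address of a CAFF blob
          return entries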
    1 point
  24. AnimeWwise just closes instantly if you try it on these. The regular Wwise unpacker works: https://github.com/mortalis13/Wwise-Unpacker BeyondToolsMod-net9.zip
    1 point
  25. Here you go: https://github.com/ExIfDev/Cal3d-Noesis/blob/main/fmt_cal3d.py I didn't bother adding support for animations and morphs for now, but if there's a need I'll add them.
    1 point
  26. If you've already got the pck files from BeyondTools, try dragging the pck files from the main folder onto this Python script. It only partially extracts the language voicelines, but it does extract almost all the music and sound-effect files. pck_decrypt.py
    1 point
  27. 1. FILE STRUCTURE (script.ptd)
      | Offset | Size            | Content                                    |
      |--------|-----------------|--------------------------------------------|
      | 0x00   | 32 bytes        | Header with signature "PETA" (50 45 54 41) |
      | 0x20   | 256 bytes       | SBOX (substitution box for decryption)     |
      | 0x120  | 1,728,224 bytes | Encrypted data                             |
      2. DECRYPTION & DECOMPRESSION PROCESS
      script.ptd → decryption → YKLZ/LZSS → binary script
      249 YKLZ/LZSS compressed sections found in the decrypted data; YKLZ/LZSS decompression works correctly.
      3. DECOMPRESSED SCRIPT ANALYSIS
      Header identified: 純ロマ = "JUNROMAN"
      Current issue: the Shift-JIS Japanese text is not found - it appears encrypted even after decompression.
      Example decompressed output: 純ロマ####@###シg##@###ト###4.##X)##イ*##4...
      4. SUSPECTED SCRIPT STRUCTURE??
      [Shift-JIS text] [Padding "####"] [Command "@" + 3 params] [More text]...
      Parsing logic???
      1. Parser detects @ (0x40) → command indicator?
      2. Reads next 3 bytes → command parameters?
      3. Processes Shift-JIS text (1-2 byte characters)?
      4. Skips padding # (0x23) → alignment bytes
      5. GAME EXECUTION FLOW (pcsx2 debugger)
      fcn.0010e048 (interpreter)
      ↓ fcn.0010ded0 (parser)
      ↓ fcn.00119fc0 / fcn.0011a0ec (handles dialogues?)
      ↓ fcn.00106800 (context configuration)
      ↓ fcn.001068d0 (text rendering)
      ↓ fcn.0016e400 / fcn.0016e4a8 (unknown final processing)
      6. CURRENT PROBLEM
      The text appears garbled/encrypted even after YKLZ decompression... An additional encryption layer after the LZSS decompression??
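      A minimal Python sketch that only slices the three regions listed in the table above; the SBOX decryption itself and the YKLZ/LZSS pass are not reproduced here:
      def split_script_ptd(path):
          data = open(path, "rb").read()
          header = data[0x00:0x20]      # 32-byte header, starts with "PETA"
          sbox = data[0x20:0x120]       # 256-byte substitution box
          payload = data[0x120:]        # encrypted body (1,728,224 bytes in script.ptd)
          if header[:4] != b"PETA":
              raise ValueError("unexpected signature")
          return header, sbox, payload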
    1 point
  28. *.abc is the font map. Maybe you can add some characters to it. Now for the texture: the original is 32-bit RGBA, yours is DXT5, which is not exactly the same as the original. I also noticed you didn't change the alpha channel of the character, which is crucial for correct display.
    1 point
  29. Introduction
      This question is probably the most asked one, and it makes total sense why. The answer, unfortunately, is pretty generic in its nature: it depends. But if we dive deeper, it turns out it's not as hard as you might think, and here is why I personally think this way... Reverse engineering a game, specifically for asset extraction, requires 4 different steps:
      1. Extract the game archive (reverse engineer the game's packing method, spot the compression method, decrypt XOR keys (rarely))
      2. Reverse engineer the binary 3D model files
      3. Reverse engineer the binary texture files
      4. Reverse engineer the binary audio files
      While these are not extremely hard topics to learn, it can take some time to figure them out yourself. There are numerous ways to approach these tasks: you can do it manually via binary inspection, by using exploits, or even by using leaked beta builds or re-released versions, which often ship with .PDB files (debug symbols) that can be loaded into Ghidra for a near-source-code debugging experience. While the best way is still binary inspection, there are already dedicated tools for inspecting and extracting manually, sample by sample, but at the moment there aren't any fully automated programs for this, so you must rely on Python scripts. For extracting game archives I recommend QuickBMS, for model extraction Model Researcher, for textures Raw Texture Cooker, and Audacity for audio... When extracting all of the game content, don't forget about the headers and magic numbers: no matter what the payload looks like, the headers are always the same and often contain super useful info.
      Graphic Debuggers vs Reverse Engineering
      This hot topic is the most interesting one. Yes, dumping 3D models and textures and recording the audio using graphic debuggers like RenderDoc, NVIDIA Nsight Graphics or the NinjaRipper exploit is obviously way, way easier than reverse engineering the proprietary files; it can be done in a few minutes versus a few days to months of reverse engineering, so the difference is sometimes huge. However, once you have reverse engineered the binary files you get extremely fast asset "ripping" without relying on draw calls, and of course you get access to all of the cut content and much easier and faster map/world "ripping". There are obviously upsides and downsides to both methods. I personally recommend using exactly what you need: if there are already scripts for extracting and maybe even converting some of the binary proprietary assets, then go for it!
    1 point
  30. Got a point cloud character and some normals, edit: got rid of them:
    1 point
  31. Skeleton deformations for the character creator is probably a more accurate term for Veilguard's "morph targets" (DAO/DA2 use straight-up morph targets, while I/VG use the skeleton to deform morphs with different bone positions). But I'm not a game dev. 😉
    1 point
  32. You could check the MakeH2O_log.txt. If you find a structure like 12 4 4 4 4 4 (for example), the last "4-byte block" might be alpha uvs (just a wild guess). edit: it's 16 8 8 4 4 here. Try using 82ea3, 4 for uvs. Looks promising.
    1 point
  33. The script has been updated and now outputs in Lua format whenever possible. format_hotfix_data.py
    1 point
  34. This file stores luac and dat data, so it cannot be processed using the unityfs split script. I wrote a new split script to experimentally disassemble the file content you provided and decompile the Lua files. If you want to decompile, please enable the -j parameter.
      Basic usage (no decompilation): python pkg.py input.patch output_dir
      With decompilation (slower): python pkg.py input.patch output_dir -j
      For decompilation, please download unluac separately. After compiling it, place the .jar file in the same directory as the script. Due to different compilation environments errors may occur, so you need to compile unluac yourself. pkg.py
    1 point
  35. I remember making a request on your GitHub about it. 👍 Somehow we were not able to see these textures in ImageHeat, only after extraction and decompression. Anyway, for the Switch textures it seems to be an issue, as h3x3r said above, and I confirm it too. In the attachment you'll find all the textures in UNIFORM.TEX (including jersey-color) from the Switch version, already decompressed. The stock texture file is in the Switch files in the first post (UNIFORM.TEX). In the screenshot below you see the parameters for the jersey-color texture. Maybe useful when you have time to check it, to help you fix ImageHeat. UNIFORM Switch decompressed.zip
    1 point
  36. Nobody's making fun of you. However, it would have been useful if you had mentioned not having a computer at the start, instead of having people waste their time on things you can't use. It also sounds like you were harassing another user in DMs for help, which you also need to stop doing. People will help if they want to, and if they have the time.
    1 point
  37. You need to decompress them first; only then can you succeed... Here's jersey-color after decompression, PS4 format. But there's a problem with the Switch format: ImageHeat doesn't support its swizzle format, but RawTex can handle it.
    1 point
  38. But anyway, seeing as I had some time, this QuickBMS script will decompress the files correctly. Still needs some manual manipulation in something like ImageHeat to show the correct image: rat_ps2_dps.zip
    1 point
  39. The game had an update and they hard-coded new text in the .mpk Lua scripts, because some words have many different meanings depending on the context. With packet sniffing, I observed that the game downloads some .pak files from easebar.com and puts them in the .mpk file. These files are encrypted.
    1 point
  40. Decided to extract some key frames from that .mot file I shared earlier. Wonder if there is any insight. The NaNs are interesting, though.
    1 point
  41. I used the file "tex_DeadSpaceMobile.py" from the GitHub link provided by Sleepyzay. Here is the link; Sleepyzay mentioned adding the script to the repository in a later post. When you have the file, just add it to the folder "noesisv4474\plugins\python" and you should be good to extract the textures after restarting Noesis or pressing "Reload Plugins" in the "Tools" menu on the toolbar.
    1 point
  42. Here is my analysis:
      Header: 24 bytes: [ Int64 EntryCount, Int64 ValueCount, Int32 Timestamp, Int32 Padding ]
      Buckets: [24-528] bytes, depending on the allocated buckets
      TableEntries: EntryCount * [ 8 bytes Hash (or id?), Int32 RelativeOffset (formula: text_start = current_entry_offset + 8 + value), Int32 TextLength ]
      Values: ValueCount * [ Byte[ValueLength] Data ]
      Null values have zero length and no hash. Successfully unpacked and packed; the game loads the new text normally.
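      A minimal Python sketch of walking that layout; the bucket block size varies (24-528 bytes per the analysis), so it is left as a parameter here:
      import struct

      def read_entries(data: bytes, bucket_size: int):
          entry_count, value_count, timestamp, _pad = struct.unpack_from("<qqii", data, 0)
          pos = 24 + bucket_size                    # table entries start after the buckets
          out = []
          for _ in range(entry_count):
              (hash_id,) = struct.unpack_from("<Q", data, pos)
              rel_off, text_len = struct.unpack_from("<ii", data, pos + 8)
              text_start = pos + 8 + rel_off        # formula from the analysis above
              text = data[text_start:text_start + text_len] if text_len else b""   # null value: zero length, no hash
              out.append((hash_id, text))
              pos += 16                             # 8 + 4 + 4 bytes per entry
          return out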
    1 point
  43. By the way, if you need the names of the audio files, put the contents of the attached script.zip in the AetherGazerLauncher\AetherGazer\AetherGazer_Data\StreamingAssets\Windows folder, then run process.py; it will rename every .ys audio file to its proper name.
    1 point
  44. Rename them to .awb files, then use the latest vgmstream; works well.
    1 point
  45. They are still pck files. I can find many Wwise .bnk files in AA462ABBFEC319B665666E14585F97D9_EndfieldBeta with Ravioli Explorer (RavioliGameTools_v2.10.zip, if you need it), and I think QuickBMS also works. By the way, I guess the real .wem audio files are in other pck files; there are over 5000 bnk files in AA462ABBFEC319B665666E14585F97D9_EndfieldBeta, which means the bnk files may not store any actual audio.
    1 point
  46. Please use this updated script to unpack the data files. If you have any questions, please let me know, so that you or other capable people can continue to process these .pxc files yourselves.
      # Updated decompression of pxc files (script 0.2)
      get FILE_SIZE asize
      xmath TOC_PTR "FILE_SIZE - 8"
      goto TOC_PTR
      get TOC_OFFSET long
      goto TOC_OFFSET
      get FILE_COUNT long
      for i = 0 < FILE_COUNT
          get OFFSET long
          get SIZE long
          get COMP_FLAG byte
          get NAME_LEN short
          getdstring NAME NAME_LEN
          get UNK long
          savepos TOC_ENTRY_POS
          if COMP_FLAG == 0
              goto OFFSET
              getdstring MAGIC 4
              if MAGIC == "PxZP"
                  comtype zlib
                  get UNCOMP_SIZE long
                  get COMP_SIZE long
                  savepos DATA_START
                  clog NAME DATA_START COMP_SIZE UNCOMP_SIZE
              else
                  log NAME OFFSET SIZE
              endif
          else
              goto OFFSET
              get MAGIC long
              get UNCOMP_SIZE long
              get COMP_SIZE long
              savepos COMP_START
              clog NAME COMP_START COMP_SIZE UNCOMP_SIZE
          endif
          goto TOC_ENTRY_POS
      next i
      pxc.zip
    1 point
  47. Use my plugin for Noesis, arc_zlib_plzp_lang_vfs.py (which I mentioned earlier). It recursively unpacks all the files; at the output you will get *.png, *.wav, *.pm3, *.vram, *.text, *.pvr, etc. You can also find a link to the plugin for 3D models (*.vram) above in the same topic. (*.pvr can be opened in PVRTexTool.)
    1 point