
Mike2023

Members
  • Posts: 20
  • Joined
  • Last visited
  • Days Won: 2

Mike2023 last won the day on April 22 2024

Mike2023 had the most liked content!


Mike2023's Achievements

Apprentice (3/14)

Recent Badges
  • Collaborator (Rare)
  • Reacting Well
  • One Month Later
  • First Post
  • Week One Done

Reputation: 17

  1. I found a few more plants in the other main folder. Here is the current list of offsets. binfbx_loader_source_june_2.zip
  2. I have now added the remaining offset. With this many files it is easy to lose track. 🙃 binfbx_loader_source_june.zip
  3. The vegetation is now complete. If you need offsets for specific files, just ask. Or should I upload a version of the script that has some offset tools in debug mode and writes the entries to the JSON automatically? Binfbx_Loader_Source_veg.zip
  4. I understand you; I don't do this every day either. I more or less stumbled in here and found it very interesting. Programming is just a hobby for me. If you like, I can give you the raw source of the script, but it is still a construction site, along the lines of: when one function works, another stops working ... 🤔 For the other users, I have a new version of the script with the JSON file, which now also supports the vegetation meshes, since they use something like a double block size for the mesh info table. In addition, the calculation of the block size is now automated. (A sketch of what such a JSON offset file could look like follows after this list.) Edit: Some new offsets have been added and the structure of the JSON has been revised. Also, without a config file such as the JSON, things look rather bad for an import script, since these binary files are created by a file manager that has to take all the properties of the individual data source into account. binfbx_loader_blender_dbs.zip
  5. It's really difficult to say when an import script that supports every file type will be released. Personally, I am still working on the final version, which does not require a JSON file and calculates the required offsets itself. The beta version is about 80-90 percent finished and now has almost 1000 lines of code. Even after 40-50 file checks you still find new file types. For example, the Flashlight file has a completely different structure for the second level than all the others. It's really difficult to write something like this because there are almost no clear patterns in these files.
  6. The characters are now complete and the van has been added to the vehicles. 😊 Edit: File deleted. Please see the latest post.
  7. This time I only have a few new settings and the offsets for you. 🙂 Edit: File deleted. Please see the latest post.
  8. Here is a new version which can now process files with staggered data blocks. It also contains some other functions that could later be used to automate the script. And of course, some new offsets for user bladers! 😊 Edit: File deleted. Please see the latest post.
  9. @bladers, not a problem, I have the entire FBI vehicle here for you. I'll look at the others right away. 🙂 @riverence, I found yet another new file type; please look at fbi_vehicle_steering_wheel, I had to set fixed offsets there. So really everything turns up in these files; there is no standard solution for your import script. 🤔 Edit: What I found out about these files is that the layout of the UVs, vertices and faces follows the LODs. The usual structure, e.g. for Lod5, would be UV -> Vertices -> Faces, but here it is UV: Lod5 -> Lod0, then Vertices: Lod5 -> Lod0, then Faces: Lod5 -> Lod0. I am currently writing logic for this structure (see the block-order sketch after this list). A script will follow! 🙂 Edit: File deleted. Please see the latest post.
  10. Do you mean the fbi_vehicle and the pickup_2 or the minibus? No problem, the fbi_vehicle consists of several separate meshes. I'll find the offsets for you. 🙂
  11. There are still a few errors in search script v3; some vertex sizes do not yet fit a few meshes. I have my standard loader here with some current offsets to test. Various meshes load without any major problems, but please pay attention to the settings in the script. Edit: File deleted. Please see the latest post.
  12. I redid the script logic; it took a little brain power... 🫡 The script more or less runs now, so you can leave it like that for the time being. Have fun with it! 😊 AW2_search_offsets_us_v3.zip
  13. The files are so complex that some UV sets for a LOD start in the middle of a data block. Nevertheless, I tried to build some automation into the script so that it supports different files, not only the characters. If there are no offsets in the JSON to calculate from, the script run stops with an error message. In that case, activate Lod0_offset and run the script again; then enter one of the offsets found under Lod0_offset and run the script once more to calculate the UV, vertex and face offsets. In parallel, I am also writing a script to correctly integrate the shaders from the material files; I have never had much to do with shaders, tiling and the like in the past! 😅 Here is the next version of the search script for testing... Edit: File deleted. Please see the latest post of the search script.
  14. The files are really very complex and sometimes even built in layers. That is why a dynamic offset calculation for the LODs is necessary, which I will probably pair with the byte-sequence search (a sketch of such a search appears after this list). But here is a new version of the script with layer support and other improvements. 🙂 Edit: File deleted. Please see the latest post of the search script.
  15. Thank you for the explanation and sorry for getting in touch so late. Here I have something new for you to test. The manual offsets are now stored in a JSON file. I also have to put a little more time into calculating the total LOD offsets of the files. Have a nice weekend, everyone! Edit: I have a revised version of the script here. Unfortunately it still doesn't work without a hex editor, but it is a great help in the search. @riverence For some meshes (e.g. those starting with npc_), your script must read the vertex data with a 20-byte stride (vertex size: 16) instead of 32 bytes; see the stride sketch after this list. Edit: File deleted. Please see the latest post of the search script.
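
Below are a few small sketches for the techniques mentioned in the posts above. They are illustrative only; every name, key and value in them is an assumption, not taken from the uploaded scripts.

Sketch for post 4: a possible shape for the per-mesh JSON offset table and a loader for it, assuming hypothetical keys such as lod0_offset, vertex_size and double_block_size.

import json

# Hypothetical layout of the offset table described in post 4: one entry per
# .binfbx file with the manually found offsets and a block-size hint.
# All key names and values here are assumptions.
EXAMPLE_CONFIG = {
    "fbi_vehicle": {
        "lod0_offset": 6720,         # start of the LOD0 mesh info block
        "vertex_size": 32,           # bytes per vertex record
        "double_block_size": False,  # vegetation meshes reportedly need a
                                     # doubled mesh info table block size
    }
}

def load_offsets(json_path):
    """Read the offset table from disk and return it as a plain dict."""
    with open(json_path, "r", encoding="utf-8") as f:
        return json.load(f)

def get_entry(config, mesh_name, default_vertex_size=32):
    """Look up one mesh; fall back to defaults when it has no entry yet."""
    entry = config.get(mesh_name, {})
    return {
        "lod0_offset": entry.get("lod0_offset"),
        "vertex_size": entry.get("vertex_size", default_vertex_size),
        "double_block_size": entry.get("double_block_size", False),
    }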
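
Sketch for post 9: reading blocks when they are grouped by type rather than by LOD, i.e. all UV blocks from Lod5 down to Lod0, then all vertex blocks, then all face blocks. The read_block callback and the LOD count are placeholders, not the real binfbx layout.

def read_lod_blocks(read_block, lod_count=6):
    """Collect per-LOD data when blocks are grouped by type, highest LOD first."""
    lods = {i: {} for i in range(lod_count)}
    for block_type in ("uv", "vertices", "faces"):
        # Lod5 -> Lod0 order within each block type, as described in post 9.
        for lod in range(lod_count - 1, -1, -1):
            lods[lod][block_type] = read_block(block_type, lod)
    return lods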
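
Sketch for post 14: a plain byte-sequence search that collects every offset at which a marker pattern occurs in a file. The marker bytes are purely illustrative; the real script would combine such hits with the LOD tables to derive offsets dynamically.

def find_pattern_offsets(path, pattern):
    """Return all byte offsets at which `pattern` occurs in the file."""
    with open(path, "rb") as f:
        data = f.read()
    offsets = []
    pos = data.find(pattern)
    while pos != -1:
        offsets.append(pos)
        pos = data.find(pattern, pos + 1)
    return offsets

# Hypothetical usage:
# hits = find_pattern_offsets("some_mesh.binfbx", b"\x01\x00\x00\x00\xff\xff")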
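
Sketch for post 15: reading vertex positions with different strides. Most meshes reportedly use a 32-byte vertex record, while npc_ meshes need a 20-byte stride with a 16-byte vertex; only the first three floats of each record are unpacked here and assumed to be the position.

import struct

def read_positions(data, offset, count, stride):
    """Unpack `count` (x, y, z) float triples spaced `stride` bytes apart."""
    positions = []
    for i in range(count):
        base = offset + i * stride
        x, y, z = struct.unpack_from("<3f", data, base)
        positions.append((x, y, z))
    return positions

# Hypothetical usage, with `data` holding the whole file as bytes:
# default_mesh = read_positions(data, vertex_offset, vertex_count, stride=32)
# npc_mesh     = read_positions(data, vertex_offset, vertex_count, stride=20)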