
Outfit7 Starlite engine (pre-2023) 3D models (My Talking Tom 2 / My Talking Angela 2, etc.)



Posted

I analyzed all the asset files again and here are my findings:

If a vertex has more than one joint ID associated with it, it always has joint weights too. So rigid transformations, where every vertex has at most one joint, don't use weights, which makes sense. The converse doesn't hold: a mesh where every vertex has only a single joint may still carry separate weight values. That also makes sense, since you might have a rigid non-animated part with a smooth transition into an animated part.
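That observed rule can be written down as a tiny consistency check (a sketch only; `weights_are_consistent` is a hypothetical helper I made up, not part of the file format or the importer):

```python
def weights_are_consistent(max_joints_per_vertex: int, has_weight_data: bool) -> bool:
    """Rule observed across the assets: more than one joint per vertex
    always implies weight data is present. A single joint per vertex
    can appear either with or without weights."""
    return has_weight_data or max_joints_per_vertex <= 1
```

The only combination the assets never show is multiple joints per vertex with no weight data.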

The maximum number of joints in a single mesh file across all the assets was 57. All the joint indices in the vertex data are divisible by 3. Since each index is stored in a byte and premultiplied by 3, the maximum number of joints you can reference in a single file is 255/3, or 85. 57 < 85, so it works out. (But why is the index multiplied by 3? Why??)
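For reference, recovering the actual joint index is just an integer division (a minimal sketch; why the format premultiplies by 3 is still unknown):

```python
def decode_joint_index(raw_index: int) -> int:
    """Convert a stored joint index byte back into an index into the
    mesh's joint table. Indices are stored premultiplied by 3, so one
    byte (max value 255) can only address 255 // 3 = 85 joints."""
    if raw_index % 3 != 0:
        raise ValueError(f'joint index {raw_index} is not divisible by 3')
    return raw_index // 3
```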

 

11 hours ago, scratchcat579 said:

the 20 float bytes in the header are actually bounds.

Awesome, thank you.

Posted

Because of my extractor I was able to extract some textures. The game mostly uses KTX Basis Universal (KTX 11) for textures, but it also has a weird one.

I tried using PvrTexTool to load the texture but was unsuccessful:

[attached screenshot: Screenshot2025-02-09180125.png]

I have attached some of these weird files.

Textures.zip

Posted
57 minutes ago, scratchcat579 said:

Because of my extractor I was able to extract some textures. The game mostly uses KTX Basis Universal (KTX 11) for textures, but it also has a weird one.

I tried using PvrTexTool to load the texture but was unsuccessful:

[attached screenshot: Screenshot2025-02-09180125.png]

I have attached some of these weird files.

Textures.zip

There are a lot of file names that end in _TEX_RGB_CRUNCH, so I suspect those use "crunch" compression. I looked into it at some point, but didn't get very far.

Here are all the different texture file name suffixes that I've seen:

_TEX_RGB: I think these are all KTX.
_TEX_C: (speculation, I haven't been able to decode these) looks like a KTX cube map, maybe.
_TEX_RGB_UC: UC = "uncompressed". Either a .png or a .jpg file. The .jpg ones often look like loading screens or interstitials.
_MASK01: RGBA .png file where each color channel has a different meaning. A: general texture visibility (0 = transparent, 1 = "use base color"). R: "secondary color". G: shadows. B: highlights.
_MASK02: RGB .png file with additional masks in different channels; these seem to depend on the application. Sometimes it's just a tertiary color, sometimes it looks like an emission or metalness mask.
_TEX_RGB_CRUNCH: these I have not been able to decode; I assume they use "crunch" compression.
_FX_TEX_A: (speculation, I haven't been able to decode these either) an alpha-only/BW texture for effects, particles, etc.

Anyway, a lot of props, furniture and such use those _CRUNCH textures, so even though I have managed to decode the meshes, they remain untextured.
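For anyone sorting through extracted files, the suffix list above can be turned into a small classifier (just a sketch; the descriptions mirror my guesses above, checked most specific suffix first):

```python
import os

# observed file name suffixes, most specific first, with my best guess for each
SUFFIX_GUESSES = [
    ('_TEX_RGB_CRUNCH', 'crunch-compressed texture (undecoded)'),
    ('_TEX_RGB_UC', 'uncompressed .png or .jpg'),
    ('_TEX_RGB', 'KTX Basis Universal texture'),
    ('_TEX_C', 'possible KTX cube map (speculation)'),
    ('_FX_TEX_A', 'alpha-only effect/particle texture (speculation)'),
    ('_MASK01', 'RGBA mask (A visibility, R secondary color, G shadows, B highlights)'),
    ('_MASK02', 'RGB mask (application-specific channels)'),
]

def classify_texture(filename: str) -> str:
    # strip any extension so 'foo_TEX_RGB_UC.jpg' still matches
    stem = os.path.splitext(os.path.basename(filename))[0]
    for suffix, guess in SUFFIX_GUESSES:
        if stem.endswith(suffix):
            return guess
    return 'unknown'
```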

 

Posted
import bmesh
import bpy
import os
import os.path
import struct
from enum import Enum
from mathutils import Vector, Matrix
from typing import TypedDict, Optional

WRAPPER_HEADER_LENGTH = 16
WRAPPER_FOOTER_LENGTH = 28
WRAPPER_LENGTH = WRAPPER_HEADER_LENGTH + WRAPPER_FOOTER_LENGTH
MESH_HEADER_LENGTH = 256
WRAPPER_AND_MESH_HEADER_LENGTH = WRAPPER_LENGTH + MESH_HEADER_LENGTH

SHORT_DIVISOR = 32767.0


class VertexData(Enum):
    NOT_PRESENT = 0xffff
    FLOAT2 = 0x0021
    FLOAT3 = 0x0022
    FLOAT4 = 0x0023
    BYTE4 = 0x0083
    USHORT2 = 0x0103
    SBYTE3_PAD = 0x0183
    SHORT2 = 0x0191
    SHORT3 = 0x0192
    SHORT4 = 0x0193


class Mesh(TypedDict):
    positions: list[Vector]
    normals: list[Vector]
    tangents: list[Vector]
    vertex_color0: list[Vector]
    vertex_color1: list[Vector]
    vertex_color2: list[Vector]
    vertex_color3: list[Vector]
    weight_indices: list[tuple[bytes,bytes,bytes,bytes]]
    weight_weights: list[Vector]
    uv0: list[Vector]
    uv1: list[Vector]
    uv2: list[Vector]
    uv3: list[Vector]
    faces: list[tuple[int, int, int]]
    joint_ids: list[int]
    unknown: list[list[Vector]]


class VertexDataPresent(TypedDict):
    position: VertexData
    normal: VertexData
    tangent: VertexData
    unkn0: VertexData
    vertex_color0: VertexData
    vertex_color1: VertexData
    vertex_color2: VertexData
    vertex_color3: VertexData
    weight_indices: VertexData
    weight_weights: VertexData
    uv0: VertexData
    uv1: VertexData
    uv2: VertexData
    uv3: VertexData
    unkn1: VertexData
    unkn2: VertexData
    unkn3: VertexData
    unkn4: VertexData


class MeshHeader(TypedDict):
    start_of_bind_poses: int
    num_bind_poses: int
    start_of_after_matrix: int
    num_after_matrix: int
    start_of_joint_ids: int
    num_joint_ids: int

    vertex_data_size: int
    num_blend_shapes: int
    face_start_offset: int
    num_face_entries: int
    vertex_stride: int

    vertex_data: VertexDataPresent


size_of_vertex_data = {
    VertexData.NOT_PRESENT: 0,
    VertexData.FLOAT2: 8,
    VertexData.FLOAT3: 12,
    VertexData.FLOAT4: 16,
    VertexData.BYTE4: 4,
    VertexData.USHORT2: 4,
    VertexData.SBYTE3_PAD: 4,
    VertexData.SHORT2: 4,
    VertexData.SHORT3: 6,
    VertexData.SHORT4: 8,
}

vertex_data_to_attribute_type = {
    VertexData.NOT_PRESENT: None,
    VertexData.FLOAT2: 'FLOAT2',
    VertexData.FLOAT3: 'FLOAT_VECTOR',
    VertexData.FLOAT4: 'QUATERNION',
    VertexData.BYTE4: 'BYTE_COLOR',
    VertexData.USHORT2: 'FLOAT2',
    VertexData.SBYTE3_PAD: 'FLOAT_VECTOR',
    VertexData.SHORT2: 'FLOAT2',
    VertexData.SHORT3: 'FLOAT_VECTOR',
    VertexData.SHORT4: 'QUATERNION',
}


def read_mesh_header(data: bytes) -> MeshHeader:
    try:
        header_data: MeshHeader = {'start_of_bind_poses': struct.unpack_from('<I', data, 64)[0],
                                   'num_bind_poses': struct.unpack_from('<I', data, 72)[0],
                                   'start_of_after_matrix': struct.unpack_from('<I', data, 80)[0],
                                   'num_after_matrix': struct.unpack_from('<I', data, 88)[0],
                                   'start_of_joint_ids': struct.unpack_from('<I', data, 96)[0],
                                   'num_joint_ids': struct.unpack_from('<I', data, 104)[0],

                                   'vertex_data_size': struct.unpack_from('<I', data, 120)[0],
                                   'num_blend_shapes': struct.unpack_from('<I', data, 136)[0],
                                   'face_start_offset': struct.unpack_from('<I', data, 144)[0],
                                   'num_face_entries': struct.unpack_from('<I', data, 152)[0],
                                   'vertex_stride': struct.unpack_from('<I', data, 180)[0],

                                   'vertex_data': {'position': VertexData(struct.unpack_from('<H', data, 218)[0]),
                                                   'normal': VertexData(struct.unpack_from('<H', data, 220)[0]),
                                                   'tangent': VertexData(struct.unpack_from('<H', data, 222)[0]),
                                                   'unkn0': VertexData(struct.unpack_from('<H', data, 224)[0]),
                                                   'vertex_color0': VertexData(struct.unpack_from('<H', data, 226)[0]),
                                                   'vertex_color1': VertexData(struct.unpack_from('<H', data, 228)[0]),
                                                   'vertex_color2': VertexData(struct.unpack_from('<H', data, 230)[0]),
                                                   'vertex_color3': VertexData(struct.unpack_from('<H', data, 232)[0]),
                                                   'weight_indices': VertexData(struct.unpack_from('<H', data, 234)[0]),
                                                   'weight_weights': VertexData(struct.unpack_from('<H', data, 236)[0]),
                                                   'uv0': VertexData(struct.unpack_from('<H', data, 238)[0]),
                                                   'uv1': VertexData(struct.unpack_from('<H', data, 240)[0]),
                                                   'uv2': VertexData(struct.unpack_from('<H', data, 242)[0]),
                                                   'uv3': VertexData(struct.unpack_from('<H', data, 244)[0]),
                                                   'unkn1': VertexData(struct.unpack_from('<H', data, 246)[0]),
                                                   'unkn2': VertexData(struct.unpack_from('<H', data, 248)[0]),
                                                   'unkn3': VertexData(struct.unpack_from('<H', data, 250)[0]),
                                                   'unkn4': VertexData(struct.unpack_from('<H', data, 252)[0]), }}

    except ValueError as ve:
        raise RuntimeError('Unable to parse header data') from ve

    failed_sanity_checks = []
    if header_data['vertex_data_size'] > len(data):
        failed_sanity_checks.append('vertex data size ({0}) too big '.format(header_data['vertex_data_size']))
    if header_data['face_start_offset'] > len(data):
        failed_sanity_checks.append(
            'face start offset ({0}) is past the end of file'.format(header_data['face_start_offset']))
    if header_data['num_face_entries'] % 3 != 0:
        failed_sanity_checks.append(
            'number of face entries ({0}) is not divisible by 3'.format(header_data['num_face_entries']))
    if header_data['face_start_offset'] + header_data['num_face_entries'] * 2 > len(data):
        failed_sanity_checks.append('number of faces ({0}) would run past the end of file at offset ({1})'.format(
            header_data['num_face_entries'], header_data['face_start_offset']))
    if header_data['vertex_stride'] > header_data['vertex_data_size']:
        failed_sanity_checks.append(
            'vertex stride ({0}) is bigger than vertex data size ({1})'.format(header_data['vertex_stride'],
                                                                               header_data['vertex_data_size']))
    if header_data['vertex_stride'] == 0 or header_data['vertex_data_size'] % header_data['vertex_stride'] != 0:
        failed_sanity_checks.append(
            'vertex data size ({1}) is not evenly divisible by vertex stride ({0})'.format(
                header_data['vertex_stride'], header_data['vertex_data_size']))
    # 288 = 18 fields, each at most a 4-element float vector: 18 * 4 * 4 bytes
    if header_data['vertex_stride'] > 288:
        failed_sanity_checks.append(
            'vertex stride ({0}) is suspiciously big (>288)'.format(header_data['vertex_stride']))

    # the stride should equal the sum of the sizes of all present vertex attributes
    calculated_vertex_data_size = sum(
        size_of_vertex_data[vtype] for vtype in header_data['vertex_data'].values())

    if calculated_vertex_data_size != header_data['vertex_stride']:
        failed_sanity_checks.append(
            'vertex stride ({0}) does not match calculated vertex size ({1})'.format(header_data['vertex_stride'],
                                                                                     calculated_vertex_data_size))

    if len(failed_sanity_checks):
        raise RuntimeError(', '.join(failed_sanity_checks))

    return header_data


def read_vertex_position(data: bytes, offset: int, vtype: VertexData) -> tuple[Optional[Vector], int]:
    if vtype == VertexData.NOT_PRESENT:
        return None, offset
    if vtype == VertexData.FLOAT3:
        return Vector(struct.unpack_from('<fff', data, offset)), offset + 12
    if vtype == VertexData.SHORT3:
        elems = struct.unpack_from('<hhh', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR, elems[2] / SHORT_DIVISOR)), offset + 6
    if vtype == VertexData.SHORT4:
        elems = struct.unpack_from('<hhhh', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR, elems[2] / SHORT_DIVISOR,
                       elems[3] / SHORT_DIVISOR)), offset + 8
    if vtype == VertexData.BYTE4:
        elems = struct.unpack_from('<HH', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR, 0)), offset + 4
    raise RuntimeError("Unknown position data type " + str(vtype))


def read_vertex_data(data: bytes, offset: int, vtype: VertexData) -> tuple[Optional[Vector], int]:
    if vtype == VertexData.NOT_PRESENT:
        return None, offset
    if vtype == VertexData.FLOAT2:
        return Vector(struct.unpack_from('<ff', data, offset)), offset + 8
    if vtype == VertexData.FLOAT3:
        return Vector(struct.unpack_from('<fff', data, offset)), offset + 12
    if vtype == VertexData.FLOAT4:
        return Vector(struct.unpack_from('<ffff', data, offset)), offset + 16
    if vtype == VertexData.BYTE4:
        elems = struct.unpack_from('<BBBB', data, offset)  # 4 unsigned bytes
        return Vector((elems[0] / 255.0, elems[1] / 255.0,
                       elems[2] / 255.0, elems[3] / 255.0)), offset + 4
    if vtype == VertexData.USHORT2:
        elems = struct.unpack_from('<HH', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR)), offset + 4
    if vtype == VertexData.SBYTE3_PAD:
        elems = struct.unpack_from('<bbbb', data, offset)  # 4th element discarded on purpose
        return Vector((elems[0] / 127.0, elems[1] / 127.0, elems[2] / 127.0)), offset + 4
    if vtype == VertexData.SHORT2:
        elems = struct.unpack_from('<hh', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR)), offset + 4
    if vtype == VertexData.SHORT3:
        elems = struct.unpack_from('<hhh', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR, elems[2] / SHORT_DIVISOR)), offset + 6
    if vtype == VertexData.SHORT4:
        elems = struct.unpack_from('<hhhh', data, offset)
        return Vector((elems[0] / SHORT_DIVISOR, elems[1] / SHORT_DIVISOR, elems[2] / SHORT_DIVISOR,
                       elems[3] / SHORT_DIVISOR)), offset + 8
    raise RuntimeError("Unknown data type " + str(vtype))


def read_weight_index(data: bytes, offset: int, vtype: VertexData) -> tuple[Optional[tuple[bytes, ...]], int]:
    if vtype == VertexData.NOT_PRESENT:
        return None, offset
    if vtype == VertexData.USHORT2:
        return struct.unpack_from('<cccc', data, offset), offset + 4
    raise RuntimeError("Unknown weight index data type " + str(vtype))


def read_mesh_data(header: MeshHeader, data: bytes) -> Mesh:
    mesh_data: Mesh = {'positions': [], 'normals': [], 'tangents': [], 'vertex_color0': [], 'vertex_color1': [],
                       'vertex_color2': [], 'vertex_color3': [], 'weight_indices': [], 'weight_weights': [], 'uv0': [],
                       'uv1': [], 'uv2': [], 'uv3': [], 'joint_ids': [], 'unknown': [[], [], [], [], []], 'faces': []}
    current_offset = 256
    vertex_data_end = current_offset + header['vertex_data_size']

    while current_offset < vertex_data_end:
        position, current_offset = read_vertex_position(data, current_offset, header['vertex_data']['position'])
        if position is not None:
            position.resize_3d()
            position = position.xzy  # convert left-handed y-up to right-handed z-up
            mesh_data['positions'].append(position)

        normal, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['normal'])
        if normal is not None:
            normal.resize_3d()
            normal = normal.xzy  # convert left-handed y-up to right-handed z-up
            normal.normalize()
            mesh_data['normals'].append(normal)

        tangent, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['tangent'])
        if tangent is not None:
            tangent.resize_3d()
            tangent = tangent.xzy  # convert left-handed y-up to right-handed z-up
            tangent.normalize()
            mesh_data['tangents'].append(tangent)

        unknown, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['unkn0'])
        if unknown is not None:
            mesh_data['unknown'][0].append(unknown)

        for i in range(4):
            vertex_color, current_offset = read_vertex_data(data, current_offset,
                                                            header['vertex_data']['vertex_color' + str(i)])
            if vertex_color is not None:
                mesh_data['vertex_color' + str(i)].append(vertex_color)

        weight_index, current_offset = read_weight_index(data, current_offset, header['vertex_data']['weight_indices'])
        if weight_index is not None:
            mesh_data['weight_indices'].append(weight_index)

        weight_weight, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['weight_weights'])
        if weight_weight is not None:
            mesh_data['weight_weights'].append(weight_weight)

        for i in range(4):
            uv, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['uv' + str(i)])
            if uv is not None:
                uv.resize_2d()
                mesh_data['uv' + str(i)].append(uv)

        for i in range(1, 5):
            unknown, current_offset = read_vertex_data(data, current_offset, header['vertex_data']['unkn' + str(i)])
            if unknown is not None:
                mesh_data['unknown'][i].append(unknown)

    current_offset = header['face_start_offset']
    face_data_end = header['face_start_offset'] + header['num_face_entries'] * 2
    while current_offset < face_data_end:
        face: tuple[int, int, int] = struct.unpack_from('<HHH', data, current_offset)
        mesh_data['faces'].append((face[0], face[2], face[1]))  # as part of handedness change, need to flip faces
        current_offset += 6

    current_offset = header['start_of_joint_ids']
    joint_id_end = header['start_of_joint_ids'] + header['num_joint_ids'] * 8

    while current_offset < joint_id_end:
        mesh_data['joint_ids'].append(struct.unpack_from('>Q', data, current_offset)[0])
        current_offset += 8

    return mesh_data


def read_matrix_data(mesh_header, data: bytes) -> list[Matrix]:
    bind_poses = []
    for i in range(mesh_header['num_bind_poses']):
        offset = mesh_header['start_of_bind_poses'] + i * 64
        row0 = struct.unpack_from('<ffff', data, offset)
        row1 = struct.unpack_from('<ffff', data, offset + 16)
        row2 = struct.unpack_from('<ffff', data, offset + 32)
        row3 = struct.unpack_from('<ffff', data, offset + 48)
        bind_poses.append(Matrix((row0, row1, row2, row3)))

    return bind_poses


def read_joint_ids(mesh_header, data: bytes) -> list[int]:
    joint_ids = []
    for i in range(mesh_header['num_joint_ids']):
        offset = mesh_header['start_of_joint_ids'] + i * 8
        joint_id = struct.unpack_from('>Q', data, offset)[0]
        joint_ids.append(joint_id)

    return joint_ids


def read_model_data(path: str) -> bytes:
    if os.path.getsize(path) < WRAPPER_AND_MESH_HEADER_LENGTH:
        raise RuntimeError('file {0} is too small'.format(path))

    with open(path, 'rb') as in_file:
        input_data = in_file.read()

    meta_header = struct.unpack_from('<IIII', input_data)
    if meta_header[0] > len(input_data):
        raise RuntimeError('size in file "{0}" header ({1}) is too big'.format(path, meta_header[0]))

    return input_data


seen_vertex_data_types = dict()
max_num_weight_index = 0


def reset_seen_data():
    global seen_vertex_data_types, max_num_weight_index
    seen_vertex_data_types = {'position': dict(), 'normal': dict(), 'tangent': dict(), 'unkn0': dict(),
                              'vertex_color0': dict(), 'vertex_color1': dict(), 'vertex_color2': dict(),
                              'vertex_color3': dict(), 'weight_indices': dict(), 'weight_weights': dict(),
                              'uv0': dict(), 'uv1': dict(), 'uv2': dict(), 'uv3': dict(), 'unkn1': dict(),
                              'unkn2': dict(), 'unkn3': dict(), 'unkn4': dict(), }
    max_num_weight_index = 0


def analyze_single(path: str):
    global max_num_weight_index
    try:
        input_data = read_model_data(path)
        mesh_header = read_mesh_header(input_data[16:])
        for v_data_name in mesh_header['vertex_data']:
            v_data_type = mesh_header['vertex_data'][v_data_name]
            if v_data_type != VertexData.NOT_PRESENT:
                seen_vertex_data_types[v_data_name][v_data_type] = seen_vertex_data_types[v_data_name].get(v_data_type,
                                                                                                           0) + 1

        if mesh_header['vertex_data']['position'] == VertexData.BYTE4:
            print(f"{path}: vertex color as position format")

        num_bind_poses = mesh_header['num_bind_poses']
        if num_bind_poses > max_num_weight_index:
            max_num_weight_index = num_bind_poses

        if num_bind_poses > 255/3:
            print(f"{path}: number of bind poses is > 255/3")

        mesh_data = read_mesh_data(mesh_header, input_data[16:])

        max_num_non_zero_weights = 0

        for i, w_index in enumerate(mesh_data['weight_indices']):
            num_non_zero_weights = 0
            for j in w_index:
                ji = int.from_bytes(j, 'little', signed=False)
                if ji % 3 != 0:
                    print(f"{path}: weight index not divisible by 3 mesh_data['weight_indices'][{i}][{ji}]")
                if ji > 0:
                    num_non_zero_weights += 1

            if num_non_zero_weights > max_num_non_zero_weights:
                max_num_non_zero_weights = num_non_zero_weights

        if max_num_non_zero_weights > 1:
            if mesh_header["vertex_data"]["weight_weights"] == VertexData.NOT_PRESENT:
                print(f"{path}: have more than 1 non-zero weight and weight data not present")

        if max_num_non_zero_weights == 1:
            if mesh_header["vertex_data"]["weight_weights"] != VertexData.NOT_PRESENT:
                print(f"{path}: have exactly 1 non-zero weight and weight is present")


    except RuntimeError:
        pass  # not a mesh file or failed sanity checks, skip it


def read_dir(path: str):
    reset_seen_data()
    for (dirpath, dirnames, filenames) in os.walk(path):
        for name in filenames:
            analyze_single(os.path.join(dirpath, name))
    print(f'seen vertex data types: {seen_vertex_data_types}')
    print(f'max seen weight index: {max_num_weight_index}, (max possible, 255/3: {255/3})')


def process_single(modelPath: str, skeletonPath: Optional[str] = None):
    input_data = read_model_data(modelPath)

    mesh_header = read_mesh_header(input_data[16:])

    mesh_data = read_mesh_data(mesh_header, input_data[16:])

    joint_ids = [0]
    joint_names = ['root']
    if skeletonPath:
        with open(skeletonPath, 'rb') as skeleton_file:
            skeleton_data = skeleton_file.read()

            num_bones = skeleton_data[9]
            start_of_joint_ids = 25

            start_of_skeleton = (num_bones - 1) * 8 + start_of_joint_ids

            for i in range(num_bones - 1):
                offset = start_of_joint_ids + i * 8
                joint_ids.append(struct.unpack_from('>Q', skeleton_data, offset)[0])

            ozz_skeleton = skeleton_data[start_of_skeleton:]
            chars_count = int.from_bytes(ozz_skeleton[21:25], 'little')
            joint_names = [name.decode('utf8') for name in ozz_skeleton[25:25 + chars_count].split(b'\0') if name]

    basename = os.path.basename(modelPath)

    mesh = bpy.data.meshes.new(basename + "-mesh")  # add the new mesh
    obj = bpy.data.objects.new(mesh.name, mesh)
    col = bpy.data.collections["Collection"]
    col.objects.link(obj)
    bpy.context.view_layer.objects.active = obj
    edges = []

    mesh.from_pydata(mesh_data['positions'], edges, mesh_data['faces'])

    mesh.normals_split_custom_set_from_vertices(mesh_data['normals'])

    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.mode_set(mode='EDIT')

    bm = bmesh.from_edit_mesh(mesh)


    id_to_name = dict()

    for joint_id in mesh_data['joint_ids']:
        i = joint_ids.index(joint_id)
        if i > 0:
            group_name = joint_names[i]
            id_to_name[joint_id] = group_name

            obj.vertex_groups.new(name=group_name)



    for uv in range(4):
        uv_name = 'uv' + str(uv)
        uv_indices = mesh_data[uv_name]
        if len(uv_indices):
            # use a distinct bmesh UV layer per channel; verify() would always
            # return the same (first) layer and overwrite it on every pass
            uv_layer = bm.loops.layers.uv.get(uv_name) or bm.loops.layers.uv.new(uv_name)
            for face in bm.faces:
                for loop in face.loops:
                    loop[uv_layer].uv = uv_indices[loop.vert.index]

    bmesh.update_edit_mesh(mesh)

    if len(mesh_data['uv0']):
        mesh.uv_layers['uv0'].active = True

    bpy.ops.object.mode_set(mode='OBJECT')

    if mesh_header['vertex_data']['weight_indices'] != VertexData.NOT_PRESENT:
        if mesh_header['vertex_data']['weight_weights'] == VertexData.NOT_PRESENT: #rigid transforms, just 1 group per vertex
            group_to_vertex_index = dict()
            for group_name in id_to_name.values():
                group_to_vertex_index[group_name] = []

            for i,indices in enumerate(mesh_data['weight_indices']):
                index = int.from_bytes(indices[0], 'little') // 3
                joint_id = mesh_data['joint_ids'][index]
                group_name = id_to_name[joint_id]
                group_to_vertex_index[group_name].append(i)

            for group_name, vertices in group_to_vertex_index.items():
                obj.vertex_groups[group_name].add(vertices, 1, 'REPLACE')
        else:
            group_to_weight = dict()
            for group_name in id_to_name.values():
                group_to_weight[group_name] = dict()

            for i,(indices,weights) in enumerate(zip(mesh_data['weight_indices'], mesh_data['weight_weights'])):
                for j in range(4):
                    weight = weights[j]
                    if weight < 1e-6:
                        continue

                    index = int.from_bytes(indices[j], 'little') // 3

                    joint_id = mesh_data['joint_ids'][index]
                    group_name = id_to_name[joint_id]
                    vertex_by_weight = group_to_weight[group_name].get(weight, list())
                    vertex_by_weight.append(i)
                    group_to_weight[group_name][weight] = vertex_by_weight

            for group_name, weights in group_to_weight.items():
                for weight, vertices in weights.items():
                    obj.vertex_groups[group_name].add(vertices, weight, 'REPLACE')



    for i in range(4):
        vertex_color_i = 'vertex_color' + str(i)
        color_data = mesh_data[vertex_color_i]
        if len(color_data):
            v_data_type = mesh_header['vertex_data'][vertex_color_i]
            d_type = vertex_data_to_attribute_type[v_data_type]

            attribute = mesh.attributes.new(name=vertex_color_i, type=d_type, domain='POINT')

            for j in range(len(mesh.vertices)):
                if d_type == 'BYTE_COLOR':
                    attribute.data[j].color = color_data[j]
                else:
                    raise RuntimeError(f'Unknown vertex color type: {d_type}')


    for i in range(5):
        unknown_data = mesh_data['unknown'][i]
        if len(unknown_data):
            v_data_type = mesh_header['vertex_data']['unkn' + str(i)]
            d_type = vertex_data_to_attribute_type[v_data_type]

            attribute = mesh.attributes.new(name='unknown' + str(i), type=d_type, domain='POINT')

            for j in range(len(mesh.vertices)):
                if d_type == 'QUATERNION':
                    attribute.data[j].value = unknown_data[j]
                elif d_type == 'BYTE_COLOR':
                    attribute.data[j].color = unknown_data[j]
                else:
                    attribute.data[j].vector = unknown_data[j]


def dump_bind_poses(path: str):
    with open(path, 'rb') as in_file:
        input_data = in_file.read()

    meta_header = struct.unpack_from('<IIII', input_data)
    if meta_header[0] > len(input_data):
        raise RuntimeError('size in file "{0}" header ({1}) is too big'.format(path, meta_header[0]))

    mesh_header = read_mesh_header(input_data[16:])

    matrix_data = read_matrix_data(mesh_header, input_data[16:])
    joint_ids = read_joint_ids(mesh_header, input_data[16:])

    for i, (joint_id, matrix) in enumerate(zip(joint_ids, matrix_data)):
        print(f'''matrix {i},({joint_id:016x}):
{matrix}
decomposed:
{matrix.decompose()}
euler:
{matrix.decompose()[1].to_euler()}

''')
#usage:
process_single(r'some/path/mesh_file.dat', r'other/path/skeleton_file.dat')

Here is the latest version of the mesh importer. It also has a bunch of my analysis code in it.

If you give the "process_single" function only a single file, it will import just the mesh. If you give it two files, the second one is a skeleton: the script reads the joint names from it and assigns them as vertex groups in the mesh. Those vertex groups are then used for skeleton deformation: select the mesh, shift-select the skeleton, press Ctrl-P and select "Armature Deform". Note: select "Armature Deform" specifically, not any of the other options. Do not select "With Empty Groups", "With Envelope Weights" or "With Automatic Weights".

To import a skeleton, the skeleton import script in an older post should still work. I should point out that, as things stand, the skeleton and the deforming model will not work with any animations. You can still use the skeleton to make your own animations; they just will not be compatible with the in-game animations, if those ever get figured out.

Posted (edited)

This is so frustrating. I've tried exporting some simple armatures from Blender to FBX and to DAE, converting them with fbx2skel and dae2skel, and then importing them back with the skeleton import script, and this is the result. This is why the animations don't work: the bones point in the wrong direction:

2I9f5ry.png

I noticed an error in the annotation: the difference in the Y-axis rotation between the "Y" and the "root" bones should be 90 degrees. In any case, the two imported bones have the same roll, which is incorrect either way.

Edited by yarcunham
clarification
Posted (edited)
20 hours ago, yarcunham said:

This is so frustrating. I've tried exporting some simple armatures from Blender to FBX and to DAE, converting them with fbx2skel and dae2skel, and then importing them back with the skeleton import script, and this is the result. This is why the animations don't work: the bones point in the wrong direction:

2I9f5ry.png

I noticed an error in the annotation: the difference in the Y-axis rotation between the "Y" and the "root" bones should be 90 degrees. In any case, the two imported bones have the same roll, which is incorrect either way.

This problem was also happening in my Unity importer when importing bind poses. I fixed it by applying the rotation to the position: rotation * position.
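In other words, the stored translation is rotated by the bone's quaternion before the bone transform is assembled. A minimal pure-Python sketch of that rotation (the `quat_rotate` helper and the (x, y, z, w) component order are assumptions for illustration):

```python
def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w),
    using v' = v + w*t + cross(q.xyz, t) with t = 2*cross(q.xyz, v)."""
    qx, qy, qz, qw = q
    vx, vy, vz = v
    tx = 2.0 * (qy * vz - qz * vy)
    ty = 2.0 * (qz * vx - qx * vz)
    tz = 2.0 * (qx * vy - qy * vx)
    return (vx + qw * tx + (qy * tz - qz * ty),
            vy + qw * ty + (qz * tx - qx * tz),
            vz + qw * tz + (qx * ty - qy * tx))
```

For example, a 90-degree rotation about Z maps (1, 0, 0) to (0, 1, 0).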

Edited by scratchcat579
Posted
On 2/9/2025 at 7:18 PM, yarcunham said:

There are a lot of file names that end in _TEX_RGB_CRUNCH, so I suspect those use "crunch" compression. I looked into it at some point, but didn't get very far.

Here are all the different texture file name suffixes that I've seen:

_TEX_RGB, I think these are all KTX
_TEX_C, (speculation, I haven't been able to decode these) looks like a KTX cube map, maybe
_TEX_RGB_UC, UC = "uncompressed". Either a .png or a .jpg file. The .jpg ones often look like loading screens or interstitials.
_MASK01, RGBA .png file where each color channel has a different meaning: A = general texture visibility (0 = transparent, 1 = "use base color"), R = "secondary color", G = shadows, B = highlights
_MASK02, RGB .png file with additional masks in different channels; these seem to depend on the application. Sometimes it's just a tertiary color, sometimes it looks like an emission or a metalness mask.
_TEX_RGB_CRUNCH, these I have not been able to decode; I assume they use "crunch" compression
_FX_TEX_A, (speculation, I haven't been able to decode these either) alpha-only/BW texture for effects, particles etc.

Anyway, a lot of props, furniture and such use those _CRUNCH textures, so even though I have managed to decode the meshes, they remain untextured.

 

i think the crunch textures use crnlib: https://github.com/BinomialLLC/crunch/

Posted
On 2/6/2025 at 9:34 PM, yarcunham said:

The bind pose matrices are either the transformation matrix of the bone the vertices are supposed to be bound to, or more likely they're the inverse matrix because that is static and needed for animation: You multiply the vertex position with the inverse transformation matrix of the bone, then you multiply it with the transformation matrix of the bone at the current frame.

The mesh file also has some unknown data after the matrices, which I have just called "after matrix", and after that there are 64-bit joint ids, which correspond to the joint ids defined in the skeleton files (which you can find by searching for "ozz-skeleton" in the dumped asset files)

Here is an annotated picture of where the offsets and numbers are defined in the header:

ph5t44l.png

edit:

For example, if I look at entry number 17 in the joint id list, it is BE4303DC94F197F0. If I then search for it in the skeleton file, it is entry number 74. Because the root bone doesn't get its own entry, I look at entry number 75 in the skeleton joint name list, and that is r_tibiaRibbonTweak_01_uJnt. So that is how you would bind any vertex that references bone BE4303DC94F197F0.
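That lookup, including the off-by-one for the root bone having no id entry, can be sketched as (the helper name and the list arguments are hypothetical):

```python
def joint_name_for_id(joint_id, skeleton_joint_ids, skeleton_joint_names):
    """Map a 64-bit joint id from the mesh file to a joint name from the
    skeleton file. The id list has no entry for the root bone, so the
    index into the name list is offset by one."""
    idx = skeleton_joint_ids.index(joint_id)
    return skeleton_joint_names[idx + 1]
```

With the example above, id BE4303DC94F197F0 at position 74 in the id list resolves to name 75, r_tibiaRibbonTweak_01_uJnt.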

My format version has 4 bytes per id, and the ids aren't in the ozz for some reason 😠

TOMSKINADOOT.zip

Posted
4 hours ago, scratchcat579 said:

My format version has 4 bytes per id, and the ids aren't in the ozz for some reason 😠

Okay yeah, the skeleton file is in a different format, but the ozz-skeleton part seems to parse the same at offset 0x4C9.
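Rather than hard-coding that 0x4C9 offset, the block can be located by searching for the magic string, the same trick mentioned earlier for finding skeletons in the dumped assets. A minimal sketch (the helper name is my own):

```python
def find_ozz_skeleton(data: bytes) -> int:
    """Return the byte offset of the embedded ozz-skeleton block,
    assuming its magic is the null-terminated string 'ozz-skeleton'."""
    offset = data.find(b'ozz-skeleton\x00')
    if offset < 0:
        raise ValueError('no ozz-skeleton block found')
    return offset
```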

I have been using ImHex to inspect these files and if you use this pattern in it, it will highlight the known data:

import std.string;
import std.mem;


namespace skeletonfile {
    struct NullStringBase<DataType> {
        DataType data[while(std::mem::read_unsigned($, sizeof(DataType)) != 0x00)];
        DataType null_terminator;
    } [[sealed, format("std::string::impl::format_string"), transform("std::string::impl::format_string")]];

    /**
        A null-terminated ASCII string.
    */
    using NullString = NullStringBase<char>;

    struct JointProperty {
        u16 parentIdx;
        u8 isLeaf;
    };

    struct SoaTranslate {
        float tx[4];
        float ty[4];
        float tz[4];
    };

    struct SoaRotate {
        float rx[4];
        float ry[4];
        float rz[4];
        float rw[4];
    };

    struct SoaScale {
        float sx[4];
        float sy[4];
        float sz[4];
    };

    struct SoaTransform {
        SoaTranslate translations;
        SoaRotate rotations;
        SoaScale scales;
    };

    struct ozzSkeleton {
        NullString magic;
        u32 version;
        u32 numJoints;
        u32 charsCount;
        NullString jointNames[numJoints];
        u32 jointPropertyVersion;

        JointProperty properties[numJoints];

        auto numSoaJoints = (numJoints + 3)/4;

        SoaTransform bindPose[numSoaJoints];
    };
    struct File {
        u8 version;
        u8 unknown[8];
        u8 numBones;
        u8 unused[15];
        u64 jointIds[numBones - 1];

        ozzSkeleton skeleton;
    };
}
//skeletonfile::File file @ 0;


skeletonfile::ozzSkeleton skeletonfile_ozzskeleton_at_0x4C9 @ 0x4C9;
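For scripted extraction, a rough Python equivalent of the start of that pattern looks like this (field meanings are taken from the ImHex pattern above; the helper name is hypothetical):

```python
import struct

def read_ozz_header(data: bytes, offset: int):
    """Parse the start of an ozz-skeleton block, mirroring the ImHex
    pattern: null-terminated magic, then u32 version, u32 numJoints and
    u32 charsCount, followed by numJoints null-terminated joint names."""
    end = data.index(b'\x00', offset)
    magic = data[offset:end].decode('ascii')
    pos = end + 1
    version, num_joints, chars_count = struct.unpack_from('<III', data, pos)
    pos += 12
    names = []
    for _ in range(num_joints):
        name_end = data.index(b'\x00', pos)
        names.append(data[pos:name_end].decode('ascii'))
        pos = name_end + 1
    return magic, version, num_joints, names
```

Combined with a magic-string search for the block offset, this is enough to recover the joint name list from either skeleton file variant.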

 

Posted (edited)
On 2/9/2025 at 8:03 PM, scratchcat579 said:

because of my extractor i was able to extract some textures. the game mostly uses KTX Basis Universal (KTX 11) for textures but also has a weird one.

I tried using PvrTexTool to load the texture but was unsuccessful:

Screenshot2025-02-09180125.png.b930015090fc25781fa7ee4f1be071f8.png

I have attached some of these weird files

I managed to decode the files: they have 4 extra bytes at the start of the file. The KTX texture (defaultPet_Body_RSCMask_TEX_BC_1705184) should start with «KTX 11», and the crunch textures (candyPet_Body_RSCMask_TEX_BC_2553344, fantasyPet_RSCMask_TEX_BC_3795744 and tropicPet_Body_RSCMask_TEX_BC_3509824) should start with Hx. The reference crunch.exe hosted on GitHub did not work on the textures I tried it on: they use compression format "11", which the enum in the decompressor source names cCRNFmtTotal, and the code specifically rejects that value. I was able to find an "enhanced" version of the tool on this page: https://neverwintervault.org/project/nwnee/other/tool/nwn-crunch-enhanced-edition and it was able to decode the files. It also worked on the textures you posted, at least after I removed the 4 extra bytes at the front and renamed the files with .crn and .ktx extensions. Here are the textures from your zip:

candyPet_Body_RSCMask_TEX_BC_2553344:
4ft0jqU.png

defaultPet_Body_RSCMask_TEX_BC_1705184: (this looks like one of those mask files I was talking about)
9WC5AMw.png

fantasyPet_RSCMask_TEX_BC_3795744:
bcc1j0s.png

tropicPet_Body_RSCMask_TEX_BC_3509824:
UXjhbom.png

The command line I used was 
nwn_crunch.exe -file "tropicPet_Body_RSCMask_TEX_BC_3509824 - Copy.crn" -out "tropicPet_Body_RSCMask_TEX_BC_3509824 - Copy.png"
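For batch conversion, stripping those 4 prefix bytes and picking an extension from the magic can be sketched like this (the helper name and the .bin fallback are my own; the magics are the standard KTX «KTX 11» identifier and the crunch 'Hx' signature):

```python
def split_texture(data: bytes):
    """Drop the 4 extra prefix bytes and guess a file extension from
    the magic bytes of the remaining payload."""
    payload = data[4:]
    if payload.startswith(b'\xabKTX 11\xbb'):   # '«KTX 11»' identifier
        return payload, '.ktx'
    if payload.startswith(b'Hx'):               # crunch .crn signature
        return payload, '.crn'
    return payload, '.bin'                      # unknown, keep raw
```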

Edited by yarcunham
added command line
