
[Mobile] Model Loading Failure on iOS with onnxruntime-react-native #26931

@haidv2806

Describe the issue

When attempting to load ONNX models (specifically text_encoder.onnx, ~37MB) using onnxruntime-react-native on iOS (Simulator/Device), the InferenceSession.create() method consistently fails with the error "Can't load a model: failed to load model". This issue does not occur on Android, where the same models load and run successfully.

Observed Behavior:

  1. Initial Attempt (Load from Uint8Array buffer):

    • Reading the model file (.onnx) from Expo asset.localUri, converting it to base64, then to a Buffer, and finally to a Uint8Array.
    • Calling InferenceSession.create(modelUint8Array, options) results in the JavaScript error: Error: Can't load a model: failed to load model from buffer.
    • Xcode console shows a native crash/error: -[NSTaggedPointerString stringValue]: unrecognized selector sent to instance .... This suggests a type casting or data corruption issue at the native bridge level when handling the Uint8Array data.
  2. Second Attempt (Load directly from asset.localUri - file path):

    • Calling InferenceSession.create(asset.localUri, options) directly with the local file URI.
    • This also results in the JavaScript error: Error: Can't load a model: failed to load model.
  3. Third Attempt (Fallback: Write to temporary file and load from path):

    • The model content is read from asset.localUri as base64, then written to a temporary file in RNFS.CachesDirectoryPath on iOS.
    • Calling InferenceSession.create(tempFilePath, options) with the path to the temporary file.
    • This similarly fails with the JavaScript error: Error: Can't load a model: failed to load model.

The consistency of the "Can't load a model: failed to load model" error across three different loading mechanisms on iOS, while the same model works on Android, strongly suggests an underlying issue in onnxruntime-react-native's native iOS implementation related to model parsing or initialization from a local file. The unrecognized-selector crash further points to a potential issue with how file paths or data references are handled internally in the native Objective-C/C++ layer.
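One way to rule out JS-side data corruption before the bytes cross the bridge is to make sure the array handed to InferenceSession.create() is an exact view of the decoded model. The sketch below is not from the repository; it illustrates one possible pitfall under the assumption that the native side reads the underlying ArrayBuffer directly.

```typescript
import { Buffer } from "buffer";

// Hedged sketch: produce a Uint8Array that views exactly the decoded bytes.
// Buffer.from(base64, "base64") can be backed by a pooled ArrayBuffer larger
// than the payload; code that reads `.buffer` without honoring byteOffset and
// byteLength would then see extra bytes around the model, which could
// plausibly surface as "failed to load model from buffer".
export function base64ToExactBytes(base64: string): Uint8Array {
  const buf = Buffer.from(base64, "base64");
  return new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength);
}
```

Passing the result of this helper instead of the raw Buffer would at least isolate whether the failure depends on how the bytes are framed.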

To reproduce

  1. Repository: haidv2806/valtec-tts

  2. Environment:

{
  "name": "valtec-tts-example",
  "version": "1.0.0",
  "main": "index.ts",
  "scripts": {
    "start": "expo start",
    "android": "expo run:android",
    "ios": "expo run:ios",
    "web": "expo start --web"
  },
  "dependencies": {
    "expo": "~54.0.30",
    "expo-av": "^16.0.8",
    "onnxruntime-react-native": "^1.23.2",
    "react": "19.1.0",
    "react-native": "0.81.5",
    "react-native-fs": "^2.20.0",
    "vinorm": "^1.0.6"
  },
  "devDependencies": {
    "@types/react": "~19.1.0",
    "typescript": "~5.9.2"
  },
  "private": true,
  "expo": {
    "autolinking": {
      "nativeModulesDir": ".."
    }
  }
}
  3. Code Snippet:
    The issue occurs within the loadModel method, which is responsible for loading ONNX models using InferenceSession.create().

    // Module-level imports assumed by this snippet:
    // import { Asset } from 'expo-asset';
    // import RNFS from 'react-native-fs';
    // import { Buffer } from 'buffer';
    // import { InferenceSession } from 'onnxruntime-react-native';

    async loadModel(fileName: string, options: InferenceSession.SessionOptions, assetModule: any): Promise<InferenceSession> {
        console.log(`[TTS] Reading asset for ${fileName}`);
        try {
            // Resolve the bundled asset and make sure it exists on disk.
            const asset = Asset.fromModule(assetModule);

            if (!asset.localUri) {
                console.log(`[TTS] Downloading asset: ${fileName}`);
                await asset.downloadAsync();
            }

            if (!asset.localUri) {
                throw new Error(`[TTS] asset.localUri is null for: ${fileName}`);
            }

            console.log(`[TTS] Reading file from: ${asset.localUri}`);
            // RNFS expects a plain path, so strip the file:// scheme.
            const base64 = await RNFS.readFile(asset.localUri.replace('file://', ''), 'base64');

            console.log(`[TTS] ${fileName} size (base64 chars): ${base64.length}`);

            // Buffer is a Uint8Array subclass, so it satisfies create()'s signature.
            const modelBuffer = Buffer.from(base64, 'base64');
            return await InferenceSession.create(modelBuffer, options);
        } catch (error: any) {
            console.error(ValtecTTSEngine.TAG, `Failed to load model ${fileName}: ${error.message}`, error);
            throw error;
        }
    }
  4. Steps:

    1. Integrate the provided ValtecTTSEngine class into a React Native Expo project.
    2. Ensure text_encoder.onnx (and other models) are located at ../../model/text_encoder.onnx relative to the engine.
    3. Run the application on an iOS Simulator or a physical iOS device using Xcode.
    4. Call the initialize() method of ValtecTTSEngine.

Expected Result:
Models load successfully, and initialize() completes without errors, similar to behavior on Android.

Actual Result (from Xcode console):

Creating JS object for module 'ExpoAsset'
[TTS] Reading file from: file:///Users/dovanhai/Library/Developer/CoreSimulator/Devices/75C1A704-BD16-44DB-9421-088D48330694/data/Containers/Data/Application/5DB8AEC6-93EC-44C0-B275-006A2DF43A6E/Library/Caches/ExponentAsset-ccb2ca8d259f5b67c521297336aabb30.onnx
[TTS] text_encoder.onnx size (base64 chars): 37444628
'ValtecTTSEngine', 'Failed to load model text_encoder.onnx: Can\'t load a model: failed to load model from buffer', [Error: Can't load a model: failed to load model from buffer]
'ValtecTTSEngine', 'Init failed: Can\'t load a model: failed to load model from buffer', [Error: Can't load a model: failed to load model from buffer]
[TTS] initialize() FAILED ❌
'[TTS] Error message:', 'Can\'t load a model: failed to load model from buffer'
'[TTS] Full error:', [Error: Can't load a model: failed to load model from buffer]
[Error: Can't load a model: failed to load model from buffer]
nw_socket_set_connection_idle [C8.1.1:2] setsockopt SO_CONNECTION_IDLE failed [42: Protocol not available]
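As a cross-check on the log above: a base64 character count maps deterministically to a decoded byte count, so comparing the logged length against the cached asset's size from RNFS.stat would confirm whether the read was complete. A minimal sketch (the helper name is made up):

```typescript
// Hedged sketch: expected decoded size for a base64 string of a given length.
// Padding ("=" characters) removes up to 2 bytes from the 3-bytes-per-4-chars rule.
export function base64DecodedSize(base64Len: number, paddingChars = 0): number {
  return Math.floor(base64Len / 4) * 3 - paddingChars;
}

// The log reports 37,444,628 base64 chars; ignoring padding, that corresponds
// to 28,083,471 decoded bytes (~26.8 MiB), which is worth comparing against
// the on-disk size of the cached .onnx asset.
```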

Urgency

No response

Platform

React Native

OS Version

all

ONNX Runtime Installation

Built from Source

Compiler Version (if 'Built from Source')

No response

Package Name (if 'Released Package')

onnxruntime-react-native

ONNX Runtime Version or Commit ID

1.23.2

ONNX Runtime API

JavaScript

Architecture

Other / Unknown

Execution Provider

NNAPI, Default CPU

Execution Provider Library Version

No response


Labels

api:Javascript (issues related to the Javascript API)
platform:mobile (issues related to ONNX Runtime mobile; typically submitted using template)
platform:web (issues related to ONNX Runtime web; typically submitted using template)
