OnLLM is a platform for running LLM and SLM models with ONNX Runtime directly on low-end devices such as low-power computers and mobile phones. It is open-source and cross-platform, built with Kivy.
Topics: android, python, open-source, cross-platform, onnxruntime, mobile-ai, ai-chatbot, large-language-models, llm, llm-inference, llm-chatbot, offline-ai, ai-android
- Updated Oct 30, 2025
- Python