fix: package modal_infer.exe separately + CI support #12
Merged
grider-transwithai merged 25 commits into TransWithAI:main (Jan 15, 2026)
Conversation
Contributor
Author
Still need @grider-transwithai to change project.spec back to the old version. Failing CI result · Passing CI result
Member
Could you rebase this out from v1.6?
PyInstaller 6.17.0 has a KeyError: 'depends' regression in some conda environments. Pin to 6.16.0 until the issue is resolved. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
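A minimal sketch of how that pin could look in the conda environment file; the surrounding dependency entries are assumptions, only the pyinstaller line reflects the pin described above:

```yaml
dependencies:
  - pip
  - pip:
      # Pin below 6.17.0: that release has a conda hook regression
      # (KeyError: 'depends') in some conda environments.
      - pyinstaller==6.16.0
```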
Add modal_infer.py for cloud-based GPU inference using the Modal platform:
- Interactive CLI for GPU selection and configuration
- Support for multiple GPU types (T4, A10G, A100, H100, etc.)
- Automatic model download and caching on a Modal Volume
- Batch processing support for accelerated transcription
- Micromamba-based image with the conda environment from environment-cuda128.yml

Add environment-modal.yml for lightweight local client setup:
- Minimal dependencies (modal, questionary)
- Python 3.10 environment for running modal_infer.py locally

Update 使用说明.txt with Modal usage instructions:
- Environment setup guide
- Modal account registration and token configuration
- HuggingFace token setup for model downloads
- Step-by-step usage instructions

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Fix file upload path to use the session directory consistently
- Simplify volume path structure (remove the APP_ROOT_REL layer)
- Remove run mode selection; default to one-time execution
- Add detailed logging for the build, execution, and download stages
- Fix download conflict by adding the --force flag
- Reorder GPU choices (T4/L4 first for cost efficiency)
- Update volume name to Faster_Whisper

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- In folder mode, upload and process audio files one at a time, reusing the container
- Use min_containers=1 to keep a container warm
- Exclude the mp4 format; when one is found, prompt the user to convert it with ffmpeg
- In folder mode, save results directly to the source folder instead of creating an _out subdirectory
- Continue with the remaining files when a single file fails to process

Note: syntax-checked only; not verified by manual testing.

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
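The folder-mode behaviour above can be sketched roughly as follows. This is a hedged sketch, not the real script: `transcribe` stands in for the remote Modal call, and the audio extension list is an assumption.

```python
import sys
from pathlib import Path

# Assumed set of accepted audio extensions (illustrative only).
AUDIO_EXTS = {".wav", ".mp3", ".flac", ".m4a"}

def process_folder(folder: Path, transcribe) -> list[Path]:
    """Run `transcribe` on each audio file in turn; return the files that failed."""
    failed = []
    for path in sorted(folder.iterdir()):
        if path.suffix.lower() == ".mp4":
            # mp4 is excluded: ask the user to convert it to audio first
            print(f"skip {path.name}: convert to audio with ffmpeg first")
            continue
        if path.suffix.lower() not in AUDIO_EXTS:
            continue
        try:
            srt_text = transcribe(path)  # the remote call in the real script
            # results land next to the source file, no _out/ subdirectory
            path.with_suffix(".srt").write_text(srt_text, encoding="utf-8")
        except Exception as exc:
            # one failing file must not abort the rest of the batch
            print(f"failed {path.name}: {exc}", file=sys.stderr)
            failed.append(path)
    return failed
```

Processing files one at a time (rather than uploading the whole folder) is what lets a single warm container, kept alive via min_containers=1, be reused across the batch.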
- Replace spaces in filenames with underscores on upload
- Record the original filename in the UploadManifest
- Restore the original filename (with spaces) after downloading the subtitles

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
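The rename round-trip described above can be sketched like this; `UploadManifest` here is a simplified stand-in for the real class, with only the name mapping:

```python
from dataclasses import dataclass, field

@dataclass
class UploadManifest:
    # maps the sanitized (uploaded) name back to the original one
    original_names: dict[str, str] = field(default_factory=dict)

    def sanitize(self, name: str) -> str:
        """Return an upload-safe name and remember the original."""
        safe = name.replace(" ", "_")
        self.original_names[safe] = name
        return safe

    def restore(self, safe: str) -> str:
        """Recover the original name; fall back to the safe name if unknown."""
        return self.original_names.get(safe, safe)
```

Keeping the mapping in the manifest (rather than trying to reverse the substitution) means underscores that were already in the original name survive the round trip.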
- Rename uploaded files to todo + extension to avoid full-width character issues
- Restore the original filename after download

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Call volume.commit() after upload to force a sync
- Wait for the file to appear before remote execution (up to 30 seconds)

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
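The wait step above amounts to a poll-with-timeout loop. In this sketch `exists` stands in for the remote listing call made after `volume.commit()`; the function and its parameters are illustrative:

```python
import time

def wait_for_file(exists, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll `exists()` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if exists():
            return True
        time.sleep(interval)
    # one final check right at the deadline
    return exists()
```

Returning False instead of raising leaves the caller free to decide whether a missing file is fatal or just means retrying the upload.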
🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Have the remote function return file contents directly (base64-encoded)
- Write files locally under their original names
- Remove the modal volume get download logic
- Delete the no-longer-used RemoteResult class

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
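The base64 hand-off above could look roughly like this. The function names and the `{original_name: base64_content}` payload shape are assumptions for illustration, not the script's real API:

```python
import base64
from pathlib import Path

def pack_results(files: dict[str, bytes]) -> dict[str, str]:
    """Remote side: encode raw file bytes as base64 strings for the return value."""
    return {name: base64.b64encode(data).decode("ascii")
            for name, data in files.items()}

def unpack_results(payload: dict[str, str], out_dir: Path) -> list[Path]:
    """Local side: decode each entry and write it under its original name."""
    written = []
    for name, b64 in payload.items():
        target = out_dir / name
        target.write_bytes(base64.b64decode(b64))
        written.append(target)
    return written
```

Because the bytes travel inside the function's return value, no separate volume download step (and hence no `modal volume get` call) is needed on the client.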
…" fallback), so argparse does not crash when printing Chinese text.
7e0c942 to f517369 (Compare)
Contributor
Author
Tried rebasing from main; it looks fine.
Member
The changes still include the block `# PyInstaller 6.17.0 has a conda hook regression (KeyError: 'depends') in some conda environments` with `- pyinstaller==6.16.0`. That pin was reverted back to `- pyinstaller>=6.0.0` in v1.6; could you drop this change?
Contributor
Author
Fixed.
Member
- Delete the REPO_REF = "v1.4" constant
- Stop passing --branch when cloning; use the default branch
- On update, reset to origin/main instead of a fixed tag

🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
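The clone/update logic after dropping REPO_REF can be sketched as below. Returning the argv lists keeps the sketch testable without running git; the real script would pass them to subprocess.run, and the function names are illustrative:

```python
def clone_cmd(repo_url: str, dest: str) -> list[str]:
    # no --branch flag any more: git checks out the default branch
    return ["git", "clone", repo_url, dest]

def update_cmds(dest: str) -> list[list[str]]:
    # reset to origin/main instead of a fixed tag like v1.4
    return [
        ["git", "-C", dest, "fetch", "origin"],
        ["git", "-C", dest, "reset", "--hard", "origin/main"],
    ]
```

fetch plus `reset --hard origin/main` (rather than `git pull`) makes the update idempotent even if the local checkout was modified or the remote history was force-pushed.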
Contributor
Author
It isn't strictly required; changed it to point at main.
Background

Revert project.spec to the v1.4/v1.5 version, keeping the infer.exe packaging logic unchanged; split the Modal CLI packaging flow into a standalone modal.spec and add a CI check, to reduce the risk of dependency conflicts.

Changes

- modal.spec: single-file packaging of the Modal client; also bundles environment-cuda128.yml.
- environment-modal.yml: lightweight environment for running the Modal CLI locally.
- build_windows.py: packages with modal.spec; auto-installs modal/questionary if missing; outputs to dist/faster_whisper_transwithai_chickenrice, using a separate build/modal directory.
- modal_infer.py: fixes the argparse --help encoding crash; pins to v1.4 (to avoid remote code drift).
- build-release-conda.yml: adds a modal_infer.exe --help check.

Testing

- infer.exe --help
- modal_infer.exe --help
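The CI check could be a step roughly like the following in the workflow; the step name and exact dist paths are assumptions:

```yaml
# Smoke-test both packaged executables: a clean --help exit proves each
# bundle starts and argparse can print its help text without crashing.
- name: Verify packaged executables
  shell: cmd
  run: |
    dist\faster_whisper_transwithai_chickenrice\infer.exe --help
    dist\faster_whisper_transwithai_chickenrice\modal_infer.exe --help
```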