
docs: add XL (4B DiT) model documentation #1005

Merged
ChuxiJ merged 4 commits into main from docs/xl-model-documentation on Apr 2, 2026

Conversation


ChuxiJ (Contributor) commented on Apr 2, 2026

Summary

  • Add XL (4B) DiT model documentation across 4 English doc files
  • README.md: Model Zoo table + GPU recommendation with DiT column
  • INSTALL.md: Download commands + available models + recommendation table
  • GPU_COMPATIBILITY.md: XL support column in GPU tier table
  • Tutorial.md: XL model section + expanded DiT selection summary

Key facts: XL weights are ~9GB in bf16; minimum 16GB VRAM with CPU offload, 20GB+ recommended. All LM models (0.6B/1.7B/4B) are fully compatible with XL.
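The three XL repos share one id pattern, so the INSTALL.md download commands can be generated rather than hand-written. A hedged sketch: the repo ids are from this PR, but the exact `huggingface-cli` flags and the `./checkpoints` layout are illustrative assumptions, not necessarily what the docs use.

```python
# Build the Hugging Face CLI download commands for the three XL variants.
# (Hedged: flag set and local-dir layout are assumptions for illustration.)
variants = ("base", "sft", "turbo")
commands = [
    f"huggingface-cli download ACE-Step/acestep-v15-xl-{v} "
    f"--local-dir ./checkpoints/acestep-v15-xl-{v}"
    for v in variants
]
for cmd in commands:
    print(cmd)
```

Running any one of the printed commands in a shell (with `huggingface_hub` installed) would fetch that variant's ~9GB of bf16 weights.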

Test plan

  • All markdown tables render correctly (verified locally)
  • HuggingFace links follow consistent format
  • Translations (zh/ja/ko) to follow in separate PR

Closes #993

🤖 Generated with Claude Code

Summary by CodeRabbit

  • Documentation
    • Added an "XL (4B) DiT" section with capability table, install guidance, and compatibility/VRAM notes describing behavior and LM compatibility.
    • Updated GPU VRAM recommendation matrices to show XL DiT support per tier (❌/⚠️/✅), revised offload and fit guidance, and recommend DiT+LM pairings by VRAM.
    • Renamed “Optional models” to “Optional LM models,” added manual install examples and updated tutorial/model selection tables across locales.

Update English documentation to cover the new XL (4B) DiT models
(acestep-v15-xl-base, acestep-v15-xl-sft, acestep-v15-xl-turbo):

- README.md: Add XL models to Model Zoo table, update GPU recommendation
  with DiT column and XL VRAM guidance
- INSTALL.md: Add XL download commands, available models table entries,
  and update "Which Model Should I Choose?" with XL recommendations
- GPU_COMPATIBILITY.md: Add "XL (4B) DiT" column to tier table showing
  support level per tier (❌/⚠️/✅)
- Tutorial.md: Add XL model section and expand DiT selection summary
  with XL variants

Key facts documented:
- XL weights ~9GB (bf16) vs ~4.7GB for 2B
- Min VRAM: 16GB with offload, 20GB+ recommended
- All LM models (0.6B/1.7B/4B) fully compatible with XL

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

coderabbitai bot commented on Apr 2, 2026

📝 Walkthrough

Adds XL (4B) DiT documentation across multiple locales: updates README and INSTALL with XL model entries and download commands, revises GPU compatibility/VRAM recommendation tables to include XL DiT support markers and offload guidance, and extends Tutorial pages with an “XL (4B) Models” section and DiT selection updates.

Changes

| Cohort | File(s) | Summary |
|---|---|---|
| Model Reference & Installation | README.md, docs/en/INSTALL.md, docs/ja/INSTALL.md, docs/zh/INSTALL.md | Added XL (4B) DiT entries and Hugging Face download commands (acestep-v15-xl-{base,sft,turbo}), renamed "Optional models" to "Optional LM models", and reworked the "Which Model Should I Choose?" table to recommend both a DiT and an LM per VRAM tier. |
| GPU Compatibility / Hardware Guidance | docs/en/GPU_COMPATIBILITY.md, docs/ja/GPU_COMPATIBILITY.md, docs/ko/GPU_COMPATIBILITY.md, docs/zh/GPU_COMPATIBILITY.md | Introduced an "XL (4B) DiT" status column (❌/⚠️/✅) per VRAM tier and updated offload/support notes (unsupported ≤12GB, marginal 12–16GB, CPU offload 16–20GB, full support ≥20GB); adjusted LM/backend/offload/quantization guidance accordingly. |
| User Guidance / Tutorial | docs/en/Tutorial.md, docs/ja/Tutorial.md, docs/ko/Tutorial.md, docs/zh/Tutorial.md | Added an "XL (4B) Models" section and XL model table (acestep-v15-xl-{turbo,sft,base}), described XL behavior and VRAM requirements, and extended the DiT Selection Summary with XL variant rows (steps, CFG, recommended scenarios). |
| Locale-Specific Adjustments | docs/ko/GPU_COMPATIBILITY.md, docs/ko/Tutorial.md, docs/zh/INSTALL.md, docs/zh/Tutorial.md | Localized table/value updates to reflect XL DiT support, batch limits, offload recommendations, and added XL model entries in each language's install/tutorial pages. |
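The per-tier XL support logic documented in GPU_COMPATIBILITY.md reduces to a threshold lookup over VRAM. A minimal sketch, assuming the bands as documented (unsupported below 12GB, marginal 12–16GB, CPU offload 16–20GB, full support at 20GB+); the function name, return labels, and exact boundary handling are mine, not the repo's code.

```python
def xl_dit_support(vram_gb: float) -> str:
    """Map GPU VRAM to the documented XL (4B) DiT support tier.

    Boundary handling at exactly 12/16/20 GB is an assumption for
    illustration; the docs describe ranges, not open/closed endpoints.
    """
    if vram_gb < 12:
        return "unsupported"   # ❌ ~9GB bf16 weights leave no headroom
    if vram_gb < 16:
        return "marginal"      # ⚠️ viable only with aggressive offload
    if vram_gb < 20:
        return "cpu_offload"   # ⚠️ runs, but DiT layers spill to CPU
    return "full"              # ✅ XL DiT plus an LM fit comfortably

for gb in (8, 13, 18, 24):
    print(gb, xl_dit_support(gb))
```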

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 I hopped through guides by lantern light,
tucked XL links in columns neat and bright.
Tables hummed, download lines in tune—
a little nibble, docs now sing like June.
🥕 — the doc-hopping rabbit

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 inconclusive)

  • Linked Issues check — ❓ Inconclusive: the PR partially addresses issue #993 but appears incomplete. The English docs (README.md, INSTALL.md, GPU_COMPATIBILITY.md, Tutorial.md) were updated, but several required files are missing. Resolution: verify whether the INFERENCE.md, GRADIO_GUIDE.md, LoRA_Training_Tutorial.md, and BENCHMARK.md updates are deferred to another PR or should be included in this PR before merging.

✅ Passed checks (4 passed)
  • Description Check — ✅ Check skipped; CodeRabbit's high-level summary is enabled.
  • Title check — ✅ The title accurately summarizes the main change: adding XL (4B DiT) model documentation across multiple files in the repository.
  • Out of Scope Changes check — ✅ All documentation changes align with the issue #993 objective of adding 4B DiT model documentation; no unrelated changes detected.
  • Docstring Coverage — ✅ No functions found in the changed files; docstring coverage check skipped.


coderabbitai bot left a comment

Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/en/INSTALL.md`:
- Around line 675-677: Replace the non-descriptive repeated "[Link]" labels for
the three model rows with clear, unique link text to satisfy MD059; update the
links for the entries "acestep-v15-xl-base", "acestep-v15-xl-sft", and
"acestep-v15-xl-turbo" so each uses a descriptive label (for example
"acestep-v15-xl-base on Hugging Face", "acestep-v15-xl-sft on Hugging Face",
"acestep-v15-xl-turbo on Hugging Face") instead of "[Link]".

In `@README.md`:
- Around line 256-258: The table rows use non-descriptive link text "[Link]" for
each model entry (`acestep-v15-xl-base`, `acestep-v15-xl-sft`,
`acestep-v15-xl-turbo`); update each markdown link so the anchor text describes
the destination (for example "Hugging Face — acestep-v15-xl-base", "Hugging Face
— acestep-v15-xl-sft", "Hugging Face — acestep-v15-xl-turbo" or similar) instead
of generic "[Link]" to satisfy MD059 and improve accessibility.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: fb349753-7d41-4285-b2b1-e9816e364b97

📥 Commits

Reviewing files that changed from the base of the PR and between b02c147 and 22404af.

📒 Files selected for processing (4)
  • README.md
  • docs/en/GPU_COMPATIBILITY.md
  • docs/en/INSTALL.md
  • docs/en/Tutorial.md

Comment on lines +675 to +677
| **acestep-v15-xl-base** | XL (4B) Base DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
| **acestep-v15-xl-sft** | XL (4B) SFT DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
| **acestep-v15-xl-turbo** | XL (4B) Turbo DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |

⚠️ Potential issue | 🟡 Minor

Use descriptive link labels to satisfy markdown lint (MD059).

At Line 675-677, repeated [Link] text is non-descriptive and triggers lint warnings.

Suggested doc-only patch
-| **acestep-v15-xl-base** | XL (4B) Base DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
-| **acestep-v15-xl-sft** | XL (4B) SFT DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
-| **acestep-v15-xl-turbo** | XL (4B) Turbo DiT — higher quality, ≥16GB VRAM | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |
+| **acestep-v15-xl-base** | XL (4B) Base DiT — higher quality, ≥16GB VRAM | [acestep-v15-xl-base on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
+| **acestep-v15-xl-sft** | XL (4B) SFT DiT — higher quality, ≥16GB VRAM | [acestep-v15-xl-sft on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
+| **acestep-v15-xl-turbo** | XL (4B) Turbo DiT — higher quality, ≥16GB VRAM | [acestep-v15-xl-turbo on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |
🧰 Tools
🪛 markdownlint-cli2 (0.22.0)

[warning] 675-675: Link text should be descriptive

(MD059, descriptive-link-text)


[warning] 676-676: Link text should be descriptive

(MD059, descriptive-link-text)


[warning] 677-677: Link text should be descriptive

(MD059, descriptive-link-text)
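The MD059 findings above are easy to reproduce with a small check. This is an illustrative grep-style sketch, not markdownlint itself; the set of "generic" labels is my assumption (markdownlint's actual default list may differ).

```python
import re

# Anchor texts treated as non-descriptive (illustrative subset, not
# markdownlint's authoritative MD059 configuration).
GENERIC_LABELS = {"link", "here", "click here", "more", "learn more"}

def generic_links(markdown: str) -> list[str]:
    """Return URLs of inline markdown links whose anchor text is generic."""
    links = re.findall(r"\[([^\]]+)\]\(([^)]+)\)", markdown)
    return [url for text, url in links if text.strip().lower() in GENERIC_LABELS]

bad = "| **acestep-v15-xl-base** | ... | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |"
good = "| **acestep-v15-xl-base** | ... | [acestep-v15-xl-base on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |"
print(generic_links(bad))   # flags the Hugging Face URL
print(generic_links(good))  # []
```

Descriptive anchors like those in the suggested patch pass the check unchanged.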


Comment on lines +256 to +258
| `acestep-v15-xl-base` | ✅ | ❌ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | High | High | Easy | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
| `acestep-v15-xl-sft` | ✅ | ✅ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Easy | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
| `acestep-v15-xl-turbo` | ✅ | ✅ | ❌ | ❌ | 8 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Medium | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |

⚠️ Potential issue | 🟡 Minor

Replace generic [Link] text with descriptive labels (MD059).

At Line 256-258, link text is non-descriptive and will keep markdownlint warnings active.

Suggested doc-only patch
-| `acestep-v15-xl-base` | ✅ | ❌ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | High | High | Easy | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
-| `acestep-v15-xl-sft` | ✅ | ✅ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Easy | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
-| `acestep-v15-xl-turbo` | ✅ | ✅ | ❌ | ❌ | 8 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Medium | [Link](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |
+| `acestep-v15-xl-base` | ✅ | ❌ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | High | High | Easy | [acestep-v15-xl-base on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-base) |
+| `acestep-v15-xl-sft` | ✅ | ✅ | ❌ | ✅ | 50 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Easy | [acestep-v15-xl-sft on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-sft) |
+| `acestep-v15-xl-turbo` | ✅ | ✅ | ❌ | ❌ | 8 | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | Very High | Medium | Medium | [acestep-v15-xl-turbo on Hugging Face](https://huggingface.co/ACE-Step/acestep-v15-xl-turbo) |
🧰 Tools
🪛 markdownlint-cli2 (0.22.0)

[warning] 256-256: Link text should be descriptive

(MD059, descriptive-link-text)


[warning] 257-257: Link text should be descriptive

(MD059, descriptive-link-text)


[warning] 258-258: Link text should be descriptive

(MD059, descriptive-link-text)


- Update min VRAM from ≥16GB to ≥12GB (offload makes 12GB viable)
- Fix 16-20GB tier note: "XL requires CPU offload below 20GB"
- GPU_COMPATIBILITY: clarify Tier 5 marginal note for 12-16GB range
- Tutorial: use full model names (acestep-v15-xl-*) instead of short
- INSTALL: fix comment dash style in download section

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
coderabbitai bot left a comment

♻️ Duplicate comments (2)
docs/en/INSTALL.md (1)

675-677: ⚠️ Potential issue | 🟡 Minor

Replace generic [Link] labels with descriptive link text.

These three rows still use non-descriptive link labels and will continue triggering markdownlint MD059; please switch to distinct labels per model.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/en/INSTALL.md` around lines 675 - 677, The three table rows for model
entries (acestep-v15-xl-base, acestep-v15-xl-sft, and acestep-v15-xl-turbo) use
the generic "[Link]" label which triggers markdownlint MD059; update each link
to have unique, descriptive text (e.g., "acestep-v15-xl-base on Hugging Face",
"acestep-v15-xl-sft on Hugging Face", "acestep-v15-xl-turbo on Hugging Face") so
the anchor text is specific to the model and replaces the generic "[Link]"
labels.
README.md (1)

256-258: ⚠️ Potential issue | 🟡 Minor

Use descriptive Hugging Face link labels in the XL rows.

These anchors are still generic ([Link]) and likely keep MD059 warnings active.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 256 - 258, The table rows for the XL models
(`acestep-v15-xl-base`, `acestep-v15-xl-sft`, `acestep-v15-xl-turbo`) use
generic `[Link]` anchors which trigger MD059; replace each `[Link]` with a
descriptive Hugging Face label (e.g., `[acestep-v15-xl-base on Hugging Face]`,
`[acestep-v15-xl-sft on Hugging Face]`, `[acestep-v15-xl-turbo on Hugging
Face]`) so the anchor text is unique and descriptive while keeping the existing
URLs unchanged.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: fe082e31-22e5-4708-827b-d27e30f8f347

📥 Commits

Reviewing files that changed from the base of the PR and between 22404af and 5356c63.

📒 Files selected for processing (4)
  • README.md
  • docs/en/GPU_COMPATIBILITY.md
  • docs/en/INSTALL.md
  • docs/en/Tutorial.md
✅ Files skipped from review due to trivial changes (1)
  • docs/en/GPU_COMPATIBILITY.md
🚧 Files skipped from review as they are similar to previous changes (1)
  • docs/en/Tutorial.md

Update translated documentation with XL model support:
- zh: INSTALL.md (model recommendation table), GPU_COMPATIBILITY.md
  (tier table with XL column), Tutorial.md (XL section + summary)
- ja: INSTALL.md, GPU_COMPATIBILITY.md, Tutorial.md (same updates)
- ko: GPU_COMPATIBILITY.md, Tutorial.md (XL section + summary table added)

All translations consistent with English docs: ≥12GB VRAM with offload,
20GB+ recommended, all LM models compatible with XL.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/ko/GPU_COMPATIBILITY.md`:
- Around line 11-15: Update the batch-size cells for the listed GPU tiers in the
table so they match the configured limits: change 티어 3 batch value from "1 / 2"
to "2 / 2", change 티어 5 batch value from "2 / 4" to "4 / 4", and change 티어 6b
batch value from "4 / 8" to "8 / 8"; verify the same corrections are applied
consistently across other locale docs that show these tiers (refer to the rows
labeled "티어 3", "티어 5", and "티어 6b") to avoid desynchronization.
ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: a41786ee-2f58-4e9d-9690-152334183892

📥 Commits

Reviewing files that changed from the base of the PR and between 5356c63 and 29517bd.

📒 Files selected for processing (8)
  • docs/ja/GPU_COMPATIBILITY.md
  • docs/ja/INSTALL.md
  • docs/ja/Tutorial.md
  • docs/ko/GPU_COMPATIBILITY.md
  • docs/ko/Tutorial.md
  • docs/zh/GPU_COMPATIBILITY.md
  • docs/zh/INSTALL.md
  • docs/zh/Tutorial.md
✅ Files skipped from review due to trivial changes (3)
  • docs/zh/Tutorial.md
  • docs/ko/Tutorial.md
  • docs/ja/Tutorial.md

- ≥24GB tier: note xl-base for extract/lego/complete (en/zh/ja)
- Fix pre-existing Korean GPU tier batch size discrepancies:
  Tier 3: 1/2 → 2/2, Tier 5: 2/4 → 4/4, Tier 6b: 4/8 → 8/8

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
coderabbitai bot left a comment

🧹 Nitpick comments (1)
docs/ko/GPU_COMPATIBILITY.md (1)

16-18: Standardize number+unit spacing for readability consistency.

On Line 16 and Line 18, consider using spaced units (24 GB, 9 GB, 4.7 GB) instead of 24GB, 9GB, 4.7GB to keep terminology formatting consistent across the doc.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/ko/GPU_COMPATIBILITY.md` around lines 16 - 18, Replace contiguous
number+unit tokens with a space between number and unit for consistency: change
occurrences like "≥24GB", "24GB", "9GB", "4.7GB" to "≥24 GB", "24 GB", "9 GB",
"4.7 GB" (and similarly "0.6B, 1.7B, 4B" if you want consistent spacing to "0.6
B, 1.7 B, 4 B" only where it matches doc style). Update the "XL (4B) DiT 열" row
and the explanatory sentence ("XL 모델 가중치 약 9GB (bf16), 2B는 약 4.7GB") to use
spaced units ("XL (4 B) DiT 열" and "약 9 GB (bf16), 2 B는 약 4.7 GB") to
standardize formatting across the document.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 04772c88-2e3d-4551-88fb-232410129a01

📥 Commits

Reviewing files that changed from the base of the PR and between 29517bd and 0197b25.

📒 Files selected for processing (5)
  • README.md
  • docs/en/INSTALL.md
  • docs/ja/INSTALL.md
  • docs/ko/GPU_COMPATIBILITY.md
  • docs/zh/INSTALL.md
✅ Files skipped from review due to trivial changes (3)
  • docs/en/INSTALL.md
  • docs/zh/INSTALL.md
  • README.md
🚧 Files skipped from review as they are similar to previous changes (1)
  • docs/ja/INSTALL.md

ChuxiJ merged commit d5c58e5 into main on Apr 2, 2026
3 checks passed


Development

Successfully merging this pull request may close these issues.

docs: add 4B DiT model documentation across all languages
