Boost RAG performance with question decomposer
Updated Jun 26, 2025 - Shell
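The repository title above refers to splitting a complex user question into simpler sub-questions before retrieval. Below is a minimal illustrative sketch of that idea; the function names (`decompose_question`, `answer_with_decomposition`) and the `llm`/`retriever` callables are assumptions for illustration, not the repository's actual code.

```python
# Minimal sketch of question decomposition for RAG.
# All names here are illustrative assumptions, not taken from the repository above.
from typing import Callable, List


def decompose_question(llm: Callable[[str], str], question: str) -> List[str]:
    """Ask the LLM to split a complex question into simpler sub-questions."""
    prompt = (
        "Break the following question into 2-4 simpler sub-questions, "
        "one per line:\n" + question
    )
    reply = llm(prompt)  # `llm` is any callable returning a text completion
    return [line.strip("- ").strip() for line in reply.splitlines() if line.strip()]


def answer_with_decomposition(
    llm: Callable[[str], str],
    retriever: Callable[[str], List[str]],
    question: str,
) -> str:
    """Retrieve context for each sub-question, then answer the original question."""
    sub_questions = decompose_question(llm, question)
    context_chunks: List[str] = []
    for sq in sub_questions:
        context_chunks.extend(retriever(sq))  # `retriever` returns passages per query
    context = "\n".join(context_chunks)
    final_prompt = f"Context:\n{context}\n\nUsing the context above, answer: {question}"
    return llm(final_prompt)
```

Retrieving per sub-question rather than on the original question is the usual motivation: each sub-question tends to match a narrower set of passages, which improves recall for multi-hop queries.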
An evaluation of prompting techniques (Zero-Shot CoT, Few-Shot, Self-Consistency) on the Mistral-7B model for mathematical reasoning, systematically benchmarking seven distinct prompting methods on the GSM8K dataset.
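To make the listed techniques concrete, here is a minimal sketch of a zero-shot CoT prompt and a self-consistency majority vote over sampled completions. The `generate` callable, its `temperature` parameter, and the answer-extraction regex are assumptions for illustration, not the project's actual evaluation harness.

```python
# Illustrative sketch of zero-shot CoT plus self-consistency voting on a
# GSM8K-style question; `generate` is an assumed sampling function.
import re
from collections import Counter
from typing import Callable, Optional


def zero_shot_cot_prompt(question: str) -> str:
    """Zero-shot CoT: append the canonical 'Let's think step by step.' cue."""
    return f"Q: {question}\nA: Let's think step by step."


def extract_numeric_answer(completion: str) -> Optional[str]:
    """Take the last number in the completion as the predicted answer."""
    numbers = re.findall(r"-?\d+(?:\.\d+)?", completion.replace(",", ""))
    return numbers[-1] if numbers else None


def self_consistency(
    generate: Callable[..., str], question: str, samples: int = 5
) -> Optional[str]:
    """Sample several reasoning paths and majority-vote on their final answers."""
    prompt = zero_shot_cot_prompt(question)
    answers = []
    for _ in range(samples):
        completion = generate(prompt, temperature=0.7)  # assumed signature
        answer = extract_numeric_answer(completion)
        if answer is not None:
            answers.append(answer)
    return Counter(answers).most_common(1)[0][0] if answers else None
```

Few-shot variants would prepend worked examples to the prompt; self-consistency simply samples that prompt multiple times at a nonzero temperature and keeps the most frequent final answer.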
The Effect of Language Representation in Question Decomposition