Build powerful, extensible DSLs with exceptional developer experience
Spark is a framework for creating declarative domain-specific languages in Elixir. It transforms simple struct definitions into rich, extensible DSLs that come with autocomplete, documentation generation, and sophisticated tooling built right in.
Here's how you can build a data validator DSL with Spark:
```elixir
defmodule MyApp.PersonValidator do
  use MyLibrary.Validator

  fields do
    required [:name]

    field :name, :string

    field :email, :string do
      check &String.contains?(&1, "@")
      transform &String.trim/1
    end
  end
end

MyApp.PersonValidator.validate(%{name: "Zach", email: " foo@example.com "})
# => {:ok, %{name: "Zach", email: "foo@example.com"}}
```

The DSL definition itself is clean and declarative:
```elixir
@field %Spark.Dsl.Entity{
  name: :field,
  args: [:name, :type],
  target: Field,
  describe: "A field that is accepted by the validator",
  schema: [
    name: [type: :atom, required: true, doc: "The name of the field"],
    type: [type: {:one_of, [:integer, :string]}, required: true, doc: "The type of the field"],
    check: [type: {:fun, 1}, doc: "A function to validate the value"],
    transform: [type: {:fun, 1}, doc: "A function to transform the value"]
  ]
}

@fields %Spark.Dsl.Section{
  name: :fields,
  entities: [@field],
  describe: "Configure the fields that are supported and required"
}

use Spark.Dsl.Extension, sections: [@fields]
```

- 🧩 Extensible Architecture - Anyone can write extensions for your DSL, so users can add their own entities and options without forking your library
- 🧠 Smart Autocomplete - Built-in ElixirSense integration provides intelligent code completion and inline documentation in your editor
- 📚 Auto Documentation - Generate comprehensive documentation for your DSL automatically, including all options and usage examples
- ⚡ Developer Tools - Mix tasks for formatting, code generation, and maintaining `locals_without_parens` automatically (see the formatter sketch after the installation snippet below)
- 🔄 Compile-time Processing - Use transformers to modify DSL structure during compilation and verifiers to validate correctness
- 🎯 Type Safety - Rich schema validation ensures DSL usage is correct at compile time, with helpful error messages
- 🔍 Introspection - Built-in tools to inspect and query DSL definitions programmatically at runtime, as in the sketch below
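For a sense of what introspection looks like, here is a minimal sketch. It assumes the `MyApp.PersonValidator` module from the example above and the `[:fields]` section path from its DSL definition; `Spark.Dsl.Extension.get_entities/2` is the generic entity accessor.

```elixir
# A minimal introspection sketch, assuming the MyApp.PersonValidator
# example above and the [:fields] section path.
MyApp.PersonValidator
|> Spark.Dsl.Extension.get_entities([:fields])
|> Enum.map(& &1.name)
# => [:name, :email]
```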
Add `spark` to your list of dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:spark, "~> 2.3"}
  ]
end
```
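One of the developer tools mentioned above is formatter support. Here is a minimal sketch of a library's `.formatter.exs`, assuming the example extension is named `MyLibrary.Validator.Dsl` (a hypothetical name for this README's example); running `mix spark.formatter --extensions MyLibrary.Validator.Dsl` rewrites the `spark_locals_without_parens` list for you.

```elixir
# .formatter.exs - a sketch. The entries in spark_locals_without_parens are
# illustrative; `mix spark.formatter` maintains this list automatically.
spark_locals_without_parens = [
  field: 2,
  field: 3,
  required: 1
]

[
  inputs: ["{mix,.formatter}.exs", "{config,lib,test}/**/*.{ex,exs}"],
  locals_without_parens: spark_locals_without_parens,
  export: [locals_without_parens: spark_locals_without_parens]
]
```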
The best way to get started is with our comprehensive tutorial that walks you through building a complete DSL from scratch:

📖 Get Started with Spark - Build a data validator DSL step by step
- Define your DSL structure using `Spark.Dsl.Section` and `Spark.Dsl.Entity`
- Create your extension with `use Spark.Dsl.Extension`
- Build your DSL module that users will import
- Add transformers and verifiers for advanced behavior (a minimal verifier sketch appears just below)
- Generate helper functions with `Spark.InfoGenerator`
Each step is covered in detail in the tutorial above.
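To make the verifier step concrete, here is a minimal sketch for the validator DSL above. The module name is hypothetical, and it assumes `required` is stored as the `:required` option on the `fields` section; `Spark.Dsl.Transformer.get_entities/2` and friends are the general-purpose DSL-state accessors.

```elixir
# A minimal verifier sketch for the validator DSL above. The module name is
# hypothetical, and we assume `required` is stored as the :required option
# on the [:fields] section.
defmodule MyLibrary.Validator.Verifiers.RequiredFieldsExist do
  use Spark.Dsl.Verifier

  alias Spark.Dsl.Transformer

  @impl true
  def verify(dsl_state) do
    # Names of all fields declared in the DSL
    field_names =
      dsl_state
      |> Transformer.get_entities([:fields])
      |> Enum.map(& &1.name)

    required = Transformer.get_option(dsl_state, [:fields], :required) || []

    case Enum.reject(required, &(&1 in field_names)) do
      [] ->
        :ok

      missing ->
        {:error,
         Spark.Error.DslError.exception(
           module: Transformer.get_persisted(dsl_state, :module),
           path: [:fields],
           message: "required fields #{inspect(missing)} are not defined"
         )}
    end
  end
end
```

Transformers have broadly the same shape, but implement `transform/1` and return `{:ok, dsl_state}` with a modified DSL state.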
- Get Started with Spark - Complete tutorial building a validator DSL
- Writing Extensions - Deep dive into extension development
- Setup Autocomplete - Configure editor integration
- Split Up Large DSLs - Organize complex DSL definitions
- Use Source Annotations - Leverage location tracking for better errors
- HexDocs - Complete API documentation
- Core Modules: `Spark.Dsl.Extension`, `Spark.Dsl.Entity`, `Spark.Dsl.Section`
- Advanced Features: `Spark.Dsl.Transformer`, `Spark.Dsl.Verifier`, `Spark.InfoGenerator` (sketched below)
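As a taste of `Spark.InfoGenerator`: pointed at an extension, it defines introspection functions from your sections. A minimal sketch, assuming the validator extension from this README lives at the hypothetical `MyLibrary.Validator.Dsl`; the generated function names below follow the section/option naming convention, so verify them against the docs.

```elixir
# A sketch of an Info module, assuming the extension module is named
# MyLibrary.Validator.Dsl (hypothetical). Given the `fields` section above,
# functions like `fields/1` (the entities) and `fields_required/1`
# (the section's :required option) are generated.
defmodule MyLibrary.Validator.Info do
  use Spark.InfoGenerator,
    extension: MyLibrary.Validator.Dsl,
    sections: [:fields]
end

MyLibrary.Validator.Info.fields(MyApp.PersonValidator)
# => a list of field structs, shaped by the entity's target
```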
Spark is battle-tested and powers all DSLs in the Ash Framework, handling complex real-world applications with thousands of DSL definitions. Whether you're building configuration DSLs, workflow orchestrators, or domain-specific languages for your business logic, Spark provides the foundation for production-grade solutions.
We welcome contributions! Please see our contributing guidelines and feel free to open issues or submit pull requests.
- GitHub - Source code and issue tracking
- Hex.pm - Package repository
- HexDocs - API documentation
- Ash Framework - See Spark in action
- Discord - Community chat
- Forum - Discussion forum
MIT - see LICENSES/MIT.txt for details.