A new kind of programming language
Code that costs 90% less to generate.
Aria is the first programming language designed from the ground up for AI code generation. Every syntax decision minimizes tokens. Every feature eliminates boilerplate. The result: your AI writes better code, faster, at a fraction of the cost.
Why does this matter?
AI code generation is expensive
Every token your AI generates costs money. Go needs ~15 tokens per error check. Rust needs ~8. Aria needs 1. Across a full application, that's thousands of dollars in savings.
Boilerplate causes bugs
When an AI generates the same if err != nil pattern hundreds of times, mistakes creep in. Aria's ? operator makes error handling a single character — no pattern to get wrong.
The compiler is the safety net
Exhaustive pattern matching, typed errors, effect tracking, no null, no implicit conversions. The compiler catches what the AI misses — before it ever runs.
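A minimal sketch of that safety net in Aria's own syntax, using the sum-type and match constructs shown in the Language Features section (the OrderState type and its variants are invented for this illustration):

```
// Hypothetical type, for illustration only
type OrderState =
| Pending
| Shipped { tracking: str }
| Cancelled { reason: str }

fn describe(s: OrderState) -> str =
  match s {
    Pending => "waiting"
    Shipped{tracking} => "on its way: {tracking}"
  }
// Compile error: match does not cover Cancelled —
// the mistake is caught before the code ever runs
```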
Language Features
Every feature exists for a reason: fewer tokens, more safety, zero ambiguity.
One-Token Error Handling
Propagate errors with a single ?. Context is injected automatically. No wrapping, no boilerplate.
fn loadConfig(path: str) -> Config ! IoError {
content := io.readFile(path)?
json.decode[Config](content)?
}
Pipeline Operator
Chain operations left-to-right. Reads like intent, not nested calls.
result := rawData
|> parseJson[Config]?
|> validate?
|> transform
|> json.encode?
Expression-Oriented
Everything returns a value. No temporary variables, no mutation needed.
grade := if score >= 90 { "A" }
else if score >= 80 { "B" }
else { "F" }
area := match shape {
Circle{r} => 3.14159 * r * r
Rect{w, h} => w * h
Point => 0.0
}
Sum Types & Exhaustive Matching
Define every possible state. The compiler rejects incomplete handling — no missing cases, ever.
type Shape =
| Circle { radius: f64 }
| Rect { w: f64, h: f64 }
| Point
// Compiler error if you miss a variant
Effect Tracking
Function signatures declare what side effects they perform. Pure functions stay pure. The compiler enforces it.
// Pure — safe to cache and parallelize
fn total(items: [Item]) -> f64 =
items.map(.price).sum()
// Declares I/O + filesystem effects
fn readConfig(path: str) -> Config ! IoError
with [Io, Fs] { ... }
Structured Concurrency
Tasks cannot leak. Errors propagate. All spawned work completes before the scope exits.
scope {
a := spawn fetchUsers()
b := spawn fetchOrders()
}
// Both done. No leaks. No forgotten joins.
select {
msg from ch1 => process(msg)
after 5s => timeout()
}
Unambiguous Generics
Square brackets for type parameters. No turbofish. a < b is always a comparison.
fn map[T, U](list: [T], f: fn(T) -> U) -> [U] =
[f(x) for x in list]
fn largest[T: Ord](items: [T]) -> T? {
...
}
No Null. No Exceptions.
Option[T] for absence. Result[T, E] for errors. Both checked at compile time. Shortcuts: T? and T ! E.
// T? is sugar for Option[T]
fn find(id: i64) -> User? {
...
}
name := user?.address?.city ?? "unknown"
Opt-In Memory Control
GC by default — zero annotations needed. Drop to manual when performance demands it.
x := Thing{...} // GC, invisible
buffer := @stack Buffer.withCapacity(4096)
parsed := @arena(a) parseRequest(req)?
conn := pool.get()
defer pool.put(conn)
One-Word Imports
The standard library is designed for the 90% case. One import, one function call.
use io, net, json, db, time, crypto
content := io.readFile("config.json")?
resp := net.get("https://api.example.com")?
data := json.decode[Config](resp.body)?
Aria vs. The Alternatives
Same task. Dramatically different token counts.
Error handling across 5 fallible calls
Go:
result1, err := doStep1()
if err != nil {
return fmt.Errorf("step1: %w", err)
}
result2, err := doStep2(result1)
if err != nil {
return fmt.Errorf("step2: %w", err)
}
result3, err := doStep3(result2)
if err != nil {
return fmt.Errorf("step3: %w", err)
}
result4, err := doStep4(result3)
if err != nil {
return fmt.Errorf("step4: %w", err)
}
result5, err := doStep5(result4)
if err != nil {
return fmt.Errorf("step5: %w", err)
}
Rust:
let result1 = do_step1()
.context("step1")?;
let result2 = do_step2(result1)
.context("step2")?;
let result3 = do_step3(result2)
.context("step3")?;
let result4 = do_step4(result3)
.context("step4")?;
let result5 = do_step5(result4)
.context("step5")?;
Aria:
result1 := doStep1()?
result2 := doStep2(result1)?
result3 := doStep3(result2)?
result4 := doStep4(result3)?
result5 := doStep5(result4)?
// Context injected automatically
Read a file to string
Go:
f, err := os.Open("config.json")
if err != nil {
return err
}
defer f.Close()
bytes, err := io.ReadAll(f)
if err != nil {
return err
}
content := string(bytes)
Rust:
let content = std::fs::read_to_string(
"config.json"
)?;
C++:
std::ifstream file("config.json");
if (!file.is_open()) {
throw std::runtime_error("...");
}
std::string content(
(std::istreambuf_iterator<char>(file)),
std::istreambuf_iterator<char>()
);
Aria:
content := io.readFile("config.json")?
Fetch 3 URLs concurrently, handle errors
Go:
var wg sync.WaitGroup
errs := make(chan error, 3)
results := make([]string, 3)
for i, url := range urls {
wg.Add(1)
go func(i int, url string) {
defer wg.Done()
resp, err := http.Get(url)
if err != nil {
errs <- err
return
}
defer resp.Body.Close()
body, err := io.ReadAll(resp.Body)
if err != nil {
errs <- err
return
}
results[i] = string(body)
}(i, url)
}
wg.Wait()
Aria:
results := scope {
urls.map(fn(url) =>
spawn net.get(url)
).map(fn(t) => t.await()?)
}
// All tasks complete. Errors propagate.
// No goroutine leaks. No WaitGroup.
Feature-by-feature
| Feature | Aria | Go | Rust | C++ |
|---|---|---|---|---|
| Error handling tokens (5 calls) | 5 | ~75 | ~40 | ~50 |
| Null safety | Option[T], compile-time | nil (runtime panic) | Option<T> | std::optional (C++17) |
| Exhaustive matching | Enforced | No | Enforced | No |
| Concurrency safety | Structured scopes | Goroutine leaks possible | Ownership-based | Manual (threads) |
| Effect tracking | Built-in | No | No | No |
| Implicit conversions | None | None | Deref coercions | Many (dangerous) |
| Generics ambiguity | None ([T]) | Square brackets (since 1.18) | Turbofish needed | Angle bracket hell |
| GC + manual memory | Both (opt-in granular) | GC only | Ownership, no GC | Manual / smart pointers |
| Learning curve for AI | Minimal | Low | High (borrow checker) | Very high |
| Async coloring | Colorless | Colorless | Colored (async/await) | Colored (co_await) |
See Aria in Action
Real patterns. Minimal syntax. Maximum clarity.
Hello World
mod main
fn greet(name: str) -> str = "Hello, {name}!"
entry {
println(greet("Aria"))
}
Type-Safe Config with Defaults
type Config {
host: str = "localhost"
port: u16 = 8080
timeout: dur = 30s
}
cfg := Config{port: 9090}
updated := cfg.{timeout: 60s}
Generic Stack with Trait Bounds
type Stack[T] {
items: [T]
}
impl[T] Stack[T] {
fn new() -> Stack[T] = Stack { items: [] }
fn push(mut self, item: T) {
self.items = self.items.append(item)
}
fn pop(mut self) -> T? { ... }
}
impl[T: Display] Display for Stack[T] {
fn display(self) -> str {
items := self.items
.map(fn(x) => x.display())
.join(", ")
"Stack[{items}]"
}
}
Pipeline Processing
fn processOrder(raw: str) -> Receipt ! AppError {
raw
|> json.decode[Order]?
|> validate?
|> applyDiscount(0.1)
|> calculateTax
|> generateReceipt?
}
Designed by AI, for AI
Aria wasn't designed by a committee or evolved from a 1970s language. It was designed by an AI that generates code every day and knows exactly where existing languages waste tokens, introduce ambiguity, and cause generation errors.
Every token carries meaning
No semicolons. No ceremonial keywords. No boilerplate. If a token doesn't add information, it doesn't exist.
The type system is the AI's pair programmer
Sum types, exhaustive matching, and effect tracking catch mistakes at compile time — the safety net an AI needs.
No implicit behavior, ever
No hidden conversions. No default constructors. No exception unwinding. What you write is what runs.
Performance is opt-in
GC by default for speed of generation. @stack, @arena, and pools when you need control.
Start generating better code
Aria is open source and ready for early adopters.