This example shows an agent constructing an .orb schema from a natural language description. The agent uses Masar to plan the construction order, the LLM to generate each piece, and verification to catch errors before they compound.
```python
final = client.verify(schema=schema)
errors = client.error_check(schema=schema)

if final.valid and not errors.top_errors:
    print(f"Schema valid ({final.probability:.0%}). Ready to compile.")
else:
    print(f"Issues found: {errors.top_errors}")
    # One more repair cycle
    repairs = client.rank_edits(schema=schema, errors=errors.top_errors)
    schema = your_llm.apply_repairs(schema, repairs.suggestions[0])
```
Without Masar, the LLM would generate the entire schema in one shot. That works for simple cases but fails on complex ones: missed transitions, orphan states, broken wiring between traits. By decomposing the task into 18 ordered steps with verification checkpoints, errors get caught and fixed at the level where they occur, not 15 steps later when everything is tangled.
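The step-level loop can be sketched as follows. This is an illustrative stand-in, not Masar's actual API: `generate_step`, `verify_step`, and `repair` are hypothetical stubs that mimic LLM generation, a verification checkpoint, and a repair cycle, just to show errors being fixed at the step where they occur.

```python
# Hypothetical sketch of step-level verification: each piece is checked
# as soon as it is generated, so a bad step is repaired before later
# steps build on it. All names here are illustrative, not Masar's API.

def generate_step(schema, step):
    # Stand-in for an LLM call that appends one construction step.
    return schema + [step]

def verify_step(schema):
    # Stand-in for a verification checkpoint: flag steps marked broken.
    return [s for s in schema if "broken" in s]

def repair(schema, errors):
    # Stand-in for rank_edits + apply_repairs on the flagged steps.
    return [s.replace("broken", "fixed") for s in schema]

plan = ["add state A", "add broken transition", "wire traits"]  # ordered steps
schema = []
for step in plan:
    schema = generate_step(schema, step)
    errors = verify_step(schema)
    if errors:
        # Fix immediately, before the next step depends on the bad piece.
        schema = repair(schema, errors)

print(schema)
```

The point of the loop shape is that the repair runs inside the iteration: a broken transition is corrected before "wire traits" executes, rather than surfacing as a tangle at the end of an 18-step one-shot generation.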