# SLOP: Structured Language for Orchestrating Prompts

A simple, safe, and powerful scripting language designed for AI agent workflows, LLM orchestration, and prompt engineering.
```
# Simple AI agent in SLOP
user_input = "Hello, world!"
response = llm.call(user_input)
emit response
```
SLOP is a domain-specific language that makes it easy to build AI agents and orchestrate language model workflows. Think of it as Python meets AI - simple syntax with powerful built-in features for working with LLMs.
- 🎯 Simple - Python-like syntax you can learn in 5 minutes
- 🔒 Safe - Built-in protections against infinite loops and resource exhaustion
- ⚡ Fast - Lightweight Go runtime with streaming support
- 🔌 AI-Native - Native LLM calls, MCP integration, and schema validation
- 📦 Modular - Organize code into reusable agents and modules
```bash
# Clone and build
git clone https://github.com/standardbeagle/slop.git
cd slop
go build -o slop ./cmd/slop

# Run your first script
echo 'emit "Hello, SLOP! 🚀"' > hello.slop
./slop run hello.slop
```

Create `agent.slop`:
```
# Define a simple greeting agent
def greet(name):
    return "Hello, " + name + "! 👋"

# Use it
message = greet("World")
emit message
```
Run it:

```bash
./slop run agent.slop
# Output: Hello, World! 👋
```

Use cases:

- 🤖 AI Chatbots - Build conversational agents with streaming responses
- 🔄 Workflow Automation - Orchestrate complex LLM workflows
- 📊 Data Processing - Process and validate data for AI applications
- 🛠️ Prompt Engineering - Test and iterate on prompts quickly
- 🌐 Web Apps - Power backends with the SLOP runtime (see chat app example)
Full documentation: standardbeagle.github.io/slop
Stream responses in real-time:
```
emit "Processing step 1..."
emit "Processing step 2..."
emit "Done! ✅"
```
Call language models directly:
```
response = llm.call({
  "messages": [{"role": "user", "content": "Hello!"}],
  "model": "claude-3-5-sonnet"
})
emit response
```
Create custom services accessible from SLOP scripts:
```go
// Go code
type MemoryService struct{}

func (m *MemoryService) Call(method string, args []slop.Value, kwargs map[string]slop.Value) (slop.Value, error) {
	switch method {
	case "read":
		// Handle read
		return slop.NewStringValue("stored value"), nil
	default:
		return nil, fmt.Errorf("unknown method: %s", method)
	}
}

// Register with the runtime
rt := slop.NewRuntime()
rt.RegisterExternalService("memory", &MemoryService{})
```

Then call the service from a SLOP script:

```
# SLOP script
data = memory.read(key: "my_key")
emit data
```
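Under the hood, external service dispatch reduces to a name-to-handler lookup followed by a method-name switch. The following self-contained Go sketch illustrates that pattern; the `Service` interface and registry map here are simplified stand-ins, not the actual `slop` package API:

```go
package main

import "fmt"

// Service is an illustrative stand-in for the runtime's external
// service interface; calls dispatch on a method name.
type Service interface {
	Call(method string, args map[string]string) (string, error)
}

// MemoryService implements Service with a simple key-value store.
type MemoryService struct {
	store map[string]string
}

func (m *MemoryService) Call(method string, args map[string]string) (string, error) {
	switch method {
	case "read":
		return m.store[args["key"]], nil
	case "write":
		m.store[args["key"]] = args["value"]
		return "", nil
	default:
		return "", fmt.Errorf("unknown method: %s", method)
	}
}

func main() {
	// A registry like this is what service registration would populate.
	services := map[string]Service{
		"memory": &MemoryService{store: map[string]string{}},
	}

	// A script call such as memory.read(key: "my_key") becomes a
	// registry lookup plus a Call with the method name and arguments.
	svc := services["memory"]
	svc.Call("write", map[string]string{"key": "my_key", "value": "stored value"})
	out, _ := svc.Call("read", map[string]string{"key": "my_key"})
	fmt.Println(out) // stored value
}
```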
Validate data automatically:
```
schema = {
  "type": "object",
  "properties": {
    "name": {"type": "string"},
    "age": {"type": "integer", "minimum": 0}
  },
  "required": ["name"]
}

# Validate user_data against the schema
validate(user_data, schema)
```
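The checks behind that schema (required field, type, minimum) can be hand-rolled for illustration. This Go sketch validates the same three constraints directly; it is not the runtime's actual validator, and it assumes integer fields arrive as Go `int` values:

```go
package main

import "fmt"

// validate mirrors the schema above: "name" is required and must be
// a string; "age", if present, must be an integer >= 0.
func validate(data map[string]interface{}) error {
	name, ok := data["name"]
	if !ok {
		return fmt.Errorf("missing required field: name")
	}
	if _, ok := name.(string); !ok {
		return fmt.Errorf("name must be a string")
	}
	if age, ok := data["age"]; ok {
		n, ok := age.(int) // assumes ints, not float64 as in decoded JSON
		if !ok {
			return fmt.Errorf("age must be an integer")
		}
		if n < 0 {
			return fmt.Errorf("age must be >= 0")
		}
	}
	return nil
}

func main() {
	fmt.Println(validate(map[string]interface{}{"name": "Ada", "age": 36})) // <nil>
	fmt.Println(validate(map[string]interface{}{"age": 5}))                 // missing required field: name
}
```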
Automatic protections for production use:
```
# Loops are automatically limited
for i in range(1000000):  # Safe - won't run forever
    process(i)

# Timeouts prevent hanging
with timeout("30s"):
    slow_operation()
```
SLOP is built with a clean, extensible architecture:
- Lexer - Tokenizes SLOP source code
- Parser - Builds an Abstract Syntax Tree (AST)
- Evaluator - Executes the AST with a Go runtime
- Built-ins - Rich standard library for common tasks
- Safety - Automatic limits and protections
All components are well-tested with 200+ unit tests.
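The lexer stage at the front of that pipeline can be illustrated with a toy tokenizer. This self-contained Go sketch recognizes just identifiers and string literals; the real SLOP lexer produces a much richer token set:

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// Token is a minimal illustration of the lexer's output.
type Token struct {
	Kind  string // "ident", "string", or "eof"
	Value string
}

// lex tokenizes a tiny SLOP-like fragment. For simplicity it assumes
// string literals are properly terminated.
func lex(src string) []Token {
	var tokens []Token
	i := 0
	for i < len(src) {
		c := rune(src[i])
		switch {
		case unicode.IsSpace(c):
			i++ // skip whitespace
		case c == '"':
			end := strings.IndexByte(src[i+1:], '"')
			tokens = append(tokens, Token{"string", src[i+1 : i+1+end]})
			i += end + 2
		default:
			start := i
			for i < len(src) && !unicode.IsSpace(rune(src[i])) {
				i++
			}
			tokens = append(tokens, Token{"ident", src[start:i]})
		}
	}
	return append(tokens, Token{"eof", ""})
}

func main() {
	for _, t := range lex(`emit "Hello, SLOP!"`) {
		fmt.Printf("%s(%q)\n", t.Kind, t.Value)
	}
}
```

From here, a parser would turn the token stream into an AST node such as `Emit(StringLiteral("Hello, SLOP!"))`, which the evaluator then executes.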
A complete AI chat app with React + SLOP backend:
```bash
cd examples/chat-app
./start.sh
# Frontend: http://localhost:3000
# Backend: http://localhost:8080
```

Features:
- Real-time streaming responses
- Multiple AI agents
- Vercel AI SDK integration
- Beautiful modern UI
Contributions are welcome! Some ways to help:
- 🐛 Report bugs or request features
- 📖 Improve documentation
- 🔧 Submit pull requests
- 💡 Share your SLOP agents
See CONTRIBUTING.md for guidelines.
MIT License - see LICENSE for details.
- Documentation: standardbeagle.github.io/slop
- GitHub: github.com/standardbeagle/slop
- Issues: github.com/standardbeagle/slop/issues
- Discussions: github.com/standardbeagle/slop/discussions
Built with ❤️ by the SLOP community