
SLOP

Structured Language for Orchestrating Prompts

A simple, safe, and powerful scripting language designed for AI agent workflows, LLM orchestration, and prompt engineering.

# Simple AI agent in SLOP
user_input = "Hello, world!"
response = llm.call(user_input)
emit response

🚀 What is SLOP?

SLOP is a domain-specific language that makes it easy to build AI agents and orchestrate language model workflows. Think of it as Python meets AI - simple syntax with powerful built-in features for working with LLMs.

Why SLOP?

  • 🎯 Simple - Python-like syntax you can learn in 5 minutes
  • 🔒 Safe - Built-in protections against infinite loops and resource exhaustion
  • ⚡ Fast - Lightweight Go runtime with streaming support
  • 🔌 AI-Native - Native LLM calls, MCP integration, and schema validation
  • 📦 Modular - Organize code into reusable agents and modules

📖 Quick Start

Installation

# Clone and build
git clone https://github.com/standardbeagle/slop.git
cd slop
go build -o slop ./cmd/slop

# Run your first script
echo 'emit "Hello, SLOP! 🚀"' > hello.slop
./slop run hello.slop

Your First Agent

Create agent.slop:

# Define a simple greeting agent
def greet(name):
    return "Hello, " + name + "! 👋"

# Use it
message = greet("World")
emit message

Run it:

./slop run agent.slop
# Output: Hello, World! 👋

💡 What Can You Build?

  • 🤖 AI Chatbots - Build conversational agents with streaming responses
  • 🔄 Workflow Automation - Orchestrate complex LLM workflows
  • 📊 Data Processing - Process and validate data for AI applications
  • 🛠️ Prompt Engineering - Test and iterate on prompts quickly
  • 🌐 Web Apps - Power backends with the SLOP runtime (see chat app example)

📚 Documentation

Full documentation: standardbeagle.github.io/slop

🎯 Key Features

Streaming with emit

Stream responses in real-time:

emit "Processing step 1..."
emit "Processing step 2..."
emit "Done! ✅"

Native LLM Integration

Call language models directly:

response = llm.call({
    "messages": [{"role": "user", "content": "Hello!"}],
    "model": "claude-3-5-sonnet"
})
emit response

External Service Integration

Create custom services accessible from SLOP scripts:

// Go code
type MemoryService struct{}

func (m *MemoryService) Call(method string, args []slop.Value, kwargs map[string]slop.Value) (slop.Value, error) {
    switch method {
    case "read":
        // Handle read
        return slop.NewStringValue("stored value"), nil
    default:
        return nil, fmt.Errorf("unknown method: %s", method)
    }
}

// Register with runtime
rt := slop.NewRuntime()
rt.RegisterExternalService("memory", &MemoryService{})

# SLOP script
data = memory.read(key: "my_key")
emit data
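
Putting the two pieces together, a minimal embedding program might look like the sketch below. This is illustrative only: slop.NewRuntime, RegisterExternalService, slop.Value, and slop.NewStringValue come from the snippet above, but the module import path and the rt.Run entry point are assumptions - check the runtime's actual API before relying on them.

// main.go - embedding sketch (import path and Run are assumptions)
package main

import (
    "fmt"

    slop "github.com/standardbeagle/slop" // assumed module path
)

type MemoryService struct{}

func (m *MemoryService) Call(method string, args []slop.Value, kwargs map[string]slop.Value) (slop.Value, error) {
    switch method {
    case "read":
        return slop.NewStringValue("stored value"), nil
    default:
        return nil, fmt.Errorf("unknown method: %s", method)
    }
}

func main() {
    rt := slop.NewRuntime()
    rt.RegisterExternalService("memory", &MemoryService{})

    // Hypothetical entry point: the runtime may instead expose RunFile,
    // Execute, or similar - consult the package documentation.
    result, err := rt.Run(`
data = memory.read(key: "my_key")
emit data
`)
    if err != nil {
        panic(err)
    }
    fmt.Println(result)
}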

Schema Validation

Validate data automatically:

schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer", "minimum": 0}
    },
    "required": ["name"]
}

# Validate user_data against the schema
validate(user_data, schema)

Safety Built-in

Automatic protections for production use:

# Loops are automatically limited
for i in range(1000000):  # Safe - won't run forever
    process(i)

# Timeouts prevent hanging
with timeout("30s"):
    slow_operation()

🏗️ Architecture

SLOP is built with a clean, extensible architecture:

  • Lexer - Tokenizes SLOP source code
  • Parser - Builds an Abstract Syntax Tree (AST)
  • Evaluator - Executes the AST with a Go runtime
  • Built-ins - Rich standard library for common tasks
  • Safety - Automatic limits and protections

All components are well-tested with 200+ unit tests.
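
To make the stage hand-offs concrete, here is a deliberately tiny, self-contained Go toy (not SLOP's actual internals - every name here is hypothetical) that walks the same lexer → parser → evaluator flow for a string of added integers:

// Toy pipeline: lex -> parse -> eval. Illustrative only, not SLOP code.
package main

import (
    "fmt"
    "strconv"
    "strings"
)

// Lexer: split the source text into tokens.
func lex(src string) []string {
    return strings.Fields(src)
}

// Parser: turn tokens into a minimal "AST" - here, just the operand list.
func parse(tokens []string) ([]int, error) {
    var operands []int
    for i, tok := range tokens {
        if i%2 == 1 { // odd positions must be the "+" operator
            if tok != "+" {
                return nil, fmt.Errorf("expected '+', got %q", tok)
            }
            continue
        }
        n, err := strconv.Atoi(tok)
        if err != nil {
            return nil, fmt.Errorf("expected integer, got %q", tok)
        }
        operands = append(operands, n)
    }
    return operands, nil
}

// Evaluator: walk the parsed structure and produce a value.
func eval(operands []int) int {
    sum := 0
    for _, n := range operands {
        sum += n
    }
    return sum
}

func main() {
    tokens := lex("1 + 2 + 3")
    ast, err := parse(tokens)
    if err != nil {
        panic(err)
    }
    fmt.Println(eval(ast)) // prints 6
}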

📦 Example: Chat Application

A complete AI chat app with React + SLOP backend:

cd examples/chat-app
./start.sh
# Frontend: http://localhost:3000
# Backend: http://localhost:8080

Features:

  • Real-time streaming responses
  • Multiple AI agents
  • Vercel AI SDK integration
  • Beautiful modern UI

View full example →

🤝 Contributing

Contributions are welcome! Some ways to help:

  • 🐛 Report bugs or request features
  • 📖 Improve documentation
  • 🔧 Submit pull requests
  • 💡 Share your SLOP agents

See CONTRIBUTING.md for guidelines.

📄 License

MIT License - see LICENSE for details.

Built with ❤️ by the SLOP community
