LSH is a Python-based shell wrapper that integrates a local Large Language Model (LLM) via Ollama. It behaves like a normal shell for standard operations but acts as an intelligent agent when commands fail: if a command exits with a non-zero status, LSH captures the output, sends it to the LLM, and automatically suggests a fix.
## Features

- **Native Shell Experience**: Uses `pty` (pseudo-terminals) to ensure colors, formatting, and real-time output streaming work exactly like Bash/Zsh.
- **Smart Error Recovery**: Automatically detects non-zero exit codes and consults the LLM.
- **Interactive Tool Bypass**: Detects tools like `vim`, `htop`, `ssh`, and `k9s`, and runs them directly without output capturing to ensure full UI compatibility.
- **Agent Mode**: Option to automatically execute the LLM's suggested fix (`AGENT_MODE = True`).
- **Privacy First**: Runs entirely locally using Ollama. No shell history or data is sent to the cloud.
## Requirements

- Python 3.8+
- **Ollama**: You need Ollama installed and running locally.
- **An LLM Model**: You need to pull a model (e.g., `gemma`, `llama3`, `mistral`).
## Installation

1. **Clone or save the script**: Save the code as `lsh.py`.
2. **Install Python dependencies**:

   ```bash
   pip install -r requirements.txt
   ```

3. **Prepare Ollama**: Make sure Ollama is running and that you have pulled the model specified in the script.

   ```bash
   # Start Ollama server
   ollama serve

   # In a new terminal, pull the model (matches LLM_MODEL in lsh.py)
   ollama pull gemma3n:latest
   ```
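Under the hood, consulting a local Ollama instance generally means a POST to its REST API. Below is a minimal sketch of what that request could look like, assuming Ollama's standard `/api/generate` endpoint; the prompt wording and function names here are illustrative, not the script's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"   # Ollama's default address
LLM_MODEL = "gemma3n:latest"            # matches LLM_MODEL in lsh.py

def build_fix_request(command, output, exit_code):
    """Build a JSON payload for Ollama's /api/generate endpoint
    (prompt text is a hypothetical example)."""
    prompt = (
        f"The shell command `{command}` failed with exit code {exit_code}.\n"
        f"Output:\n{output}\n"
        "Reply with a single corrected command."
    )
    return {"model": LLM_MODEL, "prompt": prompt, "stream": False}

def ask_ollama(payload):
    """POST the payload and return the model's text response."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

payload = build_fix_request("gi status", "lsh: command not found: gi", 127)
```

With `stream: False`, Ollama returns one JSON object whose `response` field holds the full completion, which is simpler to parse than the default streaming mode.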
## Configuration

Open `lsh.py` in your text editor to tweak the settings at the top of the file.

Modify `IGNORE_LIST` in `lsh.py` to add more tools that should bypass the LLM capture logic (e.g., custom TUI apps):

```python
IGNORE_LIST = {
    "vi", "vim", "nvim", "htop", "top", "ssh", "tmux", ...
}
```

## Usage

Start the shell:

```bash
python3 lsh.py
```

or in agent mode:

```bash
python3 lsh.py --agent
```

To customize the LLM:

```bash
python3 lsh.py --llm gpt-oss:120b
```

To customize the Ollama address:

```bash
python3 lsh.py --ollama http://localhost:11434
```

Works exactly like your standard terminal.
```
lsh:/home/user$ ls -la
lsh:/home/user$ echo "Hello World"
```

If you make a typo or run a command incorrectly:
```
lsh:/home/user$ gi status
# Output: lsh: command not found: gi
Analyzing failure with gemma3n:latest... Done.
Suggested Fix: git status
Run this command? [y/N] y
# (Runs 'git status' successfully)
```

If you don't know what command to use, just ask the shell how to run it.
```
lsh:/home/user$ List all the text file
# Output: List: all the text file...
Analyzing failure with gemma3n:latest... Done.
Suggested Fix: ls *.txt
Run this command? [y/N]
```

## How It Works

1. **Input Parsing**: LSH reads your input and checks if the command is in the `IGNORE_LIST`.
2. **PTY Streaming**:
   - If ignored (e.g., `vim`), it hands over control via `subprocess.run`.
   - If normal (e.g., `ls`, `grep`), it creates a master/slave pseudo-terminal pair. This allows `lsh` to read output bytes as they are generated (preserving colors and animations) while simultaneously storing them in a buffer.
3. **Exit Code Check**: Upon completion, if the exit code is `0`, the buffer is discarded.
4. **LLM Context**: If the exit code is `!= 0`, the buffered output is sent to the local Ollama instance with a prompt to fix the issue.
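The check-then-stream loop above can be sketched in plain Python. This is a minimal illustration under the README's description; `needs_bypass` and `run_capturing` are hypothetical names, not the script's actual functions:

```python
import os
import pty
import select
import shlex
import subprocess
import sys

IGNORE_LIST = {"vi", "vim", "nvim", "htop", "top", "ssh", "tmux"}

def needs_bypass(line):
    """True if the command's first word is a known interactive tool."""
    tokens = shlex.split(line)
    return bool(tokens) and os.path.basename(tokens[0]) in IGNORE_LIST

def run_capturing(line):
    """Stream a command's output live through a PTY while buffering it."""
    if needs_bypass(line):
        # Interactive tools get the real terminal, with no capture.
        return subprocess.run(line, shell=True).returncode, ""
    master, slave = pty.openpty()
    proc = subprocess.Popen(line, shell=True, stdout=slave,
                            stderr=slave, close_fds=True)
    os.close(slave)                      # parent keeps only the master end
    buf = bytearray()
    while True:
        ready, _, _ = select.select([master], [], [], 0.1)
        if ready:
            try:
                chunk = os.read(master, 1024)
            except OSError:              # child closed its end of the PTY
                break
            if not chunk:
                break
            sys.stdout.buffer.write(chunk)   # live echo, colors intact
            sys.stdout.flush()
            buf.extend(chunk)                # copy kept for the LLM
        elif proc.poll() is not None:
            break
    os.close(master)
    return proc.wait(), buf.decode(errors="replace")

code, output = run_capturing("echo hello; false")
```

Because the child writes to a PTY rather than a pipe, most programs stay in "terminal mode" and keep emitting colors and progress animations, which a plain `subprocess.PIPE` capture would suppress.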
## Limitations

- **Aliases**: Because this runs in Python, aliases defined in your `.bashrc` or `.zshrc` might not be available unless you explicitly source them or run the shell in a specific interactive mode (though basic path executables work fine).
- **Environment Variables**: `export` commands run inside `lsh` only persist for the current session.
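One possible workaround for the alias limitation is to ask an interactive Bash (which sources `.bashrc`) to dump its aliases and parse the result. A sketch under that assumption; `parse_aliases` and `load_bash_aliases` are hypothetical helpers, and `alias` output formatting can vary between shells:

```python
import subprocess

def parse_aliases(alias_output):
    """Parse lines like  alias ll='ls -la'  into a name -> command dict."""
    aliases = {}
    for line in alias_output.splitlines():
        if line.startswith("alias ") and "=" in line:
            name, _, value = line[len("alias "):].partition("=")
            aliases[name.strip()] = value.strip().strip("'")
    return aliases

def load_bash_aliases():
    """Ask an interactive bash (-i makes it source ~/.bashrc) for aliases."""
    out = subprocess.run(["bash", "-ic", "alias"],
                         capture_output=True, text=True).stdout
    return parse_aliases(out)

sample = "alias ll='ls -la'\nalias gs='git status'"
```

The parsed dict could then be used to expand the first word of a command before execution, mimicking shell alias substitution.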
## License

MIT License - Feel free to modify and distribute.