Control a full Linux operating system with natural language, directly in your browser.
Enigma Shell is an experimental web interface that lets you talk to an operating system. Instead of typing complex shell commands, you simply describe what you want to do in plain English.
This project merges two cutting-edge technologies:
- v86: An x86 emulator written in JavaScript that runs a full Alpine Linux image—complete with its own kernel and file system—directly in your browser.
- Ollama: A client that connects to your local Ollama instance and uses an LLM (such as Llama 3 or Mistral) to translate your natural-language instructions into executable bash commands.
The result is a fluid and intuitive experience where the LLM becomes your personal systems engineer, driving a VM right before your eyes.
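The translation step can be sketched in TypeScript. This is a minimal, illustrative sketch, not the project's actual code: the prompt wording, the helper names, and the qwen:4b default are assumptions, while the non-streaming /api/generate call is Ollama's standard REST API.

```typescript
// Sketch: turning a natural-language instruction into one bash command
// via a local Ollama instance.

const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build a prompt that asks the model for a single bash command.
export function buildPrompt(instruction: string): string {
  return [
    "You are a Linux shell assistant.",
    "Reply with exactly one bash command and nothing else.",
    `Task: ${instruction}`,
  ].join("\n");
}

// Strip optional markdown fences the model may wrap around its answer.
export function extractCommand(raw: string): string {
  return raw
    .replace(/^```(?:bash|sh)?\n?/, "")
    .replace(/\n?```$/, "")
    .trim();
}

// Non-streaming request; assumes Ollama is running on its default port.
export async function translate(
  instruction: string,
  model = "qwen:4b",
): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: buildPrompt(instruction), stream: false }),
  });
  const data = await res.json();
  return extractCommand(data.response);
}
```

The fence-stripping step matters in practice because instruction-tuned models often wrap answers in markdown even when told not to.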
- Full Linux Environment: A true, virtualized Alpine Linux VM, not a simulation.
- Natural Language Control: Give commands like "create a folder named 'projects' and enter it" or "tell me the kernel version."
- 100% Client-Side: All application logic runs in your browser, ensuring responsiveness and privacy.
- Local LLM: Uses your own Ollama instance, keeping your data and prompts private.
- Reactive UI: Built with React, TypeScript, and Vite for a modern and high-performance user experience.
Before you begin, ensure you have the following installed and running on your machine:
- Node.js (v18 or higher)
- Ollama: Download and install Ollama.
- An LLM Model: Pull an instruction-tuned model, e.g.,
ollama pull qwen:4b
- Clone the repository:
git clone https://github.com/eauchs/enigma-shell.git
cd enigma-shell
- Install dependencies:
npm install
- Run the application:
npm run dev
- Open your browser to the provided address (usually http://localhost:5173).
Ensure the Ollama application is running in the background before starting the development server.
Once the application loads, the Alpine Linux VM will boot up. Wait for the boot process to complete and for the login prompt to appear.
- Use the main input field to write your objective in natural language.
- Press "Enter".
- Watch as the LLM types and executes the corresponding command in the VM's terminal.
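The "typing" effect can be sketched as below. serial0_send is v86's API for writing to the guest's serial console; the helper names and the per-character delay are illustrative assumptions.

```typescript
// Sketch: feeding a generated command into the v86 guest character by
// character, so the user can watch the LLM "type" in the terminal.

type Emulator = { serial0_send(chars: string): void };

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

// Terminate the command with a newline so the guest shell executes it.
export function withNewline(command: string): string {
  return command.endsWith("\n") ? command : command + "\n";
}

export async function typeCommand(
  emulator: Emulator,
  command: string,
  delayMs = 30,
): Promise<void> {
  for (const ch of withNewline(command)) {
    emulator.serial0_send(ch);
    await sleep(delayMs);
  }
}
```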
Enigma Shell is a solid foundation. Here are a few ideas to take it to the next level:
- Multi-Step Planning: Integrate a framework like CrewAI (via a Python backend communicating over WebSockets) to allow a team of agents to pursue complex, multi-command objectives (e.g., "install an Nginx server and deploy this website").
- State Persistence: Allow the VM's disk state to be saved in localStorage or IndexedDB to resume a session where you left off.
- File Uploads: Add the ability to upload files from the host machine into the VM's file system.
- Model Selection: Allow the user to choose which Ollama model to use directly from the UI.
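The state-persistence idea could be sketched as follows, assuming v86's save_state()/restore_state() promise API. The storage interface is an assumption made so the same code can target IndexedDB in the browser or an in-memory map elsewhere; all names here are hypothetical.

```typescript
// Sketch: persisting a VM snapshot between sessions behind a minimal
// key/value store interface (IndexedDB in the browser, a Map in tests).

interface SnapshotStore {
  put(key: string, value: ArrayBuffer): Promise<void>;
  get(key: string): Promise<ArrayBuffer | undefined>;
}

type Emulator = {
  save_state(): Promise<ArrayBuffer>;
  restore_state(state: ArrayBuffer): Promise<void>;
};

// In-memory stand-in for an IndexedDB-backed store.
export class MemoryStore implements SnapshotStore {
  private map = new Map<string, ArrayBuffer>();
  async put(key: string, value: ArrayBuffer): Promise<void> {
    this.map.set(key, value);
  }
  async get(key: string): Promise<ArrayBuffer | undefined> {
    return this.map.get(key);
  }
}

export async function saveSession(
  emulator: Emulator,
  store: SnapshotStore,
  key = "enigma-vm",
): Promise<void> {
  await store.put(key, await emulator.save_state());
}

// Returns true if a snapshot was found and restored.
export async function resumeSession(
  emulator: Emulator,
  store: SnapshotStore,
  key = "enigma-vm",
): Promise<boolean> {
  const state = await store.get(key);
  if (!state) return false;
  await emulator.restore_state(state);
  return true;
}
```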
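The model-selection idea maps naturally onto Ollama's /api/tags endpoint, which lists locally pulled models. A minimal sketch, assuming the documented response shape ({ models: [{ name: ... }] }); the helper names are hypothetical:

```typescript
// Sketch: populating a model picker from a local Ollama instance.

interface TagsResponse {
  models: { name: string }[];
}

// Pure helper: pull model names out of the API response.
export function modelNames(tags: TagsResponse): string[] {
  return tags.models.map((m) => m.name);
}

// Fetch the list of locally available models (assumes Ollama is running).
export async function listModels(
  baseUrl = "http://localhost:11434",
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  return modelNames(await res.json());
}
```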
This project is distributed under the MIT License. See the LICENSE file for more information.