Automaland

The AI-native automation platform for creative professionals. Build, debug, and run flows in seconds.

Downloads

  • macOS (Apple Silicon & Intel): Download from GitHub. Beta; currently untested on macOS.
  • Windows (64-bit Windows 10 or 11): Download from GitHub. Beta.

Beta Software Notice: Automaland is currently in beta. It is provided "as is" without any warranty of any kind. Please use it at your own risk and ensure you save your work frequently.


Documentation

Core Features

1. The TriPanel Architecture

Automaland splits automation into three coupled layers:

  • UI Panel (JSON Schema): Defines user inputs (Text, File Pickers, Dropdowns).
  • Node.js Panel: The "Brain". Handles the file system, networking, and orchestration logic.
  • Host App Panel (ExtendScript): The "Hands". Executes native commands inside the target application.
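As a rough illustration of how the three layers relate, here is a hypothetical flow object. The property names (ui, node, host) and the listImages helper are invented for this sketch; they are not Automaland's real flow schema.

```javascript
// Hypothetical sketch of a flow's three coupled panels.
// None of these property names are Automaland's actual schema.
const resizeFlow = {
  // UI Panel: a JSON Schema describing the user-facing inputs
  ui: {
    type: "object",
    properties: {
      sourceFolder: { type: "string", title: "Source Folder" },
      width: { type: "number", title: "Target Width (px)", default: 1024 }
    },
    required: ["sourceFolder"]
  },
  // Node.js Panel: orchestration logic (file system, networking)
  node: `
    const files = listImages(inputs.sourceFolder); // hypothetical helper
    return files.map(f => ({ path: f, width: inputs.width }));
  `,
  // Host App Panel: ExtendScript executed inside the target app
  host: `
    var doc = app.open(new File(item.path));
    doc.resizeImage(UnitValue(item.width, "px"));
  `
};

console.log(Object.keys(resizeFlow)); // the three coupled layers
```

The split keeps privileged work (disk, network) in Node.js while the ExtendScript layer only ever touches the host application's DOM.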

2. Watchers (Auto-Triggers)

Configure watchers in Settings to automate flows without opening the app.

  • Folder Watcher: Triggers a flow when a file is added/modified in a specific folder.
  • Schedule Watcher: Runs a flow at a set interval (e.g., every 60 seconds).

Supported AI Models

Automaland supports a wide range of models. Select "Custom Model..." in settings to use a model not listed in the presets.

  • Google Gemini: gemini-3-flash, gemini-3-pro, gemini-3-deep-think, gemini-2.5-pro, gemini-2.5-flash, gemini-2.0-flash, gemini-2.0-flash-lite
  • OpenAI: gpt-5.2, gpt-5.3-codex, gpt-5-mini, o3, o1, o4-mini, gpt-4.5, gpt-4o
  • Anthropic Claude: claude-4-6-opus-latest, claude-4-5-sonnet-latest, claude-4-5-haiku-latest, claude-3-7-sonnet-20250224, claude-3-5-sonnet-latest

Custom & Local Models

You can connect Automaland to local LLM runtimes (such as Ollama or LM Studio) or to enterprise endpoints.

Using Ollama (Local)

  1. Run Ollama: ollama serve
  2. In Automaland Settings > AI Provider, select Custom.
  3. API Base URL: http://localhost:11434/v1/chat/completions
  4. Model Name: Enter your model (e.g., llama3.2, mistral).
  5. API Key: Enter ollama (any string works).
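The Custom provider settings above describe an OpenAI-compatible chat completions request. The sketch below shows what that request looks like on the wire; buildChatRequest is an illustrative helper, not part of Automaland, and only the URL, headers, and body shape matter.

```javascript
// Build an OpenAI-compatible chat completion request for a local
// Ollama server. buildChatRequest is a hypothetical helper.
function buildChatRequest(baseUrl, apiKey, model, prompt) {
  return {
    url: baseUrl,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Ollama ignores the key's value, but clients that require
        // an Authorization header still need one to be set.
        "Authorization": `Bearer ${apiKey}`
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }]
      })
    }
  };
}

const req = buildChatRequest(
  "http://localhost:11434/v1/chat/completions",
  "ollama",
  "llama3.2",
  "Summarize this document."
);
// To actually send it: fetch(req.url, req.options)
```

This is the same request shape Automaland would send for any Custom endpoint, which is why the LM Studio steps below differ only in the base URL and model name.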

Using LM Studio

  1. Start LM Studio Local Server.
  2. In Automaland Settings > AI Provider, select Custom.
  3. API Base URL: http://localhost:1234/v1/chat/completions
  4. Model Name: Use the identifier shown in LM Studio.

Host App Setup

Automaland communicates with host applications via a local bridge. No plugins are required in most cases.

Supported Applications

  • ✅ Photoshop
  • ✅ Illustrator
  • ✅ InDesign
  • ✅ Premiere Pro
  • ✅ After Effects
  • ⚠️ Other Apps (Generic Bridge - Experimental)

Troubleshooting Connections

  • Ensure the target app is open before running a flow.
  • On macOS, if you see a permission popup ("Automaland wants to control Photoshop"), click OK.
  • If the app is not detected, check the Logs tab in Settings for "Bridge Error".