Self-Hosted Server Setup

Run AI IN A BOX on your own infrastructure for complete control.

Prerequisites

  • Server or VM with Node.js v18+
  • Network access from ServiceNow to your server (directly or via a MID Server)
  • AI IN A BOX Server Software - Requires NDA (contact us)
  • LLM API key (OpenAI, Azure OpenAI, or self-hosted LLM)

How It Works

The AI IN A BOX server is stateless: all configuration is sent from ServiceNow with each request:

  • LLM provider URL and API key
  • Model name and parameters
  • Callback URL for streaming responses

The server only needs Node.js and Basic Auth credentials.
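
As a rough illustration of the stateless design, here is what a request might carry. The endpoint path and field names below are hypothetical placeholders, not the actual wire format:

# Hypothetical request shape -- endpoint path and field names are illustrative only
curl -u "$AIIAB_USERNAME:$AIIAB_PASSWORD" \
  -H "Content-Type: application/json" \
  -d '{
        "llm_url": "https://api.openai.com/v1",
        "llm_api_key": "sk-...",
        "model": "gpt-4",
        "callback_url": "https://your-instance.service-now.com/..."
      }' \
  https://your-server.example.com/chat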

Installation

# Get the code (from us, after the NDA) and place it in /opt/aiinabox
cd /opt/aiinabox

# Install dependencies
npm install

# Start the server
npm start

Configure Auth

# In .env or environment
AIIAB_USERNAME=your-username
AIIAB_PASSWORD=your-password
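
The credentials are whatever you choose, but the identical pair must be entered in ServiceNow later (see Configure ServiceNow below). One way to generate a strong password:

# Generate a random 24-byte, base64-encoded password
openssl rand -base64 24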

Run as Service (Recommended)

# Install PM2
npm install -g pm2

# Start and save
pm2 start npm --name aiinabox -- start
pm2 save
pm2 startup
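
Note that pm2 startup prints a bootstrap command you must run once (typically with sudo) so PM2 itself restarts on reboot. For day-to-day management:

# Check process status and tail logs
pm2 status
pm2 logs aiinabox

# Restart after a configuration or code change
pm2 restart aiinabox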

Network Options

Option 1: Direct (Public Internet)

The server must be publicly accessible and serve a valid SSL/TLS certificate.

ServiceNow → Internet → Your Server → LLM
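
Before configuring ServiceNow, you can confirm reachability and the certificate from any machine on the internet (the hostname below is a placeholder for your own):

# Check that the endpoint answers over HTTPS
curl -sSI https://your-server.example.com/

# Inspect the certificate that is being served
openssl s_client -connect your-server.example.com:443 -servername your-server.example.com </dev/null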

Option 2: Via MID Server

The server only needs to be reachable from your ServiceNow MID Server.

ServiceNow → MID Server → Your Server → LLM
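
A quick reachability check, run from the MID Server host (the URL and port are placeholders for your deployment):

# Expect an HTTP response, not a connection timeout
curl -sSI -u your-username:your-password http://your-server.internal:3000/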

Configure ServiceNow

In ServiceNow, go to AI IN A BOX > Command Center and configure:

  • AI IN A BOX Server URL: The URL where your AI IN A BOX server is reachable
  • AI IN A BOX Auth: username:password (the AIIAB_USERNAME and AIIAB_PASSWORD values set above)
  • LLM URL: Your LLM endpoint (e.g., https://api.openai.com/v1)
  • LLM API Key: Your API key
  • Model: Model name (e.g., gpt-4)

The Command Center automatically checks server connectivity on page load.

Self-Hosted LLMs

You can run your own LLM with Ollama:

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama2

# Serve it
ollama serve
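
Ollama serves an OpenAI-compatible API on port 11434, so you can sanity-check the endpoint with curl before touching ServiceNow:

# Send a test chat completion to the local Ollama endpoint
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama2", "messages": [{"role": "user", "content": "Say hello"}]}'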

Then in ServiceNow, set:

  • LLM URL: http://localhost:11434/v1 (localhost here means the host running the AI IN A BOX server; use a reachable address if Ollama runs elsewhere)
  • Model: llama2

Troubleshooting

Server won't start

  • Check Node.js version: node --version (must be v18+)
  • Check port availability (see the example below)
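
To see whether another process already holds the port (substitute the port your deployment uses; 3000 is a placeholder):

# List listeners on the port (Linux)
ss -ltnp | grep :3000

# Alternative using lsof
lsof -i :3000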

ServiceNow can't connect

  • Verify server is running
  • Check firewall rules
  • Verify Basic Auth credentials match on both ends (test below)
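
You can test reachability and credentials directly from any machine that can reach the server (placeholder URL). A 401 response means the server is up but the credentials do not match:

# Print only the HTTP status code
curl -sS -o /dev/null -w '%{http_code}\n' -u your-username:your-password https://your-server.example.com/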

LLM errors

  • Check LLM configuration in ServiceNow
  • Verify the LLM API key is valid (see the check below)
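
For OpenAI, the key can be verified independently of AI IN A BOX; a 200 response listing models confirms it works:

# List available models with your key
curl -sS https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"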

Support