Self-Hosted PaaS Alternative Gateway

A self-hosted PaaS alternative gateway provides the developer experience and features of managed platforms without surrendering control of your infrastructure. It handles TLS termination, custom domains, load balancing, authentication, observability, and security, making it a good fit for teams migrating from Heroku or Vercel who want to keep the operational polish.

With this setup, you can:

  • Load balance across multiple app instances with automatic failover
  • Terminate TLS and use custom domains professionally
  • Add authentication and security without changing your application code
  • Monitor traffic and performance with comprehensive logging
  • Protect against attacks with WAF-like rules and rate limiting
  • Handle geo-aware routing and DDoS protection at the edge

1. Create endpoints for your application instances

Start multiple Agent Endpoints with the same URL to create an Endpoint Pool for automatic load balancing. Replace $PORT with your application ports. You can also use one of our SDKs or the Kubernetes Operator.

If you only have one instance now, you can add additional replicas later for redundancy through load balancing.

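The agent command for each instance looks roughly like the following sketch. The internal URL `https://service.internal` is a placeholder; every instance must use the same URL so they join the same pool:

```shell
# Start one Agent Endpoint per application instance.
# Repeat this on each host, substituting that instance's port.
ngrok http $PORT --url https://service.internal --pooling-enabled
```

Run the same command (with its own $PORT) on every replica you want in the pool.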
tip

By using the same URL with --pooling-enabled, ngrok automatically creates an Endpoint Pool that distributes traffic round-robin across all healthy instances with automatic failover.

2. Reserve a domain

Navigate to the Domains section of the ngrok dashboard to use your free dev domain, click New + to reserve a custom domain like https://your-service.ngrok.app, or use a custom domain you already own.

We'll refer to this domain as $NGROK_DOMAIN from here on out.

tip

For a production PaaS alternative, consider using a custom domain you own instead of an ngrok subdomain.

3. Create a Cloud Endpoint

Navigate to the Endpoints section of the ngrok dashboard, then click New + and Cloud Endpoint.

In the URL field, enter the domain you just reserved to finish creating your Cloud Endpoint.

4. Create a vault and secrets

Store your authentication and API secrets securely using Traffic Policy Secrets.

First, create a vault to store your application secrets:

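A sketch of creating a vault through the ngrok API with curl. The vault name `paas-secrets` is an example, and the request shape is an approximation; consult the ngrok API reference for exact fields:

```shell
# Create a vault to hold Traffic Policy secrets.
# Requires an API key from the ngrok dashboard in $NGROK_API_KEY.
curl -s -X POST https://api.ngrok.com/vaults \
  -H "Authorization: Bearer $NGROK_API_KEY" \
  -H "Ngrok-Version: 2" \
  -H "Content-Type: application/json" \
  -d '{"name": "paas-secrets", "description": "Secrets for the PaaS gateway"}'
```

The response includes the vault's ID, which you'll need in the next step.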

Then add your secrets using the vault ID from the response:

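Adding a secret likely looks something like this. The endpoint path, field names, and the secret name `api-token` are assumptions for illustration; double-check them against the ngrok API reference:

```shell
# Add a secret to the vault, using the vault ID from the previous response.
# $VAULT_ID, "api-token", and the value are placeholders.
curl -s -X POST https://api.ngrok.com/vault_secrets \
  -H "Authorization: Bearer $NGROK_API_KEY" \
  -H "Ngrok-Version: 2" \
  -H "Content-Type: application/json" \
  -d '{"vault": {"id": "'"$VAULT_ID"'"}, "name": "api-token", "value": "example-secret-value"}'
```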

5. Apply Traffic Policy to your Cloud Endpoint

While still viewing your new cloud endpoint in the dashboard, copy and paste the policy below into the editor. Make sure you change each of the following values:

  • $GITHUB_CLIENT_ID: Replace with your GitHub OAuth app client ID
  • $YOUR_ADMIN_EMAIL: Replace with your admin email address for OAuth access
  • $YOUR_ADMIN_USERNAME: Replace with the username for API authentication
  • https://service.internal: Replace with your actual pooled endpoint URL
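As a rough sketch of the shape such a policy takes, the fragment below combines OAuth on admin routes, rate limiting, and forwarding to the pooled internal endpoint. The action names are real Traffic Policy actions, but the specific rules, paths, and limits here are illustrative, not the full policy this guide ships:

```yaml
on_http_request:
  # Require GitHub OAuth on admin routes.
  - expressions: ["req.url.path.startsWith('/admin')"]
    actions:
      - type: oauth
        config:
          provider: github
  # Rate limit all traffic by client IP, then forward to the pool.
  - actions:
      - type: rate-limit
        config:
          name: ddos-protection
          algorithm: sliding_window
          capacity: 100
          rate: 60s
          bucket_key:
            - conn.client_ip
      - type: forward-internal
        config:
          url: https://service.internal
on_http_response:
  # Add a common security header to every response.
  - actions:
      - type: add-headers
        config:
          headers:
            strict-transport-security: "max-age=31536000"
```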

What's happening here? This policy creates a production-ready PaaS alternative. On every HTTP request, the policy enforces WAF protection, DDoS mitigation via rate limiting, bot blocking for unwanted crawlers and Tor traffic, and authentication for admin and API routes before forwarding the request to the pool of upstream services.

On every HTTP response, the policy enforces WAF rules on the headers and body, then adds common security headers to the response.

6. Try out your endpoint

Visit the domain you reserved either in the browser or in the terminal using a tool like curl. You should see the app or service running on the port connected to your internal Agent Endpoint.

Test a few of the features you added with Traffic Policy:

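The checks probably look something like the curl commands below. The `/admin` path and the expected status codes are assumptions based on the policy described above, not verified output:

```shell
# Basic request: should return your app through the gateway.
curl -i https://$NGROK_DOMAIN/

# Admin route: should redirect to GitHub OAuth rather than serve the app.
curl -i https://$NGROK_DOMAIN/admin

# Burst of requests: once the limit is exceeded, later ones
# should be rejected by the rate-limit action (HTTP 429).
for i in $(seq 1 120); do
  curl -s -o /dev/null -w "%{http_code}\n" "https://$NGROK_DOMAIN/"
done
```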

What's next?