12 min read

How to Run a CI/CD Pipeline That Tests Against a Live Local Environment

Run database integration tests inside GitHub Actions using service containers, and test webhook handlers against your local code using a Localtonet HTTP tunnel. No staging server needed.

🔄 CI/CD · GitHub Actions · Integration Testing · Local Environment · Developer Workflow


Unit tests run in the cloud just fine. The tricky part is integration tests that need to call a real API, hit a real database, or verify a webhook flow against a service running on your machine. This guide explains two complementary approaches: running the test environment inside the CI runner itself, and exposing a local environment to an external CI pipeline using a Localtonet tunnel so tests run against exactly the code and data you are working with right now.

⚙️ GitHub Actions examples 🐳 Service containers 🌍 Tunnel-based testing 🔗 Webhook integration testing

The Integration Testing Problem

Unit tests are easy to run in CI because they have no external dependencies. They test a function, mock everything else, and the runner does not need to know anything about the real world. You push, the pipeline runs, you get a green or red result.

Integration tests are different. They verify that different parts of your system work together correctly: your API and its database, your app and a third-party payment provider, your webhook handler and the service that sends the webhooks. Each of these requires a real running service to test against, not a mock.

Three scenarios come up repeatedly in real projects:

🗄️ Database integration tests: Your tests need a real PostgreSQL or MySQL instance to verify queries, migrations, and transactions behave correctly.
🪝 Webhook handler tests: Stripe, GitHub, or another service needs to deliver a real webhook to your local handler so you can verify end-to-end processing.
🔗 Third-party API callback tests: OAuth flows, payment redirects, and notification callbacks all require a publicly reachable URL that the provider can reach.

Two Complementary Approaches

There is no single answer that covers every scenario. Two approaches work well and are often used together in the same project.

Approach A: Run everything inside the CI runner

Spin up your application, database, and any other dependencies as Docker containers inside the CI runner using GitHub Actions service containers. The test suite runs on the same runner and calls localhost. This is fully automated, requires no external access, and works on every pull request. It covers unit tests, API tests, and database integration tests very well.

Approach B: Expose your local environment to an external trigger

Run your full local environment on your development machine and expose it to the internet using a Localtonet HTTP tunnel. The CI pipeline or external service calls your local environment directly. This is the right approach for webhook testing, OAuth callback testing, and any scenario where a third-party service needs to initiate the connection to your code.

Approach A: Service Containers in GitHub Actions

GitHub Actions supports service containers: Docker containers that run alongside your workflow job and are reachable from it on localhost. This is the cleanest way to run database integration tests in CI without any external setup.

Node.js API with PostgreSQL integration tests

# .github/workflows/integration-tests.yml
name: Integration Tests

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  integration:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: testuser
          POSTGRES_PASSWORD: testpassword
          POSTGRES_DB: testdb
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run database migrations
        env:
          DATABASE_URL: postgresql://testuser:testpassword@localhost:5432/testdb
        run: npm run migrate

      - name: Run integration tests
        env:
          DATABASE_URL: postgresql://testuser:testpassword@localhost:5432/testdb
          NODE_ENV: test
        run: npm run test:integration

The postgres service container starts before your test steps. The --health-cmd pg_isready option makes GitHub Actions wait until the database is fully ready before running your tests. Your test code connects to localhost:5432 exactly as it would locally.

Python with PostgreSQL and Redis

jobs:
  integration:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:16-alpine
        env:
          POSTGRES_USER: testuser
          POSTGRES_PASSWORD: testpassword
          POSTGRES_DB: testdb
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

      redis:
        image: redis:7-alpine
        ports:
          - 6379:6379
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Install dependencies
        run: pip install -r requirements.txt

      - name: Run integration tests
        env:
          DATABASE_URL: postgresql://testuser:testpassword@localhost:5432/testdb
          REDIS_URL: redis://localhost:6379
        run: pytest tests/integration/ -v

Health checks are essential

Always add options with health check commands to service containers. Without them, the workflow may start running tests before the database has finished initialising, causing random failures that are hard to debug.
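As a belt-and-braces measure, the test suite itself can also poll the service port before the first query runs. Here is a minimal sketch in Python; the wait_for_port helper and its parameters are illustrative, not part of any framework:

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means the service is at least listening.
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)  # not up yet, retry shortly
    return False

# Example: block until the postgres service container is reachable.
# assert wait_for_port("127.0.0.1", 5432), "database never became ready"
```

Note that a reachable port only proves the server is listening, not that initialisation has finished, so this complements rather than replaces the container health check.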

Approach B: Expose a Local Environment with Localtonet

Service containers solve the database problem but they cannot help when a third-party service needs to call your code. Stripe cannot deliver a webhook to a GitHub Actions runner. Google cannot redirect an OAuth callback to a Docker container inside CI. For these scenarios, you need your local environment to be reachable from the internet.

The setup is straightforward. Run your application locally, create a Localtonet HTTP tunnel, and configure the third-party service to use the tunnel URL. Your local server receives the real traffic and your test suite asserts the outcome.

1

Start your application locally

Run your full stack: application server, database, and any other dependencies. The application should be in a clean test state, ideally with a seeded test database.

2

Create an HTTP tunnel for your application port

Go to the HTTP tunnel page, set local IP to 127.0.0.1 and the port your app is running on. Click Create and start the tunnel. Note the public HTTPS URL.

3

Configure the third-party service with the tunnel URL

Update webhook endpoints, OAuth redirect URIs, or callback URLs in the external service to use your Localtonet tunnel URL. The service can now reach your local application.

4

Trigger the test and observe

Trigger the event in the external service: a test payment, a push to a repo, a form submission. Watch your local application receive the real request and verify the outcome in your test suite or by inspecting your local database.
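Step 4 is inherently asynchronous: the external service delivers the event on its own schedule, so an immediate assertion can fail even when everything works. A common pattern is to poll for the expected side effect. A hypothetical sketch in Python (poll_until and payment_exists are illustrative names, not from any framework):

```python
import time

def poll_until(predicate, timeout: float = 10.0, interval: float = 0.25) -> bool:
    """Re-check predicate until it returns truthy or the timeout expires.

    The webhook arrives whenever the external service sends it, so the
    test polls the local database (or a log file) for the expected side
    effect instead of asserting immediately after triggering the event.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False

# Hypothetical usage: wait until the handler has recorded the payment.
# assert poll_until(lambda: payment_exists("pi_123"), timeout=30)
```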

Webhook Integration Testing in Practice

Webhook testing is the most common reason developers need a public URL during local development. Here is a concrete example using Stripe, but the same pattern applies to GitHub webhooks, Twilio, Shopify, or any other service that delivers events via HTTP.

Testing a Stripe payment webhook locally

1

Start your application with a webhook handler

// webhook.js (Express example)
const express = require('express');
const stripe = require('stripe')(process.env.STRIPE_SECRET_KEY);
const app = express();

app.post('/webhooks/stripe',
    express.raw({ type: 'application/json' }),
    (req, res) => {
        const sig = req.headers['stripe-signature'];
        let event;

        try {
            event = stripe.webhooks.constructEvent(
                req.body,
                sig,
                process.env.STRIPE_WEBHOOK_SECRET
            );
        } catch (err) {
            return res.status(400).send(`Webhook Error: ${err.message}`);
        }

        if (event.type === 'payment_intent.succeeded') {
            const paymentIntent = event.data.object;
            console.log('Payment succeeded:', paymentIntent.id);
            // Update your database, send confirmation email, etc.
        }

        res.json({ received: true });
    }
);

app.listen(3000, () => console.log('Server running on port 3000'));

2

Create a Localtonet HTTP tunnel for port 3000

Start Localtonet and create an HTTP tunnel for port 3000. Note the public HTTPS URL, for example https://abc123.localto.net.

3

Register the webhook URL in the Stripe dashboard

In the Stripe dashboard under Developers → Webhooks, add a new endpoint: https://abc123.localto.net/webhooks/stripe. Copy the signing secret and add it to your local environment as STRIPE_WEBHOOK_SECRET.

4

Trigger a test event and verify

In the Stripe dashboard, use Send test webhook to deliver a payment_intent.succeeded event to your endpoint. Your local handler receives the real webhook, processes it, and you can verify the outcome in your database or application logs.
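For reference, the stripe.webhooks.constructEvent call in step 1 performs signature verification along the lines of the following standalone sketch, based on Stripe's documented signing scheme (HMAC-SHA256 over "{timestamp}.{payload}", carried in the Stripe-Signature header as t=... and v1=...). This is illustrative only; in production, use the official library:

```python
import hashlib
import hmac
import time

def verify_stripe_signature(payload: bytes, header: str, secret: str,
                            tolerance: int = 300) -> bool:
    """Check a Stripe-Signature header against the raw request body.

    Stripe signs f"{timestamp}.{payload}" with HMAC-SHA256 and sends the
    timestamp as t=... and the hex signature as v1=... in the header.
    """
    parts = dict(item.split("=", 1) for item in header.split(","))
    timestamp, expected = parts.get("t"), parts.get("v1")
    if timestamp is None or expected is None:
        return False
    if abs(time.time() - int(timestamp)) > tolerance:
        return False  # reject stale events to limit replay attacks
    signed_payload = f"{timestamp}.".encode() + payload
    digest = hmac.new(secret.encode(), signed_payload,
                      hashlib.sha256).hexdigest()
    return hmac.compare_digest(digest, expected)
```

Understanding this also explains why the Express handler uses express.raw: the signature is computed over the exact bytes Stripe sent, so the body must not be re-serialised before verification.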

Testing a GitHub webhook locally

# webhook_handler.py (Flask example)
from flask import Flask, request, abort
import hmac
import hashlib
import os

app = Flask(__name__)

@app.route('/webhooks/github', methods=['POST'])
def github_webhook():
    signature = request.headers.get('X-Hub-Signature-256', '')
    secret = os.environ['GITHUB_WEBHOOK_SECRET'].encode()
    body = request.get_data()

    expected = 'sha256=' + hmac.new(secret, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        abort(401)

    event = request.headers.get('X-GitHub-Event')
    payload = request.json

    if event == 'push':
        branch = payload['ref'].split('/')[-1]
        commit = payload['head_commit']['message']
        print(f'Push to {branch}: {commit}')
        # Trigger your CI logic, notify team, etc.

    return '', 200

if __name__ == '__main__':
    app.run(port=5000)

Create the Localtonet tunnel for port 5000, go to your GitHub repository under Settings → Webhooks → Add webhook, and enter your tunnel URL as the Payload URL. Every push to the repository delivers a real webhook to your local handler.
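Before wiring up the real repository, you can exercise the handler's signature check with a synthetic delivery. The sketch below computes the X-Hub-Signature-256 value the same way GitHub does; sign_github_payload is an illustrative helper, not a GitHub library function:

```python
import hashlib
import hmac
import json

def sign_github_payload(secret: str, body: bytes) -> str:
    """Compute the X-Hub-Signature-256 header value for a request body."""
    mac = hmac.new(secret.encode(), body, hashlib.sha256)
    return "sha256=" + mac.hexdigest()

# Build a minimal push payload and print a curl-ready signature value.
body = json.dumps({
    "ref": "refs/heads/main",
    "head_commit": {"message": "test commit"},
}).encode()
print(sign_github_payload("mysecret", body))
```

You can then POST the payload to the handler with curl, setting X-Hub-Signature-256 to the printed value and X-GitHub-Event to push, to confirm the handler accepts a valid signature and rejects a tampered body.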

When to Use Each Approach

| Test scenario | Service containers in CI | Localtonet tunnel |
| --- | --- | --- |
| Database queries and migrations | Best fit | Works, but overkill |
| REST API endpoint tests | Best fit | Works, but overkill |
| Stripe / payment webhooks | Cannot receive external events | Best fit |
| GitHub / GitLab webhooks | Cannot receive external events | Best fit |
| OAuth callback testing | Provider cannot redirect to runner | Best fit |
| Testing against real production data | Runner has no access | Direct access to local DB |
| Pull request automated testing | Runs automatically on every PR | Requires manual trigger |
Database tests: use service containers in CI
Webhook tests: use a Localtonet tunnel
OAuth callbacks: use a Localtonet tunnel
PR automation: use service containers in CI

Frequently Asked Questions

Can I use service containers with databases other than PostgreSQL?

Yes. Any service with an official Docker image works as a GitHub Actions service container. MySQL, MariaDB, MongoDB, Redis, Elasticsearch, and RabbitMQ all work the same way. Use the services block with the appropriate image, set the environment variables for credentials, and connect from your tests via localhost and the mapped port. Always add health check options so GitHub Actions waits until the service is ready before running your test steps.
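For example, a MySQL service block might look like the following sketch; the image tag, credentials, and health command are illustrative, so check the image's documentation for the exact health check that suits your version:

```yaml
services:
  mysql:
    image: mysql:8
    env:
      MYSQL_ROOT_PASSWORD: testpassword
      MYSQL_DATABASE: testdb
    ports:
      - 3306:3306
    options: >-
      --health-cmd "mysqladmin ping --silent"
      --health-interval 10s
      --health-timeout 5s
      --health-retries 5
```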

My webhook tests need a stable URL that does not change between sessions. How do I get one?

Reserve a fixed subdomain for your HTTP tunnel in the Localtonet dashboard. Once reserved, the tunnel URL stays the same every time you start Localtonet, so you do not have to update the webhook URL in Stripe, GitHub, or any other service between sessions. You can also attach a custom domain to the tunnel for a fully permanent and branded URL.

Can I run the Localtonet client inside a GitHub Actions workflow?

Yes. You can download and start the Localtonet Linux binary as a workflow step, authenticate with a stored secret, and create a tunnel for a service running on the same runner. This is useful for exposing a service inside the CI runner to an external system that needs to reach it. Download the binary, make it executable, run it in the background, and capture the tunnel URL for use in subsequent steps.

How do I handle test database isolation so tests do not interfere with each other?

The standard approaches are: wrapping each test in a transaction that is rolled back after the test runs, creating a fresh schema per test suite and dropping it afterwards, or using a separate database per test worker. Most testing frameworks have built-in support for one of these. In GitHub Actions, the service container database starts fresh for every workflow run, which already gives you isolation between pipeline runs. Within a single run, use your framework's transaction rollback or schema isolation features.

What is the difference between integration tests and end-to-end tests in this context?

Integration tests verify that two or more components of your system work together correctly: your API and its database, your handler and the message queue. End-to-end tests verify the full user journey from the browser through the entire stack. Both need real running services. Service containers in CI handle both well for self-contained stacks. The Localtonet tunnel approach adds value when the test involves a third-party service initiating the interaction, which is true for some integration tests (webhooks, OAuth) and some end-to-end tests (payment flows).

Test Against Your Real Local Environment

Create a free Localtonet account, expose your local application with an HTTP tunnel, and point your webhook provider or OAuth service at the tunnel URL. Real events, real data, real tests without a staging deployment.

Create Free Localtonet Account →

Localtonet is a secure multi-protocol tunneling and proxy platform designed to expose localhost, devices, private services, and AI agents to the public internet. It supports HTTP/HTTPS tunnels, TCP/UDP forwarding, mobile proxy infrastructure, file server publishing, latency-optimized game connectivity, and developer-ready AI agent endpoint exposure from a single unified control plane.
