Wallarm Node Deployment Options¶
Wallarm supports multiple deployment models, from Security Edge and Kubernetes to cloud VMs and API gateways. Choose based on who hosts the Node, the infrastructure that handles your traffic, and whether you need inline or out-of-band protection.
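Whichever inline option you end up with, a quick way to confirm that traffic is actually being filtered is to send a harmless test attack through the protected endpoint; Wallarm guides commonly use a Path Traversal probe for this. The sketch below is a minimal example using the Python requests library; the endpoint URL is a placeholder, and the 403 response assumes the node runs in blocking mode.

```python
# Minimal smoke test for an inline Wallarm deployment (sketch).
# The URL below is a placeholder -- point it at a host that is actually
# behind your node. A Path Traversal probe is the kind of test attack
# commonly used to confirm that filtering works.
import requests

PROTECTED_URL = "https://api.example.com/etc/passwd"  # placeholder endpoint

resp = requests.get(PROTECTED_URL, timeout=10)

if resp.status_code == 403:
    print("Blocked: the node is filtering traffic inline (blocking mode).")
else:
    # In monitoring or out-of-band setups the request passes through,
    # but the attack should still be registered in Wallarm Console.
    print(f"Passed with status {resp.status_code}; check Wallarm Console for the event.")
```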
Security Edge¶
Fully managed at the edge: no Node to run. Traffic goes to Wallarm’s edge, is filtered, and forwarded to your origin. Choose Security Edge for zero-infrastructure API security.
Kubernetes¶
Choose a Kubernetes option if your APIs run in-cluster and you want in-cluster protection or a connector to your mesh/gateway.
Istio
gRPC-based external processing filter for Istio-managed APIs
NGINX Ingress Controller
Deploy the NGINX Ingress Controller with integrated Wallarm services
Kong Ingress Controller
Deploy Wallarm to secure APIs managed by Kong Ingress Controller
Helm Chart for Native Node
Run the Native Node in Kubernetes (for connectors and Istio filter)
Sidecar Proxy
Deploy Wallarm Sidecar controller for pod security
eBPF (out-of-band)
Out-of-band deployment on Kubernetes using the eBPF technology
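Whichever of the Kubernetes options above you pick, a quick post-install check is to confirm that the Wallarm workloads are running in the cluster. The sketch below uses the official Kubernetes Python client; the namespace and label selector are assumptions, since each chart and controller labels its resources differently.

```python
# Sketch: verify Wallarm pods after an in-cluster deployment.
# The namespace and label selector are assumptions -- each Wallarm chart
# or controller uses its own labels, so adjust them to your install.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() when running in-cluster
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(
    namespace="wallarm",                                # assumed namespace
    label_selector="app.kubernetes.io/name=wallarm",    # assumed selector
)

for pod in pods.items:
    ready = all(c.ready for c in (pod.status.container_statuses or []))
    print(f"{pod.metadata.name}: phase={pod.status.phase}, ready={ready}")
```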
Cloud platforms¶
Choose a cloud option if you run in a public or private cloud and want ready-to-use artifacts (AMIs, Docker, Terraform, etc.).
Amazon Web Services
Artifacts for Wallarm deployment on AWS
Google Cloud
Artifacts for Wallarm deployment on GCP
Microsoft Azure
Artifacts for Wallarm deployment on Microsoft Azure
Alibaba Cloud
Artifacts for Wallarm deployment on Alibaba Cloud
Heroku
Build a Wallarm Docker image and run it on Heroku
Private Cloud
Deploy Wallarm in a private or hybrid cloud
Amazon Web Services
Artifacts for Wallarm deployment on AWS
AMI for NGINX Node
Use the official Amazon Machine Image for NGINX Node (inline)
AMI for Native Node
Use the official Amazon Machine Image for Native Node (connectors)
Docker on ECS
Use the Docker image with Elastic Container Service
Terraform module
Use the Terraform module for Wallarm deployment on AWS
Overview
Terraform module for Wallarm on AWS
Proxy in AWS VPC
Wallarm as proxy in AWS Virtual Private Cloud
Proxy for Amazon API Gateway
Wallarm as proxy for Amazon API Gateway protection
Google Cloud
Artifacts for Wallarm deployment on GCP
Machine Image for NGINX Node
Use the official Google Cloud Machine Image for NGINX Node
Docker on GCE
Use the Docker image with Google Compute Engine
Microsoft Azure
Artifacts for Wallarm deployment on Microsoft Azure
Azure Container Instances
Use the Docker image with Azure Container Instances
Alibaba Cloud
Artifacts for Wallarm deployment on Alibaba Cloud
Docker on ECS
Use the Docker image with Alibaba Elastic Compute Service
API gateways¶
Choose an API gateway connector if traffic already flows through a gateway and you want to add protection alongside it.
CDN¶
Choose a CDN or edge integration if your traffic is fronted by a CDN and you want protection at the edge.
Akamai EdgeWorkers
Secure APIs running on Akamai EdgeWorkers
CloudFront
Deploy Wallarm to secure traffic delivered through Amazon CloudFront
Azion Edge
Secure APIs running on Azion Edge
Cloudflare
Deploy Wallarm to secure traffic routed through Cloudflare
Fastly
Deploy Wallarm to secure APIs running on Fastly
API management platform¶
Choose an API management connector if you expose APIs through one of these platforms and want to add security without changing it.
MuleSoft
Use Wallarm Node to secure APIs managed by MuleSoft
Azure API Management
Deploy Wallarm to secure APIs managed by Azure API Management
Apigee
Secure APIs running on Apigee
IBM API Connect
Deploy Wallarm to secure APIs managed through IBM API Connect
MuleSoft
Use Wallarm Node to secure APIs managed by MuleSoft
MuleSoft Flex Gateway
Deploy Wallarm to secure APIs managed by the Flex Gateway
MuleSoft Mule Gateway
Deploy Wallarm to secure APIs managed by the Mule Gateway
TCP traffic mirror¶
Deploy the Wallarm Node for TCP traffic mirror analysis when you need out-of-band protection based on mirrored network-layer traffic. The Node analyzes the mirrored TCP streams to detect attacks without affecting the production flow.
Packages & Containers¶
Choose packages or containers if you run on VMs or bare metal and prefer them over managed options.
Linux OS
Let Wallarm detect your OS version to install the appropriate modules
Docker
Use the NGINX-based or Native Node Docker image for Wallarm deployment; a minimal run sketch follows this list
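For the Docker image option, a container can be started with little more than a node token and an upstream to protect. The sketch below uses the Docker SDK for Python; the image name and environment variable names are assumptions here and should be checked against the Docker deployment guide for your node type.

```python
# Sketch: run the NGINX-based Wallarm Node container via the Docker SDK.
# The image name and environment variable names are assumptions --
# confirm them in the Docker deployment guide before relying on them.
import docker

client = docker.from_env()

container = client.containers.run(
    "wallarm/node:latest",                # assumed image name
    detach=True,
    ports={"80/tcp": 80},
    environment={
        "WALLARM_API_TOKEN": "<node token from Wallarm Console>",  # assumed variable name
        "NGINX_BACKEND": "origin.example.com",                     # assumed variable name: your upstream
    },
)
print(f"Started container {container.short_id}; send a test request to confirm filtering.")
```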
On-premise¶
Run the full stack (Nodes + Cloud) in your datacenter. Choose the on-premise option for compliance, data residency, or internal policy requirements.
Special setups¶
Deployment options that don’t follow the platform matrix: known scenarios (multi-tenant, separate postanalytics, custom NGINX) and custom requests when nothing above fits.
Multi-tenant Node deployment
Run Nodes for multiple tenants with per-account data and access isolation, ideal for SaaS and MSPs
Separate postanalytics
Deploy the postanalytics module on a dedicated host to scale it independently and offload the Filtering Node
Custom NGINX version
Use Wallarm with a custom NGINX build when standard packages do not match your stack
Request custom deployment
Need something else? Request a custom deployment or integration