Lead Forward-Deployed Infrastructure Engineer (Air-Gapped Systems)
Take our AI governance control plane across the air gap — package, deploy, and maintain sovereign Kenshiki appliances on bare-metal hardware inside secure enclaves.
Stack: Bare-metal Linux, Kubernetes/Docker, PostgreSQL, local LLM inference (vLLM/Ollama), zero-trust networking
Why this role exists
Our target clients — defense, financial institutions, secure government — operate in zero-trust environments where sending data to a third-party API is a compliance violation. They need Kenshiki's three-plane architecture — the bounded-synthesis pipeline, SIRE identity enforcement, and governance-grade telemetry — running on their hardware, inside their network, with no external dependencies. We've built the engine. You take it across the air gap and make it run.
What you'll do
- Package for air-gapped deployment. Bundle the full TypeScript/PostgreSQL stack — Kura evidence stores, Claim Ledger persistence, Boundary Gate runtime — alongside open-weight LLMs and embedding models into immutable offline artifacts. Docker tarballs, Helm charts, everything needed to install where npm install and docker pull don't exist.
- Provision bare-metal hardware. Configure client servers — Linux kernel tuning, NVIDIA drivers, Kadai inference engine optimization (vLLM, Ollama) — to maximize utilization without external dependencies.
- Secure the database. Deploy PostgreSQL with FIPS 140-2 validated encryption at rest, integrate it with client IAM systems, and meet enterprise compliance standards for data handling.
- Represent Kenshiki on-site. Work directly with client IT and security officers to navigate firewall rules, zero-trust architectures, and compliance audits. You are the face of our engineering in their facility.
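The offline packaging flow looks roughly like this — a minimal sketch, assuming Docker images are moved by tarball and Helm charts by packaged archive; the image names, tags, and chart paths are hypothetical:

```shell
# On the connected build host: export images (all layers included) to tarballs
# for transfer across the air gap. Image names/tags below are placeholders.
docker save kenshiki/control-plane:1.4.2 -o control-plane-1.4.2.tar
docker save postgres:16 -o postgres-16.tar

# Package the Helm chart into a versioned, self-contained .tgz
helm package ./charts/kenshiki --destination ./bundle

# On the air-gapped host: load images into the local daemon (or push them to
# an in-enclave registry the cluster can pull from), then install the chart.
docker load -i control-plane-1.4.2.tar
docker load -i postgres-16.tar
helm install kenshiki ./bundle/kenshiki-1.4.2.tgz \
  --namespace kenshiki --create-namespace
```

In practice a containerd-based cluster would import images with `ctr` or pull from a local registry mirror rather than `docker load`, but the shape of the artifact — everything bundled, nothing fetched — is the point.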
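Serving a local open-weight model with no external dependencies is the kind of task the provisioning bullet describes. A minimal sketch using vLLM's OpenAI-compatible server, with a pre-staged model directory — the model path, parallelism, and port are assumptions:

```shell
# Hard-disable any attempt to reach the Hugging Face Hub
export HF_HUB_OFFLINE=1

# Serve a model staged on local disk; tune GPU memory utilization and
# tensor parallelism to the hardware actually in the rack.
python -m vllm.entrypoints.openai.api_server \
  --model /opt/models/llama-3-8b-instruct \
  --tensor-parallel-size 2 \
  --gpu-memory-utilization 0.90 \
  --port 8000
```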
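Because PostgreSQL has no native transparent data encryption, encryption at rest is typically done at the volume layer. One common sketch, assuming a LUKS-encrypted data volume on a FIPS-enabled kernel — the device path and mount point are placeholders:

```shell
# Create a LUKS2-encrypted volume for the PostgreSQL data directory.
# On a FIPS-enabled host, cryptsetup selects FIPS-approved key derivation.
cryptsetup luksFormat --type luks2 /dev/nvme1n1
cryptsetup open /dev/nvme1n1 pgdata

# Format, mount, and hand the volume to PostgreSQL
mkfs.xfs /dev/mapper/pgdata
mount /dev/mapper/pgdata /var/lib/postgresql
chown postgres:postgres /var/lib/postgresql
```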
What we're looking for
- Linux & networking depth: Bare-metal provisioning, complex networking topologies. You debug with raw logs, not cloud dashboards.
- Offline containerization: Expert-level Docker and Kubernetes in on-premises, disconnected environments.
- Hardware literacy: VRAM constraints, PCIe bandwidth, and how to physically optimize an inference server (L4, H100, or equivalent) for maximum throughput on large language models.
- Security clearance: Experience in regulated or classified environments. Active or recent DoD clearance is a strong plus; eligibility for clearance is required.
- Independence: You can fly to a client site, assess their infrastructure, and solve the deployment problems without waiting for instructions.
Why Kenshiki?
While most of the industry builds wrappers around third-party APIs, you'll deploy fully sovereign, governed AI systems — SIRE identity enforcement, Boundary Gate verification, Claim Ledger provenance — into the most secure facilities that exist. The Clean Room environment was designed for exactly this.