Marco Patzelt
January 2, 2026

The "Runtime Compiler": Using AI to generate "Last-Mile" Logic

Traditional software development is like laying rigid pipes. I explain why I no longer build static endpoints but instead architect Agents (powered by Gemini 3 Pro) that write integration code at runtime. The shift is from "Weeks to Insight" to "Seconds to Answer".

The "Runtime Compiler": Using AI to Generate "Last-Mile" Logic

The Efficiency Gap in Static Architecture

I work frequently with Stability-Focused Architects who operate under a maxim I generally agree with: Predictability is the highest form of engineering.

It is reasonable—and often necessary—for an enterprise to lock down core business logic into rigid, statically typed APIs. You do not want a creative algorithm handling your payment gateway or core mutation logic. Those "pipes" must be laid in concrete to ensure data integrity and security.

However, there is a trade-off. While this rigidity protects the core, it creates an Efficiency Gap at the edge.

Business logic, particularly around analysis and reporting, evolves much faster than deployment cycles allow. In the traditional model, every new business question requires a developer to act as a data conduit:

  1. Business needs a specific revenue slice.
  2. Engineering builds GET /api/revenue-slice.
  3. Engineering updates the ETL pipeline.
  4. Engineering deploys a dashboard widget.
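
To make the friction concrete, here is what step 2 often looks like in practice: a minimal sketch of a hardcoded endpoint, assuming an Express app and a Postgres pool. The route, table, and filters are invented for illustration.

```typescript
import express from "express";
import { Pool } from "pg";

const app = express();
const db = new Pool({ connectionString: process.env.DATABASE_URL });

// One endpoint, one frozen question: monthly revenue for EU enterprise customers.
// Any new slice (region, segment, time window) means a new ticket and a new deploy.
app.get("/api/revenue-slice", async (_req, res) => {
  const { rows } = await db.query(
    `SELECT date_trunc('month', ordered_at) AS month, SUM(total) AS revenue
       FROM orders
      WHERE region = 'EU' AND segment = 'enterprise'
      GROUP BY 1
      ORDER BY 1`
  );
  res.json(rows);
});

app.listen(3000);
```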

By the time this ticket is closed, the business context has often shifted. We are effectively hardcoding answers to questions asked two weeks ago. This is where the "Runtime Compiler" approach offers a strategic alternative.

The Solution: A Hybrid "Last-Mile" Layer

I propose a hybrid architecture. We keep the core transaction layer hardcoded for safety, but we implement an AI-driven Runtime Logic Layer for read-only analysis and dynamic reporting.

Instead of cementing every possible analytical question into a TypeScript function, we use a Large Language Model (LLM) as a Just-in-Time (JIT) compiler.

The Workflow:

  1. Intent: The user asks a complex, non-standard question.
  2. Compilation: The Agent, having access to schema definitions (not raw data), generates the necessary SQL or API glue code in real-time.
  3. Sandboxing: This code is executed in a strictly isolated, read-only environment (e.g., a Vercel Edge Function or a database read-replica).
  4. Synthesis: The system returns the answer, not just a raw dataset.
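
A minimal sketch of that loop in TypeScript, assuming a Postgres read replica and an OpenAI-compatible chat-completions endpoint; the environment variables, prompt, and helper names are illustrative assumptions, not part of any specific product.

```typescript
import { Pool } from "pg";

// The connection points at a read replica and uses a database role
// that only has SELECT grants; enforcement lives in the database, not here.
const replica = new Pool({ connectionString: process.env.READ_REPLICA_URL });

// 2. Compilation: ask the model for SQL, giving it schema definitions only.
async function compileQuery(question: string, schemaDdl: string): Promise<string> {
  const res = await fetch(`${process.env.LLM_BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({
      model: process.env.LLM_MODEL,
      messages: [
        {
          role: "system",
          content:
            "Answer with a single PostgreSQL SELECT statement and nothing else. " +
            "You only see the schema below, never row data:\n" + schemaDdl,
        },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim();
}

// 3. Sandboxing + 4. Synthesis: run the generated SELECT against the replica,
// then return the result (a second model call can phrase it as an answer).
export async function answer(question: string, schemaDdl: string) {
  const sql = await compileQuery(question, schemaDdl);
  if (!/^select\b/i.test(sql)) throw new Error("Refusing non-SELECT statement");
  const { rows } = await replica.query(sql);
  return { sql, rows };
}
```

The important detail is what the model never sees: the prompt carries schema DDL only, and the statement runs on a replica whose role cannot write.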

Defining the Boundary: Mitigating Reliability Risk

The immediate concern for any Infrastructure Manager is obvious: "You want an LLM to write code in production? That sounds dangerous."

This concern is valid if the boundaries are not defined. To make this approach "Special Ops"-ready, we must apply strict Guardrails:

  1. Read-Only Principle: The Runtime Compiler is never granted write permissions: no INSERT, UPDATE, DELETE, or DDL statements. It is an analytical tool, not a transactional one.
  2. Sandboxed Execution: The generated code runs in ephemeral containers that die immediately after execution.
  3. Human-in-the-Loop Validation: For recurring queries, the AI-generated logic can be "pinned" and reviewed by a senior engineer, essentially automating the drafting phase of API development.
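
As a rough illustration of Guardrails 1 and 3 in application code, a second line of defense behind database-level permissions: the keyword list, the single-statement rule, and the pinning helper below are assumptions for the sketch, not a fixed specification.

```typescript
// Defense in depth: the database role already has SELECT-only grants,
// and the application still refuses anything that is not a plain read.
const WRITE_KEYWORDS = /\b(insert|update|delete|drop|alter|truncate|grant|create)\b/i;

export function assertReadOnly(sql: string): string {
  const statement = sql.trim().replace(/;+\s*$/, "");
  if (statement.includes(";")) throw new Error("Multiple statements rejected");
  if (!/^(select|with)\b/i.test(statement)) throw new Error("Only SELECT/WITH allowed");
  if (WRITE_KEYWORDS.test(statement)) throw new Error("Write keyword rejected");
  return statement;
}

// Guardrail 3: once a recurring question has been reviewed by an engineer,
// its query is "pinned" and the model drops out of the loop for that question.
const pinnedQueries = new Map<string, string>(); // question -> reviewed SQL

export function pin(question: string, reviewedSql: string): void {
  pinnedQueries.set(question, assertReadOnly(reviewedSql));
}
```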

This is not about replacing the engineer; it is about automating the "Last-Mile" of logic that is too transient to warrant a full deployment cycle.

The "War Room" Test: A Strategic Comparison

To visualize the trade-off, consider a sudden supply chain crisis, such as a blockage in the Suez Canal.

The Legacy Habit (Static Dashboards): The CEO enters the War Room asking for the revenue impact of specific delayed containers. Standard dashboards are designed for historical trends, not geopolitical anomalies. The data exists, but it is siloed.

  • Process: Developers must manually extract ship manifests, cross-reference them with SQL order tables, and calculate margins in a spreadsheet.
  • Latency: 24–48 hours.
  • Result: Decisions are made based on stale data or estimation.

The Runtime Compiler Approach: The executive queries the internal interface: "Match the manifest of the 'Ever Given' against our pending orders and calculate Q3 revenue risk."

The Agent acts as the architect of the moment:

  1. Fetch: Calls the logistics API for the manifest.
  2. Join: Generates a temporary SQL query to join those IDs against the internal orders table.
  3. Calculate: Sums the value of impacted rows.
  • Latency: 30 seconds.
  • Result: "14 containers identified. Total value: €4.2M. Q3 Risk: -2.1%."

The Verdict: Architecture for the Unknown

We must stop treating every analytical question as a software engineering project.

The role of the Architect is evolving. It is no longer just about building the most robust warehouse; it is about how quickly you can retrieve items from it when the lights go out.

My recommendation to Technical Leaders: Maintain your rigorous standards for the write-path. But for the read-path, stop building static endpoints for dynamic problems. Implement an orchestration layer that allows your data to answer questions you haven't thought of yet.

The most resilient systems of the future will be those that blend the security of hard code with the adaptability of runtime logic.

Marco Patzelt, High-End Software Architect

Let's connect.

I am always open to exciting discussions about frontend architecture, performance, and modern web stacks.

Email me