Agents, Workers – @cloudflare/codemode v0.1.0: a new runtime agnostic modular architecture

The @cloudflare/codemode package has been rewritten into a modular, runtime-agnostic SDK.

Code Mode enables LLMs to write and execute TypeScript that orchestrates your tools, instead of calling them one at a time. This can (and does) yield significant token savings, reduce context-window pressure, and improve overall model performance on a task.
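For instance, rather than making several tool calls, each a full model round trip, the model can emit one script that chains the results itself. The tool names and the way tools are surfaced to the generated code below are illustrative assumptions, not the package's actual API:

    // Hypothetical model-generated script; getUser and listOrders are
    // assumed tool functions, not part of @cloudflare/codemode itself.
    const user = await getUser({ id: "42" });
    const orders = await listOrders({ userId: user.id });
    const total = orders.reduce((sum, order) => sum + order.amount, 0);
    console.log(`${user.name} has ${orders.length} orders totalling ${total}`);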

The new Executor interface is runtime-agnostic, and the package ships a prebuilt DynamicWorkerExecutor that runs generated code in a Dynamic Worker Loader.

Breaking changes

  • Removed experimental_codemode() and CodeModeProxy — the package no longer owns an LLM call or model choice
  • New import path: createCodeTool() is now exported from @cloudflare/codemode/ai
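If your code previously imported createCodeTool from the package root (the old path below is an assumption based on the note above), only the import specifier changes:

    // Before (pre-v0.1.0) -- assumed old import path:
    // import { createCodeTool } from "@cloudflare/codemode";

    // After (v0.1.0):
    import { createCodeTool } from "@cloudflare/codemode/ai";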

New features

  • createCodeTool() — Returns a standard AI SDK Tool to use in your AI agents.
  • Executor interface — Minimal execute(code, fns) contract. Implement for any code sandboxing primitive or runtime.
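As a rough illustration, a custom executor only needs to implement that one method. The sketch below assumes the Executor and ExecuteResult types are exported from the package root and that ExecuteResult carries a result alongside the logs described under DynamicWorkerExecutor below; the exact type shapes may differ from the published typings, and evaluating untrusted code in-process like this skips the sandboxing you would want in production:

    import type { Executor, ExecuteResult } from "@cloudflare/codemode";

    // A minimal, illustrative Executor: it evaluates the generated code
    // in-process with the tool functions and a console shim in scope.
    // Unsafe for untrusted code -- use a proper sandbox in practice.
    class InProcessExecutor implements Executor {
      async execute(
        code: string,
        fns: Record<string, (...args: unknown[]) => Promise<unknown>>,
      ): Promise<ExecuteResult> {
        const logs: string[] = [];
        const capture = (...args: unknown[]) => logs.push(args.map(String).join(" "));
        const consoleShim = { log: capture, warn: capture, error: capture };

        // Build an async function whose body is the generated code.
        const AsyncFunction = Object.getPrototypeOf(async function () {}).constructor;
        const run = new AsyncFunction("fns", "console", code);

        const result = await run(fns, consoleShim);
        return { result, logs } as ExecuteResult;
      }
    }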

DynamicWorkerExecutor

Runs code in a Dynamic Worker. It comes with the following features:

  • Network isolation — fetch() and connect() are blocked by default (globalOutbound: null)
  • Console capture — console.log/warn/error output is captured and returned in ExecuteResult.logs
  • Execution timeout — Configurable via timeout option (default 30s)
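For example, a longer execution window can be requested when constructing the executor. The value below assumes the timeout option is expressed in milliseconds; check the published typings for the exact unit:

    import { DynamicWorkerExecutor } from "@cloudflare/codemode";

    // Raise the execution timeout from the 30s default.
    // Assumes `timeout` is in milliseconds; verify against the typings.
    const executor = new DynamicWorkerExecutor({
      loader: env.LOADER,
      timeout: 60_000,
    });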

Usage

    import { createCodeTool } from "@cloudflare/codemode/ai";
    import { DynamicWorkerExecutor } from "@cloudflare/codemode";
    import { streamText } from "ai";

    const executor = new DynamicWorkerExecutor({ loader: env.LOADER });
    const codemode = createCodeTool({ tools: myTools, executor });

    const result = streamText({
      model,
      tools: { codemode },
      messages,
    });

Wrangler configuration

  • wrangler.jsonc

    {
      "worker_loaders": [{ "binding": "LOADER" }]
    }
  • wrangler.toml

    [[worker_loaders]]
    binding = "LOADER"

See the Code Mode documentation for the full API reference and examples.

Upgrade

npm i @cloudflare/codemode@latest

Source: Cloudflare


