Quick Start
📘 Doc-testing – Why do these examples look like tests?
This documentation uses testable code examples to ensure accuracy and reliability:
- Guaranteed accuracy: All examples are real, working code that runs against the actual library
- Always up-to-date: When the library changes, the tests fail and the docs must be updated
- Copy-paste confidence: What you see is what works - no outdated or broken examples
- Real-world patterns: Tests show complete, runnable scenarios, not just snippets
Ignore the test boilerplate (`it()`, `describe()`, etc.) - focus on the code inside.
Here's what minimal use of Lumenize RPC looks like.
test/quick-start.test.ts

```ts
import { it, expect } from 'vitest';
// @ts-expect-error - cloudflare:test module types are not consistently exported
import { SELF } from 'cloudflare:test';
import {
  createRpcClient,
  RpcAccessible,
  getWebSocketShim
} from '@lumenize/rpc';
import { Counter } from '../src/index';

type Counter = RpcAccessible<InstanceType<typeof Counter>>;

it('shows basic usage of Lumenize RPC', async () => {
  await using client = createRpcClient<Counter>(
    'COUNTER', // or 'counter' if you want pretty URLs
    'test-counter',
    // Since we're doc-testing in a vitest-pool-workers env, we need to provide
    // this WebSocketClass, but you wouldn't in production
    { WebSocketClass: getWebSocketShim(SELF.fetch.bind(SELF)) }
  );

  // Test increment
  const result1 = await client.increment();
  expect(result1).toBe(1);

  // Test again
  const result2 = await client.increment();
  expect(result2).toBe(2);

  // Verify value in storage
  const value = await client.ctx.storage.kv.get('count'); // await always required
  expect(value).toBe(2);
});
```
To run the example above, put it in `test/quick-start.test.ts` and perform the following setup.
Installation

First, let's install some tools:

```sh
npm install --save-dev vitest@3.2
npm install --save-dev @vitest/coverage-istanbul@3.2
npm install --save-dev @cloudflare/vitest-pool-workers
npm install --save-dev @lumenize/rpc
npm install --save-dev @lumenize/utils
```
src/index.ts

Next, add this Worker and Durable Object:

```ts
import { lumenizeRpcDO } from '@lumenize/rpc';
import { routeDORequest } from '@lumenize/utils';
import { DurableObject } from 'cloudflare:workers';

class _Counter extends DurableObject {
  increment() {
    let count: number = this.ctx.storage.kv.get('count') ?? 0;
    count++;
    this.ctx.storage.kv.put('count', count);
    return count;
  }
}

// Wrap with RPC support
export const Counter = lumenizeRpcDO(_Counter);

// Export a default worker to route RPC requests
export default {
  async fetch(request: Request, env: any): Promise<Response> {
    // Route RPC requests to the Durable Object. Works for https:// or wss://
    // See: https://lumenize.com/docs/utils/route-do-request
    const response = await routeDORequest(request, env, { prefix: '__rpc' });
    if (response) return response;

    // Fallback for non-RPC requests
    return new Response('Not Found', { status: 404 });
  },
};
```
wrangler.jsonc

Your wrangler config should look something like this:

```jsonc
{
  "name": "rpc-counter",
  "main": "src/index.ts",
  "compatibility_date": "2025-09-12",
  "durable_objects": {
    "bindings": [
      {
        "name": "COUNTER",
        "class_name": "Counter"
      }
    ]
  },
  "migrations": [
    {
      "tag": "v1",
      "new_sqlite_classes": ["Counter"]
    }
  ]
}
```
vitest.config.js

Then add to your vite config, if applicable, or create a vitest config that looks something like this:

```js
import { defineWorkersProject } from "@cloudflare/vitest-pool-workers/config";

export default defineWorkersProject({
  test: {
    testTimeout: 2000, // 2 second global timeout
    poolOptions: {
      workers: {
        // Must be false to use WebSockets. Have each test
        // reference a different DO instance to avoid state sharing.
        isolatedStorage: false,
        // Important! Point this at your wrangler.jsonc
        wrangler: { configPath: "./wrangler.jsonc" },
      },
    },
    // Use `vitest --run --coverage` to get test coverage report(s)
    coverage: {
      provider: "istanbul", // Cannot use V8
      reporter: ['text', 'json', 'html'],
      include: ['**/src/**'],
      exclude: [
        '**/node_modules/**',
        '**/dist/**',
        '**/build/**',
        '**/*.config.ts',
        '**/scratch/**'
      ],
    },
  },
});
```
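Since `isolatedStorage` is false, Durable Object state persists across tests within a run. One simple way to keep tests independent is to give each test its own DO instance name, for example by appending a random suffix. This helper is illustrative, not part of the library:

```typescript
import { randomUUID } from 'node:crypto';

// Build a unique Durable Object instance name per test so no state is
// shared. Pass the result as the second argument to createRpcClient
// in place of the fixed 'test-counter' name.
function uniqueInstanceName(prefix: string): string {
  return `${prefix}-${randomUUID()}`;
}
```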
Try it out

To run it with vitest:

```sh
vitest --run
```

You can even see how much of the code is covered by this "test":

```sh
vitest --run --coverage
```