Map and Set Behavior

When serializing Map and Set objects with @lumenize/structured-clone, it's important to understand how keys and values behave after deserialization, especially when using objects as Map keys or Set values.

TL;DR

  • Primitive keys/values (strings, numbers, booleans) work as expected
  • ⚠️ Objects as Map keys or Set values: behavior depends on serialization boundaries:
    • Identity preserved within a single stringify()/parse() or RPC call
    • Identity lost across separate storage operations or multiple calls
  • 🔍 Best practice: return objects and their Map/Set together in the same call/serialization

Primitive Keys - Work as Expected

Strings, Numbers, Booleans

Primitive values are compared by value, so they work exactly as you'd expect:

// Sender side
const map = new Map([
  ["user123", { name: "Alice" }],
  [42, { count: 100 }],
  [true, { active: true }]
]);

const serialized = stringify(map);

// Receiver side
const restored = parse(serialized);

expect(restored.get("user123")).toEqual({ name: "Alice" });
expect(restored.get(42)).toEqual({ count: 100 });
expect(restored.get(true)).toEqual({ active: true });

Why it works: JavaScript compares primitives by value, so "user123" === "user123" is always true.

Object Keys - Reconstructed but Different Identity

The Challenge

When you use an object as a Map key or Set value, JavaScript uses reference equality (===), not structural equality:

// Native JavaScript behavior
const key1 = { userId: 123 };
const key2 = { userId: 123 };

console.log(key1 === key2); // ❌ false - different objects!

const map = new Map();
map.set(key1, "value");
console.log(map.get(key1)); // ✅ "value"
console.log(map.get(key2)); // ❌ undefined - different reference!

After Serialization

The structured-clone package preserves this exact behavior. Object keys are fully reconstructed with all their properties, but they're new object instances:

// Sender side
const keyObj = { userId: 123, role: "admin" };
const map = new Map([[keyObj, "user data"]]);

const serialized = stringify(map);

// Receiver side
const restored = parse(serialized);

// The key is fully reconstructed...
const keys = Array.from(restored.keys());
expect(keys[0]).toEqual({ userId: 123, role: "admin" });

// All properties are preserved...
expect(keys[0].userId).toBe(123);
expect(keys[0].role).toBe("admin");

// BUT it's a new object with different identity...
const newKey = { userId: 123, role: "admin" };
expect(restored.get(newKey)).toBeUndefined();

// You must use the reconstructed key object:
expect(restored.get(keys[0])).toBe("user data");

Finding Object Keys After Deserialization

Since a newly constructed object won't match the original key, you need to search through the reconstructed keys to find the one you want:

Pattern 1: Search by Property Match

// Sender side
const user1Key = { userId: 123, type: "user" };
const user2Key = { userId: 456, type: "user" };
const adminKey = { userId: 789, type: "admin" };

const map = new Map([
  [user1Key, { name: "Alice", email: "alice@example.com" }],
  [user2Key, { name: "Bob", email: "bob@example.com" }],
  [adminKey, { name: "Admin", email: "admin@example.com" }]
]);

const serialized = stringify(map);

// Receiver side
const restored = parse(serialized);

// Find the key for userId 456
const targetKey = Array.from(restored.keys()).find(
  key => key.userId === 456
);

if (targetKey) {
  expect(restored.get(targetKey)).toEqual({
    name: "Bob",
    email: "bob@example.com"
  });
}

Pattern 2: Store Keys Separately

The most reliable pattern is to explicitly share the key objects in your data structure:

// Sender side
const keyObj = { userId: 123, role: "admin" };

const data = {
  map: new Map([[keyObj, "user data"]]),
  keyToLookup: keyObj // ✅ Share the key explicitly!
};

const serialized = stringify(data);

// Receiver side
const restored = parse(serialized);

// Now you can access the map using the shared key!
expect(restored.map.get(restored.keyToLookup)).toBe("user data");

// The key references are preserved:
expect(restored.keyToLookup === Array.from(restored.map.keys())[0]).toBe(true);

Set Behavior with Objects

The same principles apply to Set - primitive values work as expected, while object values follow the same identity rules as Map keys:

// Primitive values work fine
const set = new Set(["apple", "banana", 123]);
const restored = parse(stringify(set));
expect(restored.has("apple")).toBe(true);
// ...

// Object values: can't use new objects, must search or store reference
const obj = { id: 1, name: "Alice" };
const data = { set: new Set([obj]), aliceRef: obj };
const restored2 = parse(stringify(data));
expect(restored2.set.has(restored2.aliceRef)).toBe(true);

Aliases and Identity Preservation

If the same object appears multiple times in your data structure, it's reconstructed as the same object reference:

const sharedKey = { category: "users" };
const map1 = new Map([[sharedKey, "data1"]]);
const map2 = new Map([[sharedKey, "data2"]]);
const data = { map1, map2, theKey: sharedKey };

const restored = parse(stringify(data));

// All references point to the same reconstructed object!
expect(restored.map1.get(restored.theKey)).toBe("data1");
expect(restored.map2.get(restored.theKey)).toBe("data2");

const key1 = Array.from(restored.map1.keys())[0];
const key2 = Array.from(restored.map2.keys())[0];
expect(key1 === key2 && key1 === restored.theKey).toBe(true);

RPC and Serialization Boundaries

The same identity preservation behavior applies when using @lumenize/rpc or @lumenize/mesh. Within a single RPC call, object identity is preserved:

// Server-side DO
export class MyDO extends LumenizeDO {
  getData() {
    const keyObj = { userId: 123 };
    const map = new Map([[keyObj, "user data"]]);

    // Return both the map and the key in the same response
    return {
      map,
      key: keyObj // ✅ Identity preserved!
    };
  }
}

// Client-side
const stub = env.MY_DO.get(id);
const data = await stub.getData(); // Single RPC call

// The key reference works because it was returned together!
console.log(data.map.get(data.key)); // ✅ "user data"
console.log(data.key === Array.from(data.map.keys())[0]); // ✅ true

Understanding Serialization Boundaries

Identity preserved (single boundary):

  • ✅ Single RPC method call: await stub.method() returns object with map + key
  • ✅ Single stringify()/parse(): One serialization cycle
  • ✅ Native structuredClone(): One clone operation
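The last point is easy to verify yourself. The following minimal sketch uses Node's global structuredClone to show that identity survives within a single clone operation:

```javascript
// One clone operation: the key and the Map travel together,
// so the reconstructed key still works for lookups.
const keyObj = { userId: 123 };
const original = { map: new Map([[keyObj, "user data"]]), key: keyObj };

const cloned = structuredClone(original);

console.log(cloned.map.get(cloned.key));                      // "user data"
console.log(cloned.key === Array.from(cloned.map.keys())[0]); // true
```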

Identity lost (separate boundaries):

  • ❌ Separate RPC calls: Call 1 returns map, Call 2 returns key
  • ❌ DO Storage: put(map) then later get() and use original key
  • ❌ Separate stringify() calls: Serialize map and key independently

// ❌ Won't work: Separate RPC calls
const map = await stub.getMap(); // Call 1
const key = await stub.getKey(); // Call 2
console.log(map.get(key)); // ❌ undefined - different boundaries!

// ✅ Will work: Single RPC call
const data = await stub.getMapAndKey(); // One call returns both
console.log(data.map.get(data.key)); // ✅ Works!

Storage vs RPC

The warning symbols (⚠️) in the type support documentation reflect these different boundaries:

| System | Boundary | Object Identity |
| --- | --- | --- |
| @lumenize/structured-clone | Per stringify()/parse() | ✅ Preserved |
| @lumenize/rpc | Per RPC method call | ✅ Preserved |
| Workers RPC | Per RPC method call | ✅ Preserved |
| DO Storage | Per put()/get() operation | ❌ Lost |

This is why DO Storage shows ⚠️ for Map/Set - you can't use the original key after storing and retrieving. But RPC preserves identity within a call!
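To see why storage loses identity, here's a minimal simulation of the storage boundary. It uses native structuredClone as a stand-in for the serialization that each put()/get() performs; the put/get helpers here are hypothetical illustrations, not the real DO Storage API:

```javascript
// Simulated storage: every put() serializes and every get() deserializes,
// so each operation is its own serialization boundary.
const backing = new Map();
const put = (name, value) => backing.set(name, structuredClone(value));
const get = (name) => structuredClone(backing.get(name));

const keyObj = { userId: 123 };
put("users", new Map([[keyObj, "user data"]]));

const restored = get("users");

// The original key no longer matches — it was cloned across the boundary.
console.log(restored.get(keyObj)); // undefined

// Search the reconstructed keys instead (Pattern 1 above):
const found = Array.from(restored.keys()).find(k => k.userId === 123);
console.log(restored.get(found)); // "user data"
```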