AppContext API reference¶
AppContext provides dependency injection for feature plugins, giving controlled access to Canvas-Chat APIs without exposing the entire App instance.
Class: AppContext¶
Container for Canvas-Chat application APIs, passed to feature plugins via constructor.
Constructor¶
Parameters:
app - Main App instance
Usage:
// In App.initializePluginSystem()
const appContext = new AppContext(this);
this.featureRegistry.setAppContext(appContext);
Note: App developers create AppContext. Plugin developers receive it via their constructor.
Properties¶
graph¶
CRDT-backed graph data structure managing nodes and edges.
Node operations¶
Add node:
const node = createNode(NodeType.NOTE, 'Content', { position: { x: 0, y: 0 } });
context.graph.addNode(node);
Get node:
const node = context.graph.getNode(nodeId);
if (node) {
  console.log(node.content, node.type, node.position);
}
Update node:
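A sketch of an update; the `updateNode(nodeId, updates)` name and its merge semantics are assumptions, not confirmed API:

```javascript
// Hypothetical: updateNode(nodeId, updates) is assumed to merge the given fields
context.graph.updateNode(nodeId, { content: 'Updated content' });
```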
Delete node:
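Assuming a hypothetical `deleteNode(nodeId)` that removes the node from the CRDT graph:

```javascript
// Hypothetical: deleteNode is assumed to also clean up connected edges
context.graph.deleteNode(nodeId);
```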
Get all nodes:
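`getAllNodes()` also appears in the session example below:

```javascript
const nodes = context.graph.getAllNodes();
console.log('Node count:', nodes.length);
```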
Get leaf nodes:
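Assuming a hypothetical `getLeafNodes()`:

```javascript
// Hypothetical: getLeafNodes() is assumed to return nodes with no outgoing edges
const leaves = context.graph.getLeafNodes();
```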
Edge operations¶
Add edge:
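A sketch; the edge shape and `addEdge` signature are assumptions, not confirmed API:

```javascript
// Hypothetical: edge shape and addEdge signature are assumed
const edge = { id: crypto.randomUUID(), from: parentId, to: childId };
context.graph.addEdge(edge);
```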
Get edge:
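Assuming a hypothetical `getEdge(edgeId)` mirroring `getNode`:

```javascript
const edge = context.graph.getEdge(edgeId); // hypothetical name
```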
Delete edge:
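Assuming a hypothetical `deleteEdge(edgeId)`:

```javascript
context.graph.deleteEdge(edgeId); // hypothetical name
```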
Get node edges:
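Assuming a hypothetical `getNodeEdges(nodeId)` returning every edge that touches the node:

```javascript
const edges = context.graph.getNodeEdges(nodeId); // hypothetical name
```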
Graph traversal¶
Resolve conversation context:
// Get message history for LLM
const contextNodes = context.graph.resolveContext([nodeId]);
// Returns: Array of nodes in conversation order
Get visible subtree:
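Assuming a hypothetical `getVisibleSubtree(nodeId)` that returns the node and its currently visible descendants:

```javascript
const subtree = context.graph.getVisibleSubtree(nodeId); // hypothetical name
```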
Auto-positioning¶
Calculate position for new node:
const parentIds = context.canvas.getSelectedNodeIds();
const position = context.graph.autoPosition(parentIds);
// Returns: { x: number, y: number }
const newNode = createNode(NodeType.AI, 'Content', { position });
canvas¶
Visual canvas for rendering nodes and managing viewport.
Node rendering¶
Render node:
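`renderNode(node)` matches its use in the plugin example below:

```javascript
const node = context.graph.getNode(nodeId);
if (node) {
  context.canvas.renderNode(node);
}
```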
Remove node:
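Assuming a hypothetical `removeNode(nodeId)` that removes only the visual element (the graph entry itself is managed by `graph`):

```javascript
context.canvas.removeNode(nodeId); // hypothetical name
```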
Update node content:
// While streaming
context.canvas.updateNodeContent(nodeId, partialContent, true);
// After completion
context.canvas.updateNodeContent(nodeId, finalContent, false);
Selection¶
Get selected nodes:
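`getSelectedNodeIds()` matches its use in the auto-positioning example above:

```javascript
const selectedIds = context.canvas.getSelectedNodeIds();
// Returns: Array of node IDs
```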
Clear selection:
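Assuming a hypothetical `clearSelection()`:

```javascript
context.canvas.clearSelection(); // hypothetical name
```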
Viewport control¶
Center on coordinates:
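Assuming a hypothetical `centerOn(x, y)` taking canvas coordinates:

```javascript
context.canvas.centerOn(400, 300); // hypothetical signature
```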
Pan to node:
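Assuming a hypothetical `panToNode(nodeId)`:

```javascript
context.canvas.panToNode(nodeId); // hypothetical name
```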
Streaming controls¶
Show/hide stop button:
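Method names here are assumptions:

```javascript
// Hypothetical names
context.canvas.showStopButton(nodeId);
context.canvas.hideStopButton(nodeId);
```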
Show/hide continue button:
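Likewise assumed:

```javascript
// Hypothetical names
context.canvas.showContinueButton(nodeId);
context.canvas.hideContinueButton(nodeId);
```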
Edge rendering¶
Render edge:
const fromNode = context.graph.getNode(fromId);
const toNode = context.graph.getNode(toId);
if (fromNode && toNode) {
  context.canvas.renderEdge(edge, fromNode.position, toNode.position);
}
Remove edge:
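Assuming a hypothetical `removeEdge(edgeId)`:

```javascript
context.canvas.removeEdge(edgeId); // hypothetical name
```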
chat¶
LLM communication interface with streaming support.
API credentials¶
Get API key for model:
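Assuming a hypothetical `getApiKeyForModel(model)` mirroring `getBaseUrlForModel` below:

```javascript
const apiKey = context.chat.getApiKeyForModel(model); // hypothetical name
```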
Get base URL for model:
const baseUrl = context.chat.getBaseUrlForModel(model);
// Returns custom base URL or null (use default)
Send message (streaming)¶
const messages = [
  { role: 'system', content: 'You are a helpful assistant' },
  { role: 'user', content: 'Hello!' },
];
const model = context.modelPicker.value;
const abortController = new AbortController();
let finalContent = '';

await context.chat.sendMessage(
  messages,
  model,
  // onChunk: (chunk, fullContent) => void
  (chunk, fullContent) => {
    finalContent = fullContent;
    context.canvas.updateNodeContent(nodeId, fullContent, true);
  },
  // onDone: () => void
  () => {
    context.canvas.updateNodeContent(nodeId, finalContent, false);
    console.log('Streaming complete');
  },
  // onError: (error) => void
  (error) => {
    console.error('Stream error:', error);
    context.showToast?.(`Error: ${error.message}`, 'error');
  },
  // signal: AbortSignal (optional)
  abortController.signal
);

// To abort:
// abortController.abort();
Summarize (non-streaming)¶
const messages = [{ role: 'user', content: 'Long text to summarize...' }];
const model = context.modelPicker.value;
const summary = await context.chat.summarize(messages, model);
console.log('Summary:', summary);
Token estimation¶
const text = 'Some text to analyze';
const model = context.modelPicker.value;
const tokens = context.chat.estimateTokens(text, model);
console.log('Estimated tokens:', tokens);
storage¶
LocalStorage wrapper for data persistence.
Key-value storage¶
Set item:
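`setItem` matches its use in the plugin example below; serialize values to strings:

```javascript
context.storage.setItem('my-key', JSON.stringify({ value: 42 }));
```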
Get item:
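`getItem` matches its use in the plugin example below; it returns a string (or null), so parse as needed:

```javascript
const raw = context.storage.getItem('my-key');
const data = raw ? JSON.parse(raw) : null;
```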
Remove item:
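Assuming a hypothetical `removeItem(key)` mirroring the localStorage API:

```javascript
context.storage.removeItem('my-key'); // hypothetical name
```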
Session management¶
Save session:
const session = {
  id: 'session-123',
  nodes: context.graph.getAllNodes(),
  // ... other session data
};
context.storage.saveSession(session);
Get session:
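Assuming a hypothetical `getSession(sessionId)` mirroring `saveSession` above:

```javascript
const session = context.storage.getSession('session-123'); // hypothetical name
```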
API keys (read-only)¶
Get all API keys:
const apiKeys = context.storage.getApiKeys();
// Returns: Object<provider, apiKey>
// Example: { 'openai': 'sk-...', 'anthropic': 'sk-ant-...' }
Get Exa API key:
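Assuming a hypothetical `getExaApiKey()`:

```javascript
const exaKey = context.storage.getExaApiKey(); // hypothetical name
```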
modelPicker¶
UI dropdown for model selection.
Get current model:
const model = context.modelPicker.value;
console.log('Selected model:', model); // e.g., 'openai/gpt-4'
Listen for changes:
context.modelPicker.addEventListener('change', (e) => {
  console.log('Model changed to:', e.target.value);
});
featureRegistry¶
Plugin registry for accessing other plugins and emitting events.
Get other plugins¶
const otherPlugin = context.featureRegistry.getFeature('other-plugin-id');
if (otherPlugin) {
  otherPlugin.someMethod();
}
Emit events¶
import { CanvasEvent } from '/static/js/plugin-events.js';
const event = new CanvasEvent('my-custom-event', {
  data: 'some value',
});
context.featureRegistry.emit('my-custom-event', event);
Get registered commands¶
const commands = context.featureRegistry.getSlashCommands();
console.log('Available commands:', commands);
// Example: ['/committee', '/matrix', '/factcheck', ...]
app¶
Main application instance. Use sparingly - prefer specific APIs above.
When to use¶
Only use app for methods not available via other APIs:
Save session:
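A sketch; the method name on the App instance is an assumption:

```javascript
context.app.saveCurrentSession(); // hypothetical name
```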
Update UI state:
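Again, the method name is an assumption:

```javascript
context.app.updateUI(); // hypothetical name
```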
Access modal manager:
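Property and method names here are assumptions:

```javascript
context.app.modalManager?.open('settings'); // hypothetical names
```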
Additional context properties¶
streamingNodes¶
Tracks active streaming operations.
Usage:
// Store streaming state
const abortController = new AbortController();
context.streamingNodes.set(nodeId, {
  abortController,
  model,
  messages,
});

// Later, abort streaming
const state = context.streamingNodes.get(nodeId);
if (state) {
  state.abortController.abort();
  context.streamingNodes.delete(nodeId);
}
pyodideRunner¶
Python code execution engine (for code features).
Usage:
const result = await context.pyodideRunner.runCode(code, nodeId);
console.log('Output:', result.stdout);
console.log('HTML:', result.html);
console.log('Error:', result.error);
apiUrl¶
Base URL for backend API (for proxy/admin mode).
Usage:
const response = await fetch(`${context.apiUrl}/api/some-endpoint`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ data: 'value' }),
});
Helper functions¶
showToast()¶
Display toast notification.
Parameters:
message - Text to display
type - One of 'info', 'success', 'warning', 'error'
Usage:
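`showToast` matches its use in the streaming examples above:

```javascript
context.showToast?.('Settings saved', 'success');
```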
Note: Use optional chaining (?.) as it may be undefined in tests.
Design rationale¶
Why AppContext?¶
Problem: Feature plugins need access to app functionality, but:
- Exposing the entire App instance creates tight coupling
- Plugins are hard to test in isolation
- APIs are difficult to version and evolve

Solution: AppContext provides:

✅ Controlled API surface: only the APIs plugins need are exposed
✅ Testability: AppContext can be mocked for isolated testing
✅ Versioning: APIs can be added or deprecated without breaking App internals
✅ Documentation: a clear contract between the app and plugins
What NOT to put in AppContext¶
- Internal app state (e.g., this.selectedNodes)
- UI component references (except essential ones like modelPicker)
- Methods that modify app structure (use events instead)
What TO put in AppContext¶
- Read-only state (e.g., streamingNodes for coordination)
- APIs with clear contracts (e.g., graph, canvas, chat)
- Helper functions used by multiple plugins (e.g., showToast)
Example usage¶
Complete feature plugin using AppContext¶
import { FeaturePlugin } from '/static/js/feature-plugin.js';
import { NodeType, createNode } from '/static/js/graph-types.js';
class MyFeature extends FeaturePlugin {
  constructor(context) {
    super(context);

    // Store context APIs
    this.graph = context.graph;
    this.canvas = context.canvas;
    this.chat = context.chat;
    this.storage = context.storage;
    this.modelPicker = context.modelPicker;

    // Initialize feature state
    this.operations = new Map();
  }

  async onLoad() {
    console.log('[MyFeature] Loaded');

    // Load saved state
    const saved = this.storage.getItem('my-feature-state');
    if (saved) {
      this.state = JSON.parse(saved);
    }
  }

  async handleMyCommand(command, args, context) {
    // Get selected nodes
    const selectedIds = this.canvas.getSelectedNodeIds();

    // Get model
    const model = this.modelPicker.value;

    // Create new node
    const position = this.graph.autoPosition(selectedIds);
    const node = createNode(NodeType.AI, '', { position, model });

    // Add to graph
    this.graph.addNode(node);
    this.canvas.renderNode(node);

    // Stream LLM response, tracking the accumulated content for onDone
    const abortController = new AbortController();
    this.operations.set(node.id, { abortController });
    let finalContent = '';

    await this.chat.sendMessage(
      [{ role: 'user', content: args }],
      model,
      (chunk, fullContent) => {
        finalContent = fullContent;
        this.canvas.updateNodeContent(node.id, fullContent, true);
      },
      () => {
        this.canvas.updateNodeContent(node.id, finalContent, false);
        this.operations.delete(node.id);
        this.showToast?.('Complete', 'success');
      },
      (error) => {
        this.showToast?.(`Error: ${error.message}`, 'error');
        this.operations.delete(node.id);
      },
      abortController.signal
    );
  }

  async onUnload() {
    // Abort all in-flight operations
    for (const op of this.operations.values()) {
      op.abortController.abort();
    }
    this.operations.clear();

    // Save state
    this.storage.setItem('my-feature-state', JSON.stringify(this.state));

    console.log('[MyFeature] Unloaded');
  }
}
export { MyFeature };