# Technology Stack
WakeIQX is built on Cloudflare's edge computing platform with modern TypeScript tooling. Every technology choice prioritizes performance, developer experience, and semantic clarity.
## Platform: Cloudflare Workers
**Why Cloudflare Workers?**
WakeIQX runs on Cloudflare Workers, a serverless platform that executes code at the edge (in datacenters close to users).
**Benefits:**

- **Ultra-low latency**: Code runs in 300+ locations worldwide
- **Global distribution**: Automatic edge deployment
- **Cost-effective**: Pay per request, no idle costs
- **Secure**: Isolated V8 sandboxes
- **No cold starts**: Instant execution
**Trade-offs:**

- ⚠️ CPU time limits (50ms on free tier)
- ⚠️ Memory limits (128MB per request)
- ⚠️ No filesystem access (must use KV/D1/R2)
**Deployment:**

```shell
npm run deploy
```

## Runtime: Workers Runtime (V8)
Cloudflare Workers use the V8 JavaScript engine (same as Chrome/Node.js) with Web Standards APIs:
- ✅ `fetch()` - HTTP requests
- ✅ `Request`/`Response` - Web API objects
- ✅ `crypto` - Web Crypto API
- ✅ `TextEncoder`/`TextDecoder` - String encoding
- ❌ No Node.js APIs (`fs`, `path`, `http`, etc.)
**Example:**

```typescript
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Standard Web APIs
    const url = new URL(request.url);
    const body = await request.json();
    return new Response(JSON.stringify({ ok: true }), {
      headers: { 'Content-Type': 'application/json' }
    });
  }
};
```

## Language: TypeScript 5.8
**Why TypeScript?**
- **Type safety**: Catch bugs at compile time
- **Self-documenting**: Types serve as inline documentation
- **Testability**: Easier to mock and test
- **Refactoring**: Confident code changes
- **Semantic intent**: Types carry meaning
**Configuration (`tsconfig.json`):**

```json
{
  "compilerOptions": {
    "target": "ES2021",
    "module": "ESNext",
    "lib": ["ES2021"],
    "types": ["@cloudflare/workers-types"],
    "strict": true,
    "noEmit": true
  }
}
```

**Strict mode enabled:**

- `strictNullChecks`: Prevent null/undefined errors
- `strictFunctionTypes`: Type-safe function parameters
- `noImplicitAny`: No implicit `any` types
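As a quick illustration of what `strictNullChecks` buys, here is a minimal sketch (the `Snapshot` type and data are hypothetical, not the project's actual model):

```typescript
// Hypothetical example: strictNullChecks forces handling of undefined.
interface Snapshot {
  id: string;
  project: string;
}

const snapshots: Snapshot[] = [
  { id: 'a1', project: 'wake' },
  { id: 'b2', project: 'iqx' },
];

function findProject(id: string): string {
  // Array.prototype.find returns Snapshot | undefined under strictNullChecks,
  // so the compiler rejects `match.project` without this guard.
  const match = snapshots.find((s) => s.id === id);
  if (match === undefined) {
    throw new Error(`No snapshot with id ${id}`);
  }
  return match.project;
}
```

Without `"strict": true`, the missing-value case would only surface at runtime.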
## Database: Cloudflare D1
Cloudflare D1 is SQLite running at the edge.
**Features:**

- **SQL**: Standard SQL syntax (SQLite dialect)
- **ACID transactions**: Data integrity
- **Relational**: Foreign keys, joins, indexes
- **Global replication**: Automatic read replicas
- **Persistent**: Durable storage (not ephemeral like KV)
**Why D1 over other databases?**
- ✅ Co-located with Workers (same edge location = low latency)
- ✅ No connection pooling needed (Workers handle it)
- ✅ Standard SQL (portable to Postgres/MySQL later)
- ✅ Free tier: 100k rows, 25MB per database
**Example query:**

```typescript
const result = await env.DB
  .prepare('SELECT * FROM context_snapshots WHERE project = ? ORDER BY timestamp DESC LIMIT ?')
  .bind(project, limit)
  .all();
const snapshots = result.results;
```

**Connection:**

```typescript
interface Env {
  DB: D1Database; // Automatically injected by Workers
}
```

## AI: Cloudflare Workers AI
Workers AI provides serverless access to LLMs at the edge.
**Model:** `@cf/meta/llama-3.1-8b-instruct`

- **8 billion parameters**: Fast and capable
- **Instruct-tuned**: Follows instructions well
- **Edge inference**: Runs in the Workers environment
- **Free tier**: 10,000 neurons/day
**Usage in WakeIQX:**

```typescript
const response = await env.AI.run(
  '@cf/meta/llama-3.1-8b-instruct',
  {
    messages: [
      { role: 'system', content: 'You are a helpful assistant...' },
      { role: 'user', content: 'Summarize this conversation...' }
    ]
  }
);
const summary = response.response;
```

**What we use AI for:**

- Generate context summaries (compress conversation)
- Extract semantic tags (for search)
- Generate rationales (explain WHY context was saved)
## Protocol: Model Context Protocol (MCP)
MCP is an open protocol for connecting AI agents to external data sources.
**MCP SDK:** `@modelcontextprotocol/sdk` v1.19.1
**Key concepts:**

- **Tools**: Functions AI agents can call
- **Resources**: Data AI agents can access
- **Prompts**: Pre-defined conversation starters
**WakeIQX exposes 12 MCP tools:**

- 3 core tools (save, load, search)
- 3 Layer 1 tools (causality tracking)
- 3 Layer 2 tools (memory management)
- 3 Layer 3 tools (prediction scoring)
**Example tool definition:**

```typescript
{
  name: "save_context",
  description: "Save conversation context with AI enhancement",
  inputSchema: {
    type: "object",
    properties: {
      project: { type: "string", description: "Project identifier" },
      content: { type: "string", description: "Context content to save" }
    },
    required: ["project", "content"]
  }
}
```

**Transport:** HTTP with Server-Sent Events (SSE)
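To make the transport concrete, here is a sketch of the JSON-RPC 2.0 envelope a client would POST to call a tool such as `save_context`. The helper function and field values are illustrative, not part of the MCP SDK:

```typescript
// Illustrative sketch of an MCP tools/call request over the HTTP transport.
interface JsonRpcRequest {
  jsonrpc: '2.0';
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

// Hypothetical helper: builds the JSON-RPC envelope for a tool invocation.
function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = 1,
): JsonRpcRequest {
  return {
    jsonrpc: '2.0',
    id,
    method: 'tools/call',
    params: { name, arguments: args },
  };
}

const request = buildToolCall('save_context', {
  project: 'demo-project',
  content: 'Decided to use D1 for persistence.',
});
```

The serialized `request` would be sent as the POST body to the `/mcp` endpoint, with tool results streamed back via SSE.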
## Development Tools

### Build Tool: Wrangler
Wrangler is Cloudflare's CLI for Workers development.
Key commands:
npm run dev # Start local dev server
npm run deploy # Deploy to Cloudflare
npm run cf-typegen # Generate TypeScript typesConfiguration (wrangler.toml):
name = "semantic-wake-intelligence-mcp"
main = "src/index.ts"
compatibility_date = "2024-10-01"
[[d1_databases]]
binding = "DB"
database_name = "wake-intelligence"
database_id = "..."
[ai]
binding = "AI"Code Quality: Biome โ
Biome replaces ESLint + Prettier with a single fast tool.
**Features:**

- **Formatting**: Code style (like Prettier)
- **Linting**: Code quality (like ESLint)
- **Fast**: Written in Rust (100x faster)
- **Auto-fix**: Automatically fixes issues
**Commands:**

```shell
npm run format   # Format code
npm run lint:fix # Fix linting issues
```

**Configuration (`biome.json`):**

```json
{
  "formatter": {
    "enabled": true,
    "indentStyle": "space",
    "indentWidth": 2
  },
  "linter": {
    "enabled": true,
    "rules": {
      "recommended": true
    }
  }
}
```

### Testing: Vitest
Vitest is a fast, modern test framework.
**Features:**

- **Fast**: Native ESM, parallel execution
- **Watch mode**: Auto-rerun on changes
- **Coverage**: Built-in code coverage
- **UI**: Visual test runner
**Commands:**

```shell
npm test              # Run all tests
npm run test:watch    # Watch mode
npm run test:ui       # Visual UI
npm run test:coverage # Coverage report
```

**Example test:**
```typescript
import { describe, it, expect } from 'vitest';
import { CausalityService } from './CausalityService';

describe('CausalityService', () => {
  it('builds causal chain correctly', async () => {
    const mockRepo = createMockRepository();
    const service = new CausalityService(mockRepo);
    const chain = await service.buildCausalChain('snapshot-id');
    expect(chain).toHaveLength(3);
    expect(chain[0].snapshot.id).toBe('root-id');
  });
});
```

### Type Generation: wrangler types
Automatically generate TypeScript types for Workers bindings:

```shell
npm run cf-typegen
```

**Generated (`worker-configuration.d.ts`):**

```typescript
interface Env {
  DB: D1Database;
  AI: Ai;
}
```

## Dependencies
### Production Dependencies
| Package | Version | Purpose |
|---|---|---|
| `@modelcontextprotocol/sdk` | 1.19.1 | MCP protocol implementation |
| `@cloudflare/agents` | 0.0.16 | Cloudflare agents framework |
| `zod` | 3.25.76 | Runtime type validation |
**Why Zod?**

- Validates API inputs at runtime
- TypeScript types are inferred from schemas
- User-friendly error messages
**Example:**

```typescript
import { z } from 'zod';

const SaveContextInput = z.object({
  project: z.string().min(1),
  content: z.string().min(1),
  source: z.string().optional(),
  metadata: z.record(z.unknown()).optional()
});

type SaveContextInput = z.infer<typeof SaveContextInput>;

// Validate at runtime
const input = SaveContextInput.parse(data);
```

### Development Dependencies
| Package | Version | Purpose |
|---|---|---|
| `typescript` | 5.8.3 | TypeScript compiler |
| `wrangler` | 4.42.1 | Cloudflare CLI |
| `vitest` | 3.2.4 | Test framework |
| `@biomejs/biome` | 2.2.5 | Linting & formatting |
| `@cloudflare/vitest-pool-workers` | 0.9.11 | Test Workers locally |
| `@vitest/coverage-v8` | 3.2.4 | Code coverage |
### Package Manager: npm
WakeIQX uses npm (Node Package Manager):
```shell
npm install    # Install dependencies
npm run dev    # Start dev server
npm run deploy # Deploy to production
npm test       # Run tests
```

**Why npm over yarn/pnpm?**

- ✅ Default for most projects
- ✅ Works everywhere
- ✅ No extra installation needed
### Version Control: Git + GitHub

**Repository:** `semanticintent/semantic-wake-intelligence-mcp`
**Branching strategy:**

- `main`: Production branch
- Feature branches: `feature/layer-3-propagation`
- Release tags: `v3.0.0`
**Commit conventions:**

```
feat: Add Layer 3 Propagation Engine
fix: Correct memory tier calculation
docs: Update architecture documentation
test: Add causality chain tests
```

## Architecture Patterns
### Hexagonal Architecture (Ports & Adapters)

See Hexagonal Architecture.
**Layers:**

- Domain (core business logic)
- Application (orchestration)
- Infrastructure (D1, AI, HTTP)
- Presentation (HTTP endpoints)
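The layering can be sketched as a port (interface) owned by the domain/application side, with an adapter supplied by infrastructure. All names below are illustrative, not the project's actual classes:

```typescript
// Port: the domain/application side depends only on this interface.
interface ContextRepository {
  save(project: string, content: string): Promise<string>;
}

// Adapter: an in-memory stand-in for the real D1-backed repository.
class InMemoryContextRepository implements ContextRepository {
  private rows: Array<{ id: string; project: string; content: string }> = [];

  async save(project: string, content: string): Promise<string> {
    const id = `ctx-${this.rows.length + 1}`;
    this.rows.push({ id, project, content });
    return id;
  }
}

// Application-layer service: orchestration logic, no infrastructure imports.
class SaveContextService {
  constructor(private readonly repo: ContextRepository) {}

  async execute(project: string, content: string): Promise<string> {
    if (project.length === 0 || content.length === 0) {
      throw new Error('project and content are required');
    }
    return this.repo.save(project, content);
  }
}
```

Because `SaveContextService` only sees the port, swapping the D1 adapter for a Postgres one (as in the migration section below) leaves the service untouched.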
### Semantic Intent Pattern

See Semantic Intent Pattern.
Every file/function includes:

```typescript
/**
 * 🎯 SEMANTIC INTENT: [What this means]
 *
 * PURPOSE: [Why it exists]
 *
 * RESPONSIBILITY: [What it does]
 */
```

## Performance Optimizations
### 1. Edge Execution
Code runs in 300+ datacenters worldwide:

- User in Tokyo → runs in Tokyo
- User in London → runs in London
- Result: 10-50ms latency globally
### 2. Database Indexes
All common queries have indexes:

```sql
CREATE INDEX idx_project_timestamp ON context_snapshots(project, timestamp);
```

**Result:** 10x faster queries
### 3. Batch Operations

Memory tier recalculation processes 100 contexts per batch:

```typescript
const batch = contexts.slice(0, 100);
await Promise.all(batch.map(ctx => updateTier(ctx)));
```

**Result:** Parallel processing
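The batching pattern generalizes to processing an arbitrary list in fixed-size parallel chunks. This is a self-contained sketch with illustrative names, not the project's actual helper:

```typescript
// Processes items in sequential batches; items within a batch run in parallel.
async function processInBatches<T>(
  items: T[],
  batchSize: number,
  handler: (item: T) => Promise<void>,
): Promise<number> {
  let processed = 0;
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Promise.all fans the batch out concurrently, bounding parallelism
    // to batchSize so a large backlog never overwhelms the Worker.
    await Promise.all(batch.map(handler));
    processed += batch.length;
  }
  return processed;
}
```

Bounding the batch size matters on Workers, where CPU time and subrequest limits make unbounded `Promise.all` over thousands of items risky.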
### 4. Prediction Caching

Predictions refresh only when stale (older than 24 hours):

```typescript
if (isPredictionStale(context.last_predicted, 24)) {
  await recalculatePrediction(context);
}
```

**Result:** Avoids unnecessary AI calls
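One plausible shape for the staleness check is sketched below; the signature and behavior are assumptions, and the real helper may differ:

```typescript
// Returns true when a prediction is missing or older than maxAgeHours.
function isPredictionStale(
  lastPredicted: string | null,
  maxAgeHours: number,
): boolean {
  if (lastPredicted === null) {
    return true; // never predicted: always treat as stale
  }
  const ageMs = Date.now() - new Date(lastPredicted).getTime();
  return ageMs > maxAgeHours * 60 * 60 * 1000;
}
```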
## Monitoring & Observability

### Cloudflare Analytics

**Built-in metrics:**
- Request count
- Error rate
- Response time (p50, p95, p99)
- CPU usage
### Logging

Standard `console.log`:

```typescript
console.log('Context saved:', { id, project });
console.error('Failed to save:', error);
```

**Logs visible in:**

- `wrangler tail` (live logs)
- Cloudflare dashboard
- Real-time streaming
## Security

### 1. CORS Middleware

Controlled cross-origin access:

```typescript
headers: {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type'
}
```

### 2. Input Validation
Zod schemas validate all inputs:

```typescript
const input = SaveContextInput.parse(data);
// Throws if invalid
```

### 3. SQL Injection Prevention
Prepared statements with parameter binding:

```typescript
// ✅ Safe
db.prepare('SELECT * FROM contexts WHERE id = ?').bind(id);

// ❌ Vulnerable (not used)
db.prepare(`SELECT * FROM contexts WHERE id = '${id}'`);
```

### 4. Isolated Execution
Each request runs in an isolated V8 sandbox:
- No shared memory between requests
- No filesystem access
- No network access (except via Workers APIs)
## Deployment Pipeline
```
1. Code commit to GitHub
   ↓
2. Run tests locally (npm test)
   ↓
3. Type check (npm run type-check)
   ↓
4. Format & lint (npm run format && npm run lint:fix)
   ↓
5. Deploy to Cloudflare (npm run deploy)
   ↓
6. Automatic edge distribution (300+ locations)
   ↓
7. Monitor via Cloudflare Analytics
```

**Manual deployment:**

```shell
npm run deploy
```

**Future:** GitHub Actions CI/CD for automatic deployment
## Local Development
```shell
# 1. Install dependencies
npm install

# 2. Start local dev server
npm run dev
# Server runs on http://localhost:8787

# 3. Test MCP endpoint
curl http://localhost:8787/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'

# 4. Watch for changes (auto-reload)
# Edit files → Wrangler automatically rebuilds
```

**Local bindings:**

- `env.DB`: Local SQLite database (`.wrangler/state/v3/d1`)
- `env.AI`: Remote Workers AI (uses the Cloudflare API)
## Migration from Node.js
WakeIQX is designed to be portable. To run on Node.js instead of Workers:
**Replace infrastructure adapters:**

- `D1ContextRepository` → `PostgresRepository`
- Workers AI → OpenAI API

**Change entry point:**

- `export default { fetch }` → `app.listen(3000)`

**Update dependencies:**

- Remove `@cloudflare/*` packages
- Add `express` or `fastify`

**Domain layer stays the same!** (Hexagonal architecture benefit)
