Technology Stack

WakeIQX is built on Cloudflare's edge computing platform with modern TypeScript tooling. Every technology choice prioritizes performance, developer experience, and semantic clarity.

Platform: Cloudflare Workers

Why Cloudflare Workers?

WakeIQX runs on Cloudflare Workers, a serverless platform that executes code at the edge (in datacenters close to users).

Benefits:

  • ⚡ Ultra-low latency: Code runs in 300+ locations worldwide
  • 🌍 Global distribution: Automatic edge deployment
  • 💰 Cost-effective: Pay per request, no idle costs
  • 🔒 Secure: Isolated V8 sandboxes
  • 📦 No cold starts: Instant execution

Trade-offs:

  • โš ๏ธ CPU time limits (50ms on free tier)
  • โš ๏ธ Memory limits (128MB per request)
  • โš ๏ธ No filesystem access (must use KV/D1/R2)

Deployment:

bash
npm run deploy

Runtime: Workers Runtime (V8)

Cloudflare Workers use the V8 JavaScript engine (same as Chrome/Node.js) with Web Standards APIs:

  • ✅ fetch() - HTTP requests
  • ✅ Request/Response - Web API objects
  • ✅ crypto - Web Crypto API
  • ✅ TextEncoder/TextDecoder - String encoding
  • ❌ No Node.js APIs (fs, path, http, etc.)

Example:

typescript
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Standard Web APIs
    const url = new URL(request.url);
    const body = await request.json();

    return new Response(JSON.stringify({ ok: true }), {
      headers: { 'Content-Type': 'application/json' }
    });
  }
};

Language: TypeScript 5.8

Why TypeScript?

  • ๐Ÿ” Type safety: Catch bugs at compile time
  • ๐Ÿ“– Self-documenting: Types serve as inline documentation
  • ๐Ÿงช Testability: Easier to mock and test
  • ๐Ÿ”ง Refactoring: Confident code changes
  • ๐ŸŽฏ Semantic Intent: Types carry meaning

Configuration (tsconfig.json):

json
{
  "compilerOptions": {
    "target": "ES2021",
    "module": "ESNext",
    "lib": ["ES2021"],
    "types": ["@cloudflare/workers-types"],
    "strict": true,
    "noEmit": true
  }
}

Strict mode enabled:

  • strictNullChecks: Prevent null/undefined errors
  • strictFunctionTypes: Type-safe function parameters
  • noImplicitAny: No implicit any types
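
The flags above change what the compiler accepts. As a minimal illustration (not from the WakeIQX codebase), under strictNullChecks an optional field must be checked before use:

```typescript
// Illustrative only: with "strict": true, the compiler forces explicit
// handling of possibly-undefined values before use.
interface Snapshot {
  id: string;
  summary?: string; // may be absent
}

function describeSnapshot(snapshot: Snapshot): string {
  // Without this check, accessing `snapshot.summary` directly would be a
  // compile error under strictNullChecks ("Object is possibly 'undefined'").
  if (snapshot.summary === undefined) {
    return `${snapshot.id}: (no summary)`;
  }
  return `${snapshot.id}: ${snapshot.summary}`;
}
```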

Database: Cloudflare D1

Cloudflare D1 is SQLite running at the edge.

Features:

  • ๐Ÿ—ƒ๏ธ SQL: Standard SQL syntax (SQLite dialect)
  • ๐Ÿ”„ ACID transactions: Data integrity
  • ๐Ÿ“Š Relational: Foreign keys, joins, indexes
  • ๐ŸŒ Global replication: Automatic read replicas
  • ๐Ÿ’พ Persistent: Durable storage (not ephemeral like KV)

Why D1 over other databases?

  • ✅ Co-located with Workers (same edge location = low latency)
  • ✅ No connection pooling needed (Workers handle it)
  • ✅ Standard SQL (portable to Postgres/MySQL later)
  • ✅ Free tier: 100k rows, 25MB per database

Example query:

typescript
const result = await env.DB
  .prepare('SELECT * FROM context_snapshots WHERE project = ? ORDER BY timestamp DESC LIMIT ?')
  .bind(project, limit)
  .all();

const snapshots = result.results;

Connection:

typescript
interface Env {
  DB: D1Database;  // Automatically injected by Workers
}

AI: Cloudflare Workers AI

Workers AI provides serverless access to LLMs at the edge.

Model: @cf/meta/llama-3.1-8b-instruct

  • 🧠 8 billion parameters: Fast and capable
  • 💬 Instruct-tuned: Follows instructions well
  • ⚡ Edge inference: Runs in Workers environment
  • 🆓 Free tier: 10,000 neurons/day

Usage in WakeIQX:

typescript
const response = await env.AI.run(
  '@cf/meta/llama-3.1-8b-instruct',
  {
    messages: [
      { role: 'system', content: 'You are a helpful assistant...' },
      { role: 'user', content: 'Summarize this conversation...' }
    ]
  }
);

const summary = response.response;

What we use AI for:

  • ๐Ÿ“ Generate context summaries (compress conversation)
  • ๐Ÿท๏ธ Extract semantic tags (for search)
  • ๐ŸŽฏ Rationale generation (explain WHY context was saved)
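
The env.AI.run() call shown earlier takes a messages array. A pure helper that assembles it for summarization might look like this sketch (buildSummaryMessages and its prompt wording are illustrative, not the actual codebase):

```typescript
interface ChatMessage {
  role: 'system' | 'user';
  content: string;
}

// Hypothetical helper: builds the messages array passed to env.AI.run()
// for conversation summarization. Names and wording are illustrative.
function buildSummaryMessages(conversation: string, maxWords: number): ChatMessage[] {
  return [
    {
      role: 'system',
      content: `You are a helpful assistant. Summarize conversations in at most ${maxWords} words.`
    },
    {
      role: 'user',
      content: `Summarize this conversation:\n\n${conversation}`
    }
  ];
}
```

Keeping prompt assembly in a pure function like this makes it testable without any AI binding.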

Protocol: Model Context Protocol (MCP)

MCP is an open protocol for connecting AI agents to external data sources.

MCP SDK: @modelcontextprotocol/sdk v1.19.1

Key concepts:

  • Tools: Functions AI agents can call
  • Resources: Data AI agents can access
  • Prompts: Pre-defined conversation starters

WakeIQX exposes 12 MCP tools:

  • 3 Core tools (save, load, search)
  • 3 Layer 1 tools (causality tracking)
  • 3 Layer 2 tools (memory management)
  • 3 Layer 3 tools (prediction scoring)

Example tool definition:

typescript
{
  name: "save_context",
  description: "Save conversation context with AI enhancement",
  inputSchema: {
    type: "object",
    properties: {
      project: { type: "string", description: "Project identifier" },
      content: { type: "string", description: "Context content to save" }
    },
    required: ["project", "content"]
  }
}

Transport: HTTP with Server-Sent Events (SSE)
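
SSE delivers events as newline-delimited text frames ("field: value" lines terminated by a blank line). A minimal frame parser, as an illustrative sketch and not the MCP SDK's implementation, could look like:

```typescript
// Minimal SSE frame parser (illustrative sketch, not the MCP SDK's code).
// A frame is a block of "field: value" lines; "data:" lines carry the payload.
interface SseEvent {
  event: string;
  data: string;
}

function parseSseFrame(frame: string): SseEvent {
  let event = 'message'; // SSE default event type
  const data: string[] = [];
  for (const line of frame.split('\n')) {
    if (line.startsWith('event:')) {
      event = line.slice('event:'.length).trim();
    } else if (line.startsWith('data:')) {
      data.push(line.slice('data:'.length).trim());
    }
    // Other fields (id:, retry:) and comment lines (:) are ignored here.
  }
  return { event, data: data.join('\n') };
}
```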


Development Tools

Build Tool: Wrangler

Wrangler is Cloudflare's CLI for Workers development.

Key commands:

bash
npm run dev        # Start local dev server
npm run deploy     # Deploy to Cloudflare
npm run cf-typegen # Generate TypeScript types

Configuration (wrangler.toml):

toml
name = "semantic-wake-intelligence-mcp"
main = "src/index.ts"
compatibility_date = "2024-10-01"

[[d1_databases]]
binding = "DB"
database_name = "wake-intelligence"
database_id = "..."

[ai]
binding = "AI"

Code Quality: Biome

Biome replaces ESLint + Prettier with a single fast tool.

Features:

  • 🎨 Formatting: Code style (like Prettier)
  • 🔍 Linting: Code quality (like ESLint)
  • ⚡ Fast: Written in Rust, far faster than JS-based tools
  • 🔧 Auto-fix: Automatically fix issues

Commands:

bash
npm run format    # Format code
npm run lint:fix  # Fix linting issues

Configuration (biome.json):

json
{
  "formatter": {
    "enabled": true,
    "indentStyle": "space",
    "indentWidth": 2
  },
  "linter": {
    "enabled": true,
    "rules": {
      "recommended": true
    }
  }
}

Testing: Vitest

Vitest is a fast, modern test framework.

Features:

  • ⚡ Fast: Native ESM, parallel execution
  • 🔄 Watch mode: Auto-rerun on changes
  • 📊 Coverage: Built-in code coverage
  • 🎨 UI: Visual test runner

Commands:

bash
npm test              # Run all tests
npm run test:watch    # Watch mode
npm run test:ui       # Visual UI
npm run test:coverage # Coverage report

Example test:

typescript
import { describe, it, expect } from 'vitest';
import { CausalityService } from './CausalityService';

describe('CausalityService', () => {
  it('builds causal chain correctly', async () => {
    const mockRepo = createMockRepository();
    const service = new CausalityService(mockRepo);

    const chain = await service.buildCausalChain('snapshot-id');

    expect(chain).toHaveLength(3);
    expect(chain[0].snapshot.id).toBe('root-id');
  });
});

Type Generation: wrangler types

Automatically generate TypeScript types for Workers bindings:

bash
npm run cf-typegen

Generated (worker-configuration.d.ts):

typescript
interface Env {
  DB: D1Database;
  AI: Ai;
}

Dependencies

Production Dependencies

Package                            Version   Purpose
@modelcontextprotocol/sdk          1.19.1    MCP protocol implementation
@cloudflare/agents                 0.0.16    Cloudflare agents framework
zod                                3.25.76   Runtime type validation

Why Zod?

  • Validates API inputs at runtime
  • TypeScript types from schema
  • User-friendly error messages

Example:

typescript
import { z } from 'zod';

const SaveContextInput = z.object({
  project: z.string().min(1),
  content: z.string().min(1),
  source: z.string().optional(),
  metadata: z.record(z.unknown()).optional()
});

type SaveContextInput = z.infer<typeof SaveContextInput>;

// Validate at runtime
const input = SaveContextInput.parse(data);

Development Dependencies

Package                            Version   Purpose
typescript                         5.8.3     TypeScript compiler
wrangler                           4.42.1    Cloudflare CLI
vitest                             3.2.4     Test framework
@biomejs/biome                     2.2.5     Linting & formatting
@cloudflare/vitest-pool-workers    0.9.11    Test Workers locally
@vitest/coverage-v8                3.2.4     Code coverage

Package Manager: npm

WakeIQX uses npm (Node Package Manager):

bash
npm install           # Install dependencies
npm run dev           # Start dev server
npm run deploy        # Deploy to production
npm test              # Run tests

Why npm over yarn/pnpm?

  • ✅ Default for most projects
  • ✅ Works everywhere
  • ✅ No extra installation needed

Version Control: Git + GitHub

Repository: semanticintent/semantic-wake-intelligence-mcp

Branching strategy:

  • main: Production branch
  • Feature branches: feature/layer-3-propagation
  • Release tags: v3.0.0

Commit conventions:

feat: Add Layer 3 Propagation Engine
fix: Correct memory tier calculation
docs: Update architecture documentation
test: Add causality chain tests
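
These prefixes follow the Conventional Commits style. A small subject-line checker, as an illustrative sketch (not tooling that exists in the repo), could enforce them:

```typescript
// Illustrative Conventional Commits subject check (not part of the repo).
// Accepts "type: subject" and "type(scope): subject" forms.
const COMMIT_TYPES = ['feat', 'fix', 'docs', 'test', 'chore', 'refactor'];

function isConventionalCommit(subject: string): boolean {
  const match = /^([a-z]+)(\([\w-]+\))?: .+/.exec(subject);
  return match !== null && COMMIT_TYPES.includes(match[1]);
}
```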

Architecture Patterns

Hexagonal Architecture (Ports & Adapters)

See Hexagonal Architecture →

Layers:

  • Domain (core business logic)
  • Application (orchestration)
  • Infrastructure (D1, AI, HTTP)
  • Presentation (HTTP endpoints)
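
As a minimal sketch of the ports-and-adapters idea (the names here are illustrative, not the actual WakeIQX interfaces): the domain depends only on an interface, and infrastructure supplies the implementation.

```typescript
// Illustrative port: the domain layer depends only on this interface.
interface ContextRepository {
  findByProject(project: string): Promise<string[]>;
}

// Illustrative adapter: an in-memory implementation for tests; in production
// a D1-backed adapter would implement the same port.
class InMemoryContextRepository implements ContextRepository {
  constructor(private store: Map<string, string[]>) {}
  async findByProject(project: string): Promise<string[]> {
    return this.store.get(project) ?? [];
  }
}

// Domain service: orchestrates via the port, unaware of D1 or HTTP.
class ContextService {
  constructor(private repo: ContextRepository) {}
  async latest(project: string): Promise<string | undefined> {
    const all = await this.repo.findByProject(project);
    return all[0];
  }
}
```

Swapping the adapter (in-memory, D1, Postgres) requires no change to ContextService.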

Semantic Intent Pattern

See Semantic Intent Pattern →

Every file/function includes:

typescript
/**
 * 🎯 SEMANTIC INTENT: [What this means]
 *
 * PURPOSE: [Why it exists]
 *
 * RESPONSIBILITY: [What it does]
 */

Performance Optimizations

1. Edge Execution

Code runs in 300+ datacenters worldwide:

  • User in Tokyo → Runs in Tokyo
  • User in London → Runs in London
  • Result: 10-50ms latency globally

2. Database Indexes

All common queries have indexes:

sql
CREATE INDEX idx_project_timestamp ON context_snapshots(project, timestamp);

Result: 10x faster queries

3. Batch Operations

Memory tier recalculation processes 100 contexts per batch:

typescript
const batch = contexts.slice(0, 100);
await Promise.all(batch.map(ctx => updateTier(ctx)));

Result: Parallel processing
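
In practice the slice shown above would sit inside a loop over all batches. A hedged sketch of full-batch processing (chunk and processAll are illustrative names, not the actual code):

```typescript
// Illustrative batching helper: split a list into fixed-size chunks so each
// batch can be processed with Promise.all without doing everything at once.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Hypothetical usage: process 100 contexts at a time, parallel within a batch.
async function processAll<T>(items: T[], worker: (item: T) => Promise<void>): Promise<void> {
  for (const batch of chunk(items, 100)) {
    await Promise.all(batch.map(worker));
  }
}
```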

4. Prediction Caching

Predictions refresh only when stale (> 24 hours):

typescript
if (isPredictionStale(context.last_predicted, 24)) {
  await recalculatePrediction(context);
}

Result: Avoid unnecessary AI calls
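
isPredictionStale is not shown in full above; a plausible implementation under the stated 24-hour rule might look like this (illustrative, assuming last_predicted is stored as an ISO-8601 timestamp):

```typescript
// Illustrative staleness check: a prediction is stale when more than
// maxAgeHours have elapsed since lastPredicted (an ISO-8601 timestamp),
// or when no prediction has been made yet.
function isPredictionStale(lastPredicted: string | null, maxAgeHours: number): boolean {
  if (lastPredicted === null) return true; // never predicted → stale
  const ageMs = Date.now() - new Date(lastPredicted).getTime();
  return ageMs > maxAgeHours * 60 * 60 * 1000;
}
```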


Monitoring & Observability

Cloudflare Analytics

Built-in metrics:

  • Request count
  • Error rate
  • Response time (p50, p95, p99)
  • CPU usage

Logging

Standard console.log:

typescript
console.log('Context saved:', { id, project });
console.error('Failed to save:', error);

Logs visible in:

  • wrangler tail (live logs)
  • Cloudflare dashboard
  • Real-time streaming

Security

1. CORS Middleware

Controlled cross-origin access:

typescript
headers: {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type'
}
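
The headers above can be applied by a small helper that merges them into a response's existing headers. A sketch (withCors is a hypothetical name, not necessarily the actual middleware):

```typescript
// Illustrative helper: merge CORS headers into an existing header record.
// (withCors is a hypothetical name for this sketch.)
const CORS_HEADERS: Record<string, string> = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type'
};

function withCors(headers: Record<string, string>): Record<string, string> {
  // CORS headers win on conflict, other headers pass through unchanged.
  return { ...headers, ...CORS_HEADERS };
}
```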

2. Input Validation

Zod schemas validate all inputs:

typescript
const input = SaveContextInput.parse(data);
// Throws if invalid

3. SQL Injection Prevention

Prepared statements with parameter binding:

typescript
// ✅ Safe
db.prepare('SELECT * FROM contexts WHERE id = ?').bind(id);

// ❌ Vulnerable (not used)
db.prepare(`SELECT * FROM contexts WHERE id = '${id}'`);

4. Isolated Execution

Each request runs in isolated V8 sandbox:

  • No shared memory between requests
  • No filesystem access
  • No network access (except via Workers APIs)

Deployment Pipeline

1. Code commit to GitHub
     ↓
2. Run tests locally (npm test)
     ↓
3. Type check (npm run type-check)
     ↓
4. Format & lint (npm run format && npm run lint:fix)
     ↓
5. Deploy to Cloudflare (npm run deploy)
     ↓
6. Automatic edge distribution (300+ locations)
     ↓
7. Monitor via Cloudflare Analytics

Manual deployment:

bash
npm run deploy

Future: GitHub Actions CI/CD for automatic deployment


Local Development

bash
# 1. Install dependencies
npm install

# 2. Start local dev server
npm run dev

# Server runs on http://localhost:8787

# 3. Test MCP endpoint
curl http://localhost:8787/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'

# 4. Watch for changes (auto-reload)
# Edit files → Wrangler automatically rebuilds

Local bindings:

  • env.DB: Local SQLite database (.wrangler/state/v3/d1)
  • env.AI: Remote Workers AI (uses Cloudflare API)

Migration from Node.js

WakeIQX is designed to be portable. To run on Node.js instead of Workers:

  1. Replace infrastructure adapters:

    • D1ContextRepository → PostgresRepository
    • Workers AI → OpenAI API
  2. Change entry point:

    • export default { fetch } → app.listen(3000)
  3. Update dependencies:

    • Remove @cloudflare/* packages
    • Add express or fastify

Domain layer stays the same! (Hexagonal architecture benefit)
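
A minimal sketch of why the core survives the move (all names illustrative): keep the handler transport-agnostic, then wire it to either runtime at the edges.

```typescript
// Illustrative transport-agnostic core: plain data in, plain data out.
// Neither Workers' Request/Response nor Express's req/res appears here.
interface AppRequest {
  path: string;
  body: unknown;
}

interface AppResponse {
  status: number;
  body: unknown;
}

async function handleApp(req: AppRequest): Promise<AppResponse> {
  if (req.path === '/health') return { status: 200, body: { ok: true } };
  return { status: 404, body: { error: 'not found' } };
}

// Workers-style wiring would call handleApp from `export default { fetch }`;
// Node-style wiring would call it from an Express/Fastify route handler.
```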


Further Reading