Introduction
Seasoned is a modern recipe management application built with a microservices architecture using Cloudflare Workers. The application provides recipe clipping, storage, search, and management capabilities through a collection of specialized workers.
🔄 Auto-Update Information
This documentation is automatically generated from your source code. When you update your workers, run the generation script to update the docs:
npm run docs:generate
Last Generated: 2025-08-16T00:59:53.310Z
🎯 Recipe Clipping
AI-powered recipe extraction from any recipe website
🔍 Graph Search
Advanced recipe search with ingredient and tag relationships
💾 Multi-Storage
Hybrid storage using KV, D1, and R2 for optimal performance
🚀 Edge Computing
Global deployment with Cloudflare Workers for low latency
Architecture Overview
Frontend (React + Vite)
Modern web interface for recipe management
API Gateway & Workers
- Recipe Search DB: Graph Search, FTS, Node/Edge Management
- Clipped Recipe DB: Recipe CRUD, Image Uploads, Health Monitoring
- Recipe Scraper: JSON-LD Extraction, Batch Processing, KV Storage
- Recipe Clipper: AI Extraction, GPT-4o-mini, Smart Caching
- Recipe Crawler: Batch Processing, Health Monitoring, Progress Tracking
Storage Layer
- Cloudflare KV: Caching and temporary storage
- Cloudflare D1: SQLite database for recipes
- Cloudflare R2: Image storage
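The typical end-to-end flow: the frontend posts a recipe URL to the Recipe Clipper's /clip endpoint, the clipper extracts and caches the recipe in KV, and the search worker indexes it in D1. A minimal sketch of that first call from the frontend; the worker hostname is illustrative, while the /clip route and { url } body shape match the clipper preview later in this document:

const response = await fetch('https://recipe-clipper.example.workers.dev/clip', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ url: 'https://example.com/recipes/pad-thai' })
});
const recipe = await response.json();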
Recipe Search DB Worker
Graph Search, FTS, Node/Edge Management
📍 Location
recipe-search-db/
🗄️ Database
D1
🔧 Main Features
Graph Search, FTS, Node/Edge Management
Key Functions
createNode()
Node operations
getNodes()
List all nodes
getNode()
Retrieve a single node by ID
updateNode()
Update an existing node
deleteNode()
Delete a node
createEdge()
Edge operations
getEdges()
Retrieve edges
deleteEdge()
Delete an edge
searchNodes()
Search operations
getGraph()
Graph operations
migrateKVToSearch()
Migration operations
processKVRecipe()
Process a single KV recipe during migration
createRecipeNodeFromKV()
Create a recipe node from KV recipe data
createIngredientNodesFromKV()
Create ingredient nodes from KV recipe data
createTagNodesFromKV()
Create tag nodes from KV recipe data
createOrGetNodeFromKV()
Create a node from KV data, or return the existing one
createEdgeFromKV()
Create an edge from KV data
checkExistingNode()
Check whether a node already exists
chunkArray()
Split an array into chunks (utility for KV migration)
normalizeIngredientName()
Normalize an ingredient name
categorizeIngredient()
Categorize an ingredient
categorizeTag()
Categorize a tag
extractQuantity()
Extract the quantity from an ingredient string
extractUnit()
Extract the unit from an ingredient string
decompressKVData()
Decompress KV data (similar to shared/kv-storage.js)
debugKVStorage()
Debug function to inspect KV storage
Source Code Preview
// Recipe Search Database Worker
// Provides graph-based search capabilities for recipes
export default {
async fetch(request, env, ctx) {
const url = new URL(request.url);
const path = url.pathname;
const method = request.method;
// CORS headers
const corsHeaders = {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
};
try {
// Handle preflight requests
if (method === 'OPTIONS') {
return new Response(null, { headers: corsHeaders });
}
// Route handling
if (path === '/api/nodes' && method === 'POST') {
return await createNode(request, env, corsHeaders);
} else if (path === '/api/nodes' && method === 'GET') {
return await getNodes(request, env, corsHeaders);
} else if (path.startsWith('/api/nodes/') && method === 'GET') {
const nodeId = path.split(...
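A hedged usage sketch against the node routes visible in the preview; the base URL is illustrative and the node payload fields are assumptions, not the worker's exact schema:

const BASE = 'https://recipe-search-db.example.workers.dev'; // illustrative

// Create a node (POST /api/nodes); field names are assumed
await fetch(`${BASE}/api/nodes`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ type: 'recipe', name: 'Pad Thai' })
});

// List nodes (GET /api/nodes)
const nodes = await (await fetch(`${BASE}/api/nodes`)).json();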
Clipped Recipe DB Worker
Recipe CRUD, Image Uploads, Health Monitoring
📍 Location
clipped-recipe-db-worker/
🗄️ Database
D1
🔧 Main Features
Recipe CRUD, Image Uploads, Health Monitoring
API Endpoints
GET /health
Health check endpoint
GET /recipes
Get all recipes
POST /recipe
Create a new recipe
POST /upload-image
Upload image
POST /clip
Clip recipe from URL
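A quick usage sketch for these endpoints; the worker hostname is illustrative and the POST /recipe payload shape is an assumption:

const API = 'https://clipped-recipe-db.example.workers.dev'; // illustrative

// Health check (GET /health)
const health = await (await fetch(`${API}/health`)).json();

// List recipes (GET /recipes)
const recipes = await (await fetch(`${API}/recipes`)).json();

// Create a recipe (POST /recipe); payload fields are assumed
await fetch(`${API}/recipe`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'Pad Thai', ingredients: [], instructions: [] })
});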
Key Functions
getRecipes()
Database functions
getRecipeById()
Retrieve a recipe by ID
createRecipe()
Create a new recipe
updateRecipe()
Update an existing recipe
deleteRecipe()
Delete a recipe
updateRecipeImage()
Update a recipe's image reference
uploadImage()
Image upload function
extractRecipeFromUrl()
Recipe extraction from URL
decodeHtmlEntities()
Helper function to decode HTML entities
checkSystemHealth()
Health check function
Source Code Preview
// Recipe API with SQLite (D1) and image upload support
export default {
async fetch(request, env) {
const url = new URL(request.url);
const pathname = url.pathname;
// CORS headers
const corsHeaders = {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
};
// Handle preflight requests
if (request.method === 'OPTIONS') {
return new Response(null, { headers: corsHeaders });
}
try {
// Health check endpoint
if (pathname === '/health' && request.method === 'GET') {
try {
// Perform health checks
const healthStatus = await checkSystemHealth(env);
return new Response(JSON.stringify(healthStatus), {
status: 200,
headers: { ...corsHeaders, 'Content-Type': 'application/json' }
});
} catch (error) {
console...
Recipe Scraper Worker
JSON-LD Extraction, Batch Processing, KV Storage
📍 Location
recipe-scraper/
🗄️ Database
KV
🔧 Main Features
JSON-LD Extraction, Batch Processing, KV Storage
Key Functions
decodeHtmlEntities()
Decode HTML entities
normalizeIngredients()
Normalize ingredients to ensure they're always arrays
normalizeInstructions()
Normalize instructions
isRecipeType()
Helper to check if a type value represents a schema.org/Recipe
validateRecipeSchema()
Validate if the JSON-LD object is a valid schema.org/Recipe
extractRecipeData()
Extract recipe data from JSON-LD
processRecipeUrl()
Process a single URL
Source Code Preview
/**
* Recipe Scraper
* Scrapes recipe data from URLs using JSON-LD structured data
* Stores recipes in KV database with hashed URL as key
*/
import {
generateRecipeId,
saveRecipeToKV,
getRecipeFromKV,
listRecipesFromKV,
deleteRecipeFromKV
} from '../shared/kv-storage.js';
// Decode HTML entities
function decodeHtmlEntities(text) {
if (typeof text !== 'string') return text;
const entities = {
'&amp;': '&',
'&lt;': '<',
'&gt;': '>',
'&quot;': '"',
'&#39;': "'",
'&apos;': "'",
'&nbsp;': ' ',
'&copy;': '©',
'&reg;': '®',
'&trade;': '™'
};
return text.replace(/&[#\w]+;/g, entity => {
return entities[entity] || entity;
});
}
// Normalize ingredients to ensure they're always arrays
function normalizeIngredients(ingredients) {
if (!ingredients) return [];
if (typeof ingredients === 'string') return [decodeHtmlEntities(ingredients)];
if (Array.isArray(ingredients)) {
return ingredients.map(ing => {
...
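In JSON-LD, @type can be a plain string, a schema.org URL, or an array of types, so the recipe check has to normalize all three. A minimal sketch of what isRecipeType() needs to handle, written as an assumption about its behavior rather than the worker's exact code:

function isRecipeType(type) {
  if (!type) return false;
  // @type may be a single value ("Recipe") or an array (["NewsArticle", "Recipe"])
  const types = Array.isArray(type) ? type : [type];
  // Strip any schema.org URL prefix (e.g. "https://schema.org/Recipe") and compare
  return types.some(
    t => typeof t === 'string' && t.replace(/^.*[/#]/, '').toLowerCase() === 'recipe'
  );
}

// isRecipeType('Recipe')                    => true
// isRecipeType(['NewsArticle', 'Recipe'])   => true
// isRecipeType('https://schema.org/Recipe') => true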
Recipe Clipper Worker
AI Extraction, GPT-4o-mini, Smart Caching
📍 Location
clipper/
🗄️ Database
KV + AI
🔧 Main Features
AI Extraction, GPT-4o-mini, Smart Caching
API Endpoints
POST /clip
Clip recipe from URL using GPT-OSS-20B
GET /cached
Get cached recipe by URL
DELETE /cached
Delete cached recipe by URL
GET /health
Health check endpoint
Key Functions
extractRecipeWithGPT()
Extract recipe using GPT-OSS-20B model
extractDescriptionFromHTML()
Extract description from HTML
extractAuthorFromHTML()
Extract author from HTML
extractDateFromHTML()
Extract publication date from HTML
extractYieldFromHTML()
Extract recipe yield (servings) from HTML
extractCategoryFromHTML()
Extract recipe category from HTML
extractCuisineFromHTML()
Extract recipe cuisine from HTML
extractPrepTimeFromHTML()
Extract prep time from HTML
extractCookTimeFromHTML()
Extract cook time from HTML
extractTotalTimeFromHTML()
Extract total time from HTML
convertTimeToISO8601()
Convert time string to ISO 8601 format
extractKeywordsFromHTML()
Extract keywords from HTML
extractNutritionFromHTML()
Extract nutrition information from HTML
extractRatingFromHTML()
Extract rating information from HTML
extractVideoFromHTML()
Extract video information from HTML
extractInstructionsFromHTML()
Fallback function to extract instructions from HTML when AI model fails
cleanJsonContent()
Clean up common JSON formatting issues
cleanHtmlForGPT()
Clean HTML content for GPT processing - optimized for token reduction
extractRecipeContent()
Extract recipe-specific content to prioritize important parts
callGPTModel()
Call GPT-OSS-20B model using Cloudflare Workers AI
extractRecipeFromAIResponse()
Extract recipe data from AI response (for testing)
extractRecipeFromJsonLd()
Extract recipe from JSON-LD structured data
findRecipeInJsonLd()
Find Recipe object in JSON-LD data (handles nested structures)
normalizeJsonLdRecipe()
Normalize JSON-LD recipe to our expected format
normalizeImage()
Helper function to normalize image field
normalizeAuthor()
Helper function to normalize author field
normalizeYield()
Helper function to normalize yield field
normalizeKeywords()
Helper function to normalize keywords
normalizeIngredients()
Helper function to normalize ingredients
normalizeInstructions()
Helper function to normalize instructions
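Schema.org expects durations in ISO 8601 form (e.g. PT1H30M), so human-readable times have to be converted. A minimal sketch of the kind of conversion convertTimeToISO8601() performs, assuming simple "N hours M minutes" inputs; the worker's actual parser may cover more formats:

function convertTimeToISO8601(timeString) {
  if (typeof timeString !== 'string') return null;
  // Match "1 hour", "2 hrs", "30 minutes", "45 mins", etc.
  const hours = timeString.match(/(\d+)\s*h(?:ou)?rs?/i);
  const minutes = timeString.match(/(\d+)\s*m(?:in(?:ute)?s?)?/i);
  if (!hours && !minutes) return null;
  let duration = 'PT';
  if (hours) duration += `${hours[1]}H`;
  if (minutes) duration += `${minutes[1]}M`;
  return duration;
}

// convertTimeToISO8601('1 hour 30 minutes') => 'PT1H30M'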
Source Code Preview
// Recipe Clipper Worker using Cloudflare Workers AI with GPT-4o-mini model
import {
generateRecipeId,
saveRecipeToKV,
getRecipeFromKV,
deleteRecipeFromKV
} from '../../shared/kv-storage.js';
export default {
async fetch(request, env) {
const url = new URL(request.url);
const pathname = url.pathname;
// CORS headers
const corsHeaders = {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
};
// Handle preflight requests
if (request.method === 'OPTIONS') {
return new Response(null, { headers: corsHeaders });
}
try {
// Clip recipe from URL using GPT-OSS-20B
if (pathname === '/clip' && request.method === 'POST') {
const body = await request.json();
const pageUrl = body.url;
if (!pageUrl) {
return new Response('URL is required', {
s...
Recipe Crawler Worker
Batch Processing, Health Monitoring, Progress Tracking
📍 Location
crawler/
🗄️ Database
N/A
🔧 Main Features
Batch Processing, Health Monitoring, Progress Tracking
Source Code Preview
#!/usr/bin/env python3
"""
Recipe Crawler
A Python script that feeds a list of URLs to the recipe scraper worker
"""
import requests
import json
import time
import argparse
import sys
from typing import List, Dict, Any
from urllib.parse import urlparse, urljoin
import logging
import re
from bs4 import BeautifulSoup
# Configure logging
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s',
handlers=[
logging.FileHandler('crawler.log'),
logging.StreamHandler(sys.stdout)
]
)
logger = logging.getLogger(__name__)
class RecipeCrawler:
def __init__(self, scraper_url: str = "https://recipe-scraper.nolanfoster.workers.dev"):
"""
Initialize the recipe crawler
Args:
scraper_url: URL of the recipe scraper worker
"""
self.scraper_url = scraper_url.rstrip('/')
self.session = requests.Session()
self.session.headers.update({
'User-Agent': 'Re...
KV Storage Library
Compression, ID Generation, CRUD Operations
📍 Location
shared/
🔧 Main Features
Compression, ID Generation, CRUD Operations
Key Functions
generateRecipeId()
Utility function to generate a unique ID from URL
compressData()
Compress data using gzip and encode as base64
decompressData()
Decompress data using gzip from base64
saveRecipeToKV()
Save recipe to KV storage
getRecipeFromKV()
Get recipe from KV storage
listRecipesFromKV()
List all recipes from KV storage (with pagination)
deleteRecipeFromKV()
Delete recipe from KV storage
recipeExistsInKV()
Check if recipe exists in KV storage
getRecipeMetadata()
Get recipe metadata without full data
Source Code Preview
/**
* Shared KV Storage Library
* Common functions for recipe storage and retrieval using Cloudflare KV
* Used by both recipe-clipper and recipe-scraper workers
*/
// Utility function to generate a unique ID from URL
export function generateRecipeId(url) {
const encoder = new TextEncoder();
const data = encoder.encode(url);
return crypto.subtle.digest('SHA-256', data)
.then(hash => {
const hashArray = Array.from(new Uint8Array(hash));
return hashArray.map(b => b.toString(16).padStart(2, '0')).join('');
});
}
// Compress data using gzip and encode as base64
export async function compressData(data) {
const encoder = new TextEncoder();
const jsonString = JSON.stringify(data);
const jsonBytes = encoder.encode(jsonString);
// Use CompressionStream for gzip compression
const cs = new CompressionStream('gzip');
const writer = cs.writable.getWriter();
const reader = cs.readable.getReader();
writer.write(jsonBytes);
writer.close();
co...
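The preview cuts off before the decompression side; a minimal sketch of the counterpart, assuming the base64-wrapped gzip format that compressData() above produces:

export async function decompressData(base64String) {
  // Decode base64 back into the compressed bytes
  const compressed = Uint8Array.from(atob(base64String), c => c.charCodeAt(0));
  // Mirror compressData(): gunzip via DecompressionStream
  const ds = new DecompressionStream('gzip');
  const stream = new Blob([compressed]).stream().pipeThrough(ds);
  const jsonString = await new Response(stream).text();
  return JSON.parse(jsonString);
}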
Testing
Each worker includes comprehensive testing to ensure reliability and functionality.
Test Structure
Unit Tests
- Core functionality testing
- API endpoint validation
- Error handling verification
Integration Tests
- Worker-to-worker communication
- Database operations
- Storage system integration
End-to-End Tests
- Complete workflow testing
- Real URL processing
- Performance validation
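A minimal unit-test sketch of the shared CORS preflight behavior, assuming a Vitest-style runner and a hypothetical entry point; each worker's actual test setup may differ:

import { describe, it, expect } from 'vitest';
import worker from './index.js'; // hypothetical entry point

describe('CORS preflight', () => {
  it('answers OPTIONS with permissive CORS headers', async () => {
    const request = new Request('https://example.com/recipes', { method: 'OPTIONS' });
    const response = await worker.fetch(request, {}, {});
    expect(response.status).toBe(200);
    expect(response.headers.get('Access-Control-Allow-Origin')).toBe('*');
  });
});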
Running Tests
Recipe Search DB
cd recipe-search-db
npm test
Clipped Recipe DB
cd clipped-recipe-db-worker
npm test
Recipe Scraper
cd recipe-scraper
npm test
Recipe Clipper
cd clipper
npm test
Recipe Crawler
cd crawler
npm test
Deployment
All workers are designed for easy deployment to Cloudflare Workers with proper environment configuration.
Deployment Commands
1. Build & Deploy
npm run deploy
2. Environment Variables
wrangler secret put MY_SECRET
3. Database Schema
wrangler d1 execute DB --file=./schema.sql
Environment Configuration
Each worker requires specific environment variables and bindings:
- D1 Databases: For SQLite storage
- KV Namespaces: For caching and temporary storage
- R2 Buckets: For image storage
- AI Bindings: For Workers AI functionality
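Inside a worker, each binding surfaces as a property on the env object. A sketch with illustrative binding names (the real names are defined in each worker's wrangler.toml):

export default {
  async fetch(request, env) {
    // D1: SQLite query (binding name DB is illustrative)
    const { results } = await env.DB.prepare('SELECT id, name FROM recipes LIMIT 5').all();
    // KV: cached recipe lookup (binding name RECIPE_STORAGE is illustrative)
    const cached = await env.RECIPE_STORAGE.get('recipe:abc123');
    // R2: image object fetch (binding name RECIPE_IMAGES is illustrative)
    const image = await env.RECIPE_IMAGES.get('images/abc123.jpg');
    return new Response(JSON.stringify({
      recipeCount: results.length,
      cacheHit: cached !== null,
      hasImage: image !== null
    }), { headers: { 'Content-Type': 'application/json' } });
  }
};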
Troubleshooting
Common issues and solutions for the Seasoned recipe app workers.
Database Connection Errors
Symptoms: 500 errors, database not found
Solutions:
- Verify D1 database ID in wrangler.toml
- Check database binding names
- Ensure database exists and is accessible
KV Storage Issues
Symptoms: Storage failures, data corruption
Solutions:
- Check KV namespace bindings
- Verify namespace permissions
- Check for data compression issues
CORS Errors
Symptoms: Frontend can't access API
Solutions:
- Verify CORS headers in worker responses
- Check frontend origin configuration
- Ensure preflight requests are handled
Image Upload Failures
Symptoms: 500 errors on image upload
Solutions:
- Check R2 bucket permissions
- Verify bucket binding names
- Check file size limits
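For reference when debugging, a minimal R2 upload sketch; the bucket binding name RECIPE_IMAGES and the key scheme are assumptions, not the worker's actual code:

async function uploadImage(env, key, file) {
  // Store the image bytes along with their content type
  await env.RECIPE_IMAGES.put(key, file.stream(), {
    httpMetadata: { contentType: file.type }
  });
  return key;
}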
Debug Mode
Enable debug logging to troubleshoot issues:
# Stream live logs from a deployed worker
wrangler tail
# Test endpoints locally
wrangler dev