Open APIs, AI orchestration tools, and distributed compute infrastructure for developers building the next generation of intelligent applications.
Chat-based AI tutoring with automatic model routing. Supports both cloud and local Ollama instances.
Access distributed GPU compute or monetize your idle hardware. Transparent pricing and crypto payments.
All code is open source. Training exporters, eval scripts, and integration tools available on GitHub.
Track interactions and feedback quality, and export data for fine-tuning. Real-time analytics.
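As a sketch only (the Interaction shape and the exportForFineTuning helper below are assumptions for illustration, not part of the published tooling), exported interactions can be written as chat-style JSONL for fine-tuning:

// Sketch: convert logged tutor interactions into fine-tuning JSONL.
// The Interaction shape and the rating threshold are illustrative assumptions.
import { writeFileSync } from 'node:fs';

interface Interaction {
  prompt: string;
  reply: string;
  rating: number; // feedback score, e.g. 1-5
}

function exportForFineTuning(interactions: Interaction[], path: string): void {
  const lines = interactions
    .filter((i) => i.rating >= 4) // keep only well-rated exchanges
    .map((i) => JSON.stringify({
      messages: [
        { role: 'user', content: i.prompt },
        { role: 'assistant', content: i.reply },
      ],
    }));
  writeFileSync(path, lines.join('\n') + '\n');
}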
Run models locally with Ollama. Your data stays on your infrastructure. GDPR compliant.
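For reference, a local Ollama instance exposes its standard REST API on port 11434; the snippet below calls it directly, so no prompt data leaves the machine (the llama3 model name is just an example and must be pulled first):

// Chat with a locally running Ollama instance via its built-in HTTP API.
const res = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3', // any model already pulled locally
    messages: [{ role: 'user', content: 'Explain FPGAs' }],
    stream: false,   // ask for a single JSON response
  }),
});
const data = await res.json();
console.log(data.message.content);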
Join developers building decentralized AI. Share compute, models, and learning resources.
// Route AI requests with automatic fallback
import { generateTutorReply } from '@powerprogress/ai';
const { reply, engine } = await generateTutorReply([
  { role: 'user', content: 'Explain FPGAs' }
]);
console.log(`Answered by: ${engine}`); // "primary" or "ollama"
console.log(reply);

npm install @powerprogress/ai-sdk
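The primary-then-Ollama routing shown above can also be sketched without the SDK: try a hosted endpoint first and fall back to local Ollama if it fails. PRIMARY_URL and its response shape are placeholders for this sketch; only the Ollama URL and payload are standard.

// Sketch of primary-then-local fallback routing; PRIMARY_URL is hypothetical.
type Message = { role: 'system' | 'user' | 'assistant'; content: string };

const PRIMARY_URL = 'https://api.example.com/v1/chat'; // placeholder hosted endpoint
const OLLAMA_URL = 'http://localhost:11434/api/chat';  // standard local Ollama API

async function routedReply(messages: Message[]): Promise<{ reply: string; engine: 'primary' | 'ollama' }> {
  try {
    const res = await fetch(PRIMARY_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ messages }),
    });
    if (!res.ok) throw new Error(`primary returned ${res.status}`);
    const data = await res.json();
    return { reply: data.reply, engine: 'primary' }; // response field assumed for the sketch
  } catch {
    // Primary unreachable or unhealthy: fall back to the local Ollama instance.
    const res = await fetch(OLLAMA_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model: 'llama3', messages, stream: false }),
    });
    const data = await res.json();
    return { reply: data.message.content, engine: 'ollama' };
  }
}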