Kyro

AI Providers

Configure AI providers for KyroJudge: OpenAI, Anthropic, Gemini, Azure, and Ollama.

Kyro uses @kyro/shared's ProviderFactory to create provider instances. Providers implement the AIProvider interface and handle all API communication.

ProviderFactory

import { ProviderFactory } from '@kyro/shared';
 
const provider = ProviderFactory.create({
  provider: 'openai',    // Provider name
  model: 'gpt-4o',       // Model ID
  apiKey: '...',         // API key (or use env vars)
  clientConfig: {},      // SDK client options (optional)
  chatConfig: {},        // Per-request options (optional)
});
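The options above can be summarized as a TypeScript shape. This is a sketch inferred from the snippet, not the exact exported types of @kyro/shared, which may differ:

```typescript
// Sketch of the options accepted by ProviderFactory.create, inferred from
// the example above. The real types in @kyro/shared may be stricter.
interface ProviderOptions {
  provider: 'openai' | 'anthropic' | 'gemini' | 'azure' | 'ollama';
  model: string;                          // model ID, e.g. 'gpt-4o'
  apiKey?: string;                        // falls back to env vars when omitted
  clientConfig?: Record<string, unknown>; // SDK client options
  chatConfig?: Record<string, unknown>;   // per-request options
}

const opts: ProviderOptions = {
  provider: 'openai',
  model: 'gpt-4o',
  chatConfig: { temperature: 0 },
};
```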

OpenAI

const provider = ProviderFactory.create({
  provider: 'openai',
  model: 'gpt-4o',
  apiKey: process.env.OPENAI_API_KEY,
  chatConfig: {
    temperature: 0,
    max_tokens: 512,
  },
});

Supported models: gpt-4o, gpt-4o-mini, gpt-4-turbo, o1, o1-mini, and all other OpenAI chat models.


Anthropic

const provider = ProviderFactory.create({
  provider: 'anthropic',
  model: 'claude-opus-4-6',
  apiKey: process.env.ANTHROPIC_API_KEY,
});

Supported models: claude-opus-4-6, claude-sonnet-4-6, claude-haiku-4-5-20251001, and all other Anthropic models.


Google Gemini

const provider = ProviderFactory.create({
  provider: 'gemini',
  model: 'gemini-2.0-flash',
  apiKey: process.env.GEMINI_API_KEY,
});

Supported models: gemini-2.0-flash, gemini-1.5-pro, gemini-1.5-flash.


Azure OpenAI

const provider = ProviderFactory.create({
  provider: 'azure',
  model: 'gpt-4o',
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  clientConfig: {
    endpoint: process.env.AZURE_OPENAI_ENDPOINT,
    deployment: 'my-gpt4o-deployment',
    apiVersion: '2024-02-01',
  },
});

Ollama (local models)

const provider = ProviderFactory.create({
  provider: 'ollama',
  model: 'llama3.2',
  clientConfig: {
    host: 'http://localhost:11434',
  },
});

💡 Tip

Ollama requires the ollama package and a running Ollama server. Install it with npm install ollama.


Custom providers

Implement the AIProvider interface from @kyro/shared to use any other AI service:

import type { AIProvider } from '@kyro/shared';
 
class MyCustomProvider implements AIProvider {
  async generateText(prompt: string): Promise<string> {
    // Call your API
    const response = await myApi.complete(prompt);
    return response.text;
  }
}
 
const judge = new Judge('./pipeline.yaml', new MyCustomProvider());
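A custom provider is also a convenient way to test pipelines without network calls. The sketch below restates the AIProvider interface locally so it is self-contained; in real code, import it from @kyro/shared as shown above, and the StubProvider name is purely illustrative:

```typescript
// Self-contained sketch: the interface is restated here for illustration;
// in real code, import AIProvider from '@kyro/shared'.
interface AIProvider {
  generateText(prompt: string): Promise<string>;
}

// Deterministic stub provider, useful in unit tests: it never calls an API
// and always returns a predictable reply.
class StubProvider implements AIProvider {
  constructor(private readonly reply: string) {}

  async generateText(prompt: string): Promise<string> {
    return `${this.reply} (prompt length: ${prompt.length})`;
  }
}

async function demo(): Promise<string> {
  const provider: AIProvider = new StubProvider('PASS');
  return provider.generateText('Rate this answer.');
}
```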

Summary

Provider  | Package dependency    | Env variable
OpenAI    | openai                | OPENAI_API_KEY
Anthropic | @anthropic-ai/sdk     | ANTHROPIC_API_KEY
Gemini    | @google/generative-ai | GEMINI_API_KEY
Azure     | openai                | AZURE_OPENAI_API_KEY
Ollama    | ollama                | — (uses host)
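The env variables in the table can be checked before creating a provider. A hypothetical preflight helper (the function and table names here are illustrative, not part of @kyro/shared):

```typescript
// Hypothetical preflight check: maps each provider to the env variable
// from the table above. Ollama needs no key, only a reachable host.
const REQUIRED_ENV: Record<string, string | null> = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  gemini: 'GEMINI_API_KEY',
  azure: 'AZURE_OPENAI_API_KEY',
  ollama: null,
};

// Returns the name of the missing env variable, or null if nothing is missing.
function missingEnv(
  provider: string,
  env: Record<string, string | undefined> = process.env,
): string | null {
  const key = REQUIRED_ENV[provider];
  if (key === undefined) throw new Error(`Unknown provider: ${provider}`);
  if (key === null) return null; // no API key needed (e.g. Ollama)
  return env[key] ? null : key;
}
```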
