A comprehensive React component library built on top of AI SDK 5 Alpha for creating intelligent, conversational interfaces with enterprise-grade features.
# Using pnpm (recommended)
pnpm add @conciergus/chat ai
# Using npm
npm install @conciergus/chat ai
# Using yarn
yarn add @conciergus/chat ai
Note: This library requires AI SDK 5 Alpha (ai@^5.0.0-alpha) as a peer dependency.
Wrap your application with the ConciergusProvider:
import { ConciergusProvider } from '@conciergus/chat';
import { createAnthropic } from '@ai-sdk/anthropic';
const anthropic = createAnthropic({
apiKey: process.env.ANTHROPIC_API_KEY,
});
function App() {
return (
<ConciergusProvider
config={{
model: anthropic('claude-3-5-sonnet-20241022'),
apiUrl: '/api/chat',
enableTelemetry: true,
}}
>
<YourApp />
</ConciergusProvider>
);
}
import { ConciergusChatWidget } from '@conciergus/chat';
function ChatPage() {
return (
<div className="h-screen">
<ConciergusChatWidget
title="AI Assistant"
placeholder="Ask me anything..."
showMetadata={true}
enableVoice={true}
/>
</div>
);
}
Create an API route for chat streaming (Next.js example):
// app/api/chat/route.ts
import { createAnthropic } from '@ai-sdk/anthropic';
import { streamText } from 'ai';
const anthropic = createAnthropic({
apiKey: process.env.ANTHROPIC_API_KEY,
});
export async function POST(req: Request) {
const { messages } = await req.json();
const result = await streamText({
model: anthropic('claude-3-5-sonnet-20241022'),
messages,
temperature: 0.7,
maxTokens: 1000,
});
return result.toDataStreamResponse();
}
For enterprise deployments with multiple AI providers, configure an AI gateway with automatic fallback and cost optimization:
import { ConciergusProvider, createAIGateway } from '@conciergus/chat';
// `anthropic` and `openai` are the default provider instances; they read
// ANTHROPIC_API_KEY and OPENAI_API_KEY from the environment.
import { anthropic } from '@ai-sdk/anthropic';
import { openai } from '@ai-sdk/openai';
const gateway = createAIGateway({
providers: [
{
name: 'anthropic',
model: anthropic('claude-3-5-sonnet-20241022'),
priority: 1,
},
{
name: 'openai',
model: openai('gpt-4-turbo'),
priority: 2,
},
],
fallbackStrategy: 'waterfall',
costOptimization: true,
});
<ConciergusProvider
config={{
gateway,
enableTelemetry: true,
telemetryConfig: {
endpoint: '/api/telemetry',
batchSize: 100,
},
}}
>
<App />
</ConciergusProvider>
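The waterfall strategy tries providers in priority order (lowest number first) and falls through to the next provider on failure. Conceptually it behaves like the sketch below; this is an illustration of the idea, not the library's actual gateway internals:

```ts
// Conceptual illustration of waterfall fallback -- not the actual
// @conciergus/chat gateway implementation.
type ProviderEntry = {
  name: string;
  priority: number;
  call: () => Promise<Response>;
};

async function waterfall(providers: ProviderEntry[]): Promise<Response> {
  // Try providers from lowest priority number to highest.
  const ordered = [...providers].sort((a, b) => a.priority - b.priority);
  let lastError: unknown;

  for (const provider of ordered) {
    try {
      return await provider.call();
    } catch (error) {
      // Remember the failure and fall through to the next provider.
      lastError = error;
    }
  }

  throw lastError;
}
```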
Use Conciergus hooks for fine-grained control:
import { useConciergusChat, useConciergusAgent } from '@conciergus/chat';
function CustomChatComponent() {
const { messages, append, isLoading, metadata } = useConciergusChat({
api: '/api/chat',
initialMessages: [
{ role: 'assistant', content: 'Hello! How can I help you today?' }
],
});
const {
agent,
executeStep,
isRunning,
steps
} = useConciergusAgent({
model: 'claude-3-5-sonnet-20241022',
tools: {
// Your tools here
},
});
return (
<div>
{/* Custom chat interface */}
</div>
);
}
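To send user input, wire `append` to an input field. A minimal sketch, assuming `append` accepts the same `{ role, content }` shape used in `initialMessages` and that each message exposes `content` as a string:

```tsx
import { useState } from 'react';
import { useConciergusChat } from '@conciergus/chat';

function MinimalChat() {
  const [input, setInput] = useState('');
  const { messages, append, isLoading } = useConciergusChat({ api: '/api/chat' });

  const send = async () => {
    if (!input.trim()) return;
    // Append a user message; the assistant's reply streams into `messages`.
    await append({ role: 'user', content: input });
    setInput('');
  };

  return (
    <div>
      {messages.map((message, index) => (
        <p key={index}>
          <strong>{message.role}:</strong> {message.content}
        </p>
      ))}
      <input
        value={input}
        onChange={(event) => setInput(event.target.value)}
        onKeyDown={(event) => event.key === 'Enter' && send()}
        disabled={isLoading}
        placeholder="Type a message..."
      />
    </div>
  );
}
```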
Available components:

- ConciergusProvider: Root provider with AI SDK 5 configuration
- ConciergusChatWidget: Complete chat interface with all features
- ConciergusMessageList: Message display with metadata and rich content
- ConciergusMessageItem: Individual message rendering with Markdown support
- ConciergusObjectStream: Real-time structured object streaming
- ConciergusAgentControls: Visual controls for AI agents
- ConciergusMetadataDisplay: Telemetry and performance metrics
- ConciergusVoiceRecorder: Voice input with speech recognition
- ConciergusAudioPlayer: Audio playback for TTS responses
- ConciergusTelemetryDashboard: Real-time monitoring dashboard
- ConciergusModelSwitcher: Dynamic model selection interface
- ConciergusCostTracker: Usage and cost monitoring
- ConciergusDebugPanel: Development and debugging tools

Enable voice capabilities:
import { ConciergusVoiceRecorder, ConciergusAudioPlayer } from '@conciergus/chat';
function VoiceChat() {
return (
<div>
<ConciergusVoiceRecorder
onTranscription={(text) => console.log('Transcribed:', text)}
languages={['en-US', 'es-ES', 'fr-FR']}
enableNoiseReduction={true}
/>
<ConciergusAudioPlayer
src="/api/tts"
autoPlay={true}
showControls={true}
/>
</div>
);
}
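To route transcriptions into a conversation, the recorder's `onTranscription` callback can hand text to the chat hook's `append`; a small sketch combining the two APIs shown above:

```tsx
import { ConciergusVoiceRecorder, useConciergusChat } from '@conciergus/chat';

function VoiceToChat() {
  const { append } = useConciergusChat({ api: '/api/chat' });

  return (
    <ConciergusVoiceRecorder
      // Send each transcription to the assistant as a user message.
      onTranscription={(text) => append({ role: 'user', content: text })}
      languages={['en-US']}
    />
  );
}
```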
Enable comprehensive monitoring:
const config = {
enableTelemetry: true,
telemetryConfig: {
endpoint: '/api/telemetry',
metrics: ['latency', 'tokens', 'cost', 'errors'],
enableOpenTelemetry: true,
exporters: ['jaeger', 'datadog'],
},
debug: process.env.NODE_ENV === 'development',
};
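The telemetry `endpoint` receives batched events over POST (up to `batchSize` per request). A minimal Next.js handler might look like the sketch below; the exact payload shape is determined by the library's exporter, so the `events` field here is an assumption:

```ts
// app/api/telemetry/route.ts
export async function POST(req: Request) {
  // Assumes batches arrive as JSON with an `events` array.
  const { events } = await req.json();

  // Forward to your observability backend, or log during development.
  console.log(`Received ${events?.length ?? 0} telemetry events`);

  return new Response(null, { status: 204 });
}
```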
If you're upgrading from AI SDK 4.x:
1. Update Dependencies:
pnpm add ai@^5.0.0-alpha @conciergus/chat
2. Update Import Paths:
// Before (AI SDK 4.x)
import { useChat } from 'ai/react';
// After (AI SDK 5 + Conciergus)
import { useConciergusChat } from '@conciergus/chat';
3. Update Configuration:
// Before
const { messages, append } = useChat({ api: '/api/chat' });
// After
const { messages, append, metadata } = useConciergusChat({
api: '/api/chat',
enableTelemetry: true,
});
See our Migration Guide for detailed instructions.
# Clone the repository
git clone https://github.com/your-org/conciergus-ai.git
cd conciergus-ai
# Install dependencies
pnpm install
# Build the library
pnpm run build
# Run tests
pnpm test
# Start development mode
pnpm run dev
# Run all tests
pnpm test
# Run tests in watch mode
pnpm test:watch
# Run tests with coverage
pnpm test:coverage
We welcome contributions! Here's how to get started:
# 1. Fork and clone the repository
git clone https://github.com/your-username/conciergus.ai.git
cd conciergus.ai
# 2. Install dependencies
pnpm install
# 3. Set up environment variables
cp .env.example .env.local
# Edit .env.local with your API keys
# 4. Build and test
pnpm run build
pnpm test
# 5. Start development
pnpm run dev
MIT License - see LICENSE for details.
Built with ❤️ using AI SDK 5 Alpha and React.