
Add OpenRouter AI integration with dynamic model fetching #281

Draft
Copilot wants to merge 3 commits into develop from copilot/add-openrouter-ai-integration

Conversation


Copilot AI commented Jan 13, 2026

Implements OpenRouter AI provider integration, allowing users to dynamically fetch and select from available models instead of being limited to a predefined model list.

Backend Changes

  • includes/functions.php: Added OpenRouter provider config with supports_dynamic_models flag and model_list_endpoint
  • includes/API/API.php:
    • Implemented call_openrouter_api() using OpenAI-compatible format
    • Added REST endpoint GET /wp-json/wp/v2/docs/ai/models/{provider} for model fetching
    • Implemented fetch_openrouter_models() with proper authentication headers (HTTP-Referer, X-Title)
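As a rough illustration of what fetch_openrouter_models() does on the backend, here is the same request sketched in JavaScript (the plugin implements it in PHP with wp_remote_get; the header names follow the OpenRouter docs, and the helper names here are illustrative, not the plugin's actual code):

```javascript
// Sketch: build the auth headers the model-list request needs.
// HTTP-Referer identifies the calling site to OpenRouter; X-Title is
// optional but recommended for OpenRouter's dashboards.
function buildOpenRouterHeaders(apiKey, siteUrl, siteName) {
  return {
    Authorization: `Bearer ${apiKey}`,
    'HTTP-Referer': siteUrl,
    'X-Title': siteName,
  };
}

// Sketch: fetch the available models from OpenRouter's list endpoint.
async function listOpenRouterModels(apiKey, siteUrl, siteName) {
  const response = await fetch('https://openrouter.ai/api/v1/models', {
    headers: buildOpenRouterHeaders(apiKey, siteUrl, siteName),
  });
  if (!response.ok) {
    throw new Error(`OpenRouter model list failed: ${response.status}`);
  }
  const { data } = await response.json();
  return data; // array of model objects ({ id, name, ... })
}
```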

Frontend Changes

  • src/components/Settings/AiSettings.js:
    • Added fetchOpenRouterModels() to call REST endpoint and update provider model list
    • Conditional "Fetch Available Models" button rendered when OpenRouter is selected
    • State management for loading states and fetched models
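The model-list payload OpenRouter returns carries a `data` array of model objects, so the frontend's job after fetching is essentially a mapping step. A minimal sketch of that parsing (the function name and the dropdown option shape are illustrative assumptions, not the plugin's actual code):

```javascript
// Sketch: convert an OpenRouter /api/v1/models payload into the
// { value, label } options a settings dropdown typically expects.
function parseOpenRouterModels(payload) {
  if (!payload || !Array.isArray(payload.data)) {
    return []; // tolerate malformed or empty responses
  }
  return payload.data.map((model) => ({
    value: model.id,               // e.g. "openai/gpt-4o"
    label: model.name || model.id, // fall back to the id if no display name
  }));
}

// Example payload, trimmed to the fields used above.
const sample = {
  data: [
    { id: 'openai/gpt-4o', name: 'GPT-4o' },
    { id: 'anthropic/claude-3.5-sonnet', name: 'Claude 3.5 Sonnet' },
  ],
};
```

Keeping the mapping in a pure function like this makes the loading-state logic easy to test independently of the REST call.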

Usage

// OpenRouter config in wedocs_get_ai_provider_configs()
'openrouter' => [
    'name' => 'OpenRouter',
    'endpoint' => 'https://openrouter.ai/api/v1/chat/completions',
    'models' => [],  // Populated dynamically
    'supports_dynamic_models' => true,
    'model_list_endpoint' => 'https://openrouter.ai/api/v1/models'
]

User workflow: Select OpenRouter → Enter API key → Click "Fetch Available Models" → Select model → Generate AI content.
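The "Fetch Available Models" step boils down to one call against the new REST route. A sketch of how the settings screen might make it (the route comes from this PR's description; the `X-WP-Nonce` header is the usual WordPress REST authentication mechanism and is an assumption here):

```javascript
// Sketch: build the URL for the model-fetch REST route added by this PR.
function buildModelsUrl(baseUrl, provider) {
  return `${baseUrl}/wp-json/wp/v2/docs/ai/models/${encodeURIComponent(provider)}`;
}

// Sketch: call the route from admin JavaScript. In WordPress, the nonce
// is typically provided via wpApiSettings.nonce (an assumption here).
async function fetchModels(baseUrl, provider, nonce) {
  const response = await fetch(buildModelsUrl(baseUrl, provider), {
    headers: { 'X-WP-Nonce': nonce },
  });
  if (!response.ok) {
    throw new Error(`Model fetch failed: ${response.status}`);
  }
  return response.json();
}
```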

Original prompt

Copilot Chat Conversation Export: OpenRouter AI integration implementation guide

User: @iftakharul-islam
Thread URL: https://github.com/copilot/c/54288940-3663-44f1-91fd-3fe0ecc642f2

@iftakharul-islam

I want to add OpenRouter AI integration.
Supporting docs: https://openrouter.ai/docs/quickstart
Users should select from the available models rather than being bound only to predefined models.

Give me a proper implementation guide detailing how we can achieve this in our plugin.

Copilot

I'll help you create a comprehensive implementation guide for integrating OpenRouter AI into the weDevsOfficial/wedocs-plugin. Let me first examine the repository structure and the existing AI integration to understand the current architecture.

Based on my analysis of the weDocs plugin repository, here is a comprehensive implementation guide for adding OpenRouter AI integration. The plugin already has a well-structured AI provider system in place, making it relatively straightforward to add OpenRouter.

📋 Implementation Guide: OpenRouter AI Integration

Current Architecture Overview

Your plugin uses a centralized AI provider system with:

  • PHP Backend: Handles API calls via includes/API/API.php
  • JavaScript Frontend: src/utils/aiService.js for UI interactions
  • Settings Management: React-based settings in src/components/Settings/AiSettings.js
  • Configuration: Centralized configs in includes/functions.php via wedocs_get_ai_provider_configs()

🎯 Step-by-Step Implementation

Step 1: Add OpenRouter Provider Configuration

File: includes/functions.php

Locate the wedocs_get_ai_provider_configs() function (around line 594) and add OpenRouter configuration:

function wedocs_get_ai_provider_configs() {
    $provider_configs = [
        'openai' => [
            // ... existing OpenAI config
        ],
        'anthropic' => [
            // ... existing Anthropic config
        ],
        'google' => [
            // ... existing Google config
        ],
        // ADD THIS NEW OPENROUTER CONFIGURATION
        'openrouter' => [
            'name' => 'OpenRouter',
            'endpoint' => 'https://openrouter.ai/api/v1/chat/completions',
            'models' => [], // Empty initially - will be fetched dynamically
            'requires_key' => true,
            'supports_dynamic_models' => true, // Flag to indicate dynamic model fetching
            'model_list_endpoint' => 'https://openrouter.ai/api/v1/models'
        ]
    ];

    return apply_filters( 'wedocs_ai_provider_configs', $provider_configs );
}

Step 2: Implement OpenRouter API Call Method

File: includes/API/API.php

Add a new method to handle OpenRouter API calls (add after the existing provider methods like call_openai_api, call_anthropic_api, etc.):

/**
 * Call OpenRouter API
 *
 * @since 2.1.16
 *
 * @param string $endpoint API endpoint
 * @param string $api_key API key
 * @param string $model Model to use
 * @param string $prompt User prompt
 * @param string $system_prompt System prompt
 * @param int $max_tokens Max tokens
 * @param float $temperature Temperature
 *
 * @return array Response with content and usage
 * @throws \Exception
 */
private function call_openrouter_api( $endpoint, $api_key, $model, $prompt, $system_prompt, $max_tokens, $temperature ) {
    // Validate API key
    if ( empty( $api_key ) || ! is_string( $api_key ) ) {
        throw new \Exception( __( 'OpenRouter API key is required.', 'wedocs' ) );
    }

    // Prepare request body - OpenRouter uses OpenAI-compatible format
    $body = [
        'model' => $model,
        'messages' => [
            [
                'role' => 'system',
                'content' => $system_prompt
            ],
            [
                'role' => 'user',
                'content' => $prompt
            ]
        ],
        'max_tokens' => $max_tokens,
        'temperature' => $temperature
    ];

    // Make API request
    $response = wp_remote_post(
        $endpoint,
        [
            'timeout' => 60,
            'headers' => [
                'Content-Type' => 'application/json',
                'Authorization' => 'Bearer ' . $api_key,
                'HTTP-Referer' => home_url(), // Required by OpenRouter
                'X-Title' => get_bloginfo( 'name' ) // Optional but recommended
            ],
            'body' => wp_json_encode( $body )
        ]
    );

    // Handle errors
    if ( is_wp_error( $response ) ) {
        throw new \Exception( $response->get_error_message() );
    }

    $response_code = wp_remote_retrieve_response_code( $response );
    $response_body = wp_remote_retrieve_body( $response );
    $data = json_decode( $response_body, true );

    if ( $response_code !== 200 ) {
        $error_message = $data['error']['message'] ?? __( 'OpenRouter API request failed', 'wedocs' );
        throw new \Exception( $error_message );
    }

    // Extract the generated text from the OpenAI-compatible response shape
    // (choices[0].message.content). This completes the truncated export.
    if ( empty( $data['choices'][0]['message']['content'] ) ) {
        throw new \Exception( __( 'OpenRouter API returned an empty response.', 'wedocs' ) );
    }

    return [
        'content' => $data['choices'][0]['message']['content'],
        'usage'   => $data['usage'] ?? [],
    ];
}





coderabbitai bot commented Jan 13, 2026

Important: Review skipped — bot user detected.

To trigger a single review, invoke the @coderabbitai review command. You can disable this status message by setting reviews.review_status to false in the CodeRabbit configuration file. Comment @coderabbitai help to get the list of available commands and usage tips.

Copilot AI and others added 2 commits January 13, 2026 10:13
- Add OpenRouter provider config in wedocs_get_ai_provider_configs()
- Implement call_openrouter_api() method following OpenAI-compatible format
- Add REST endpoint for fetching OpenRouter models dynamically
- Update frontend with model fetching UI and state management
- Add "Fetch Models" button for OpenRouter in settings

Co-authored-by: iftakharul-islam <[email protected]>
Copilot AI changed the title [WIP] Add OpenRouter AI integration for model selection Add OpenRouter AI integration with dynamic model fetching Jan 13, 2026