Add Cerebras [WIP] #476
Conversation
Qwen-Coder attempt at adding a provider. Zero human edits (will review after). 3.1M input tokens, 11k output tokens. Used Cline.

Prompt:

```
I want to add Cerebras as a provider to Kiln.
- Add it in the ML model list as a provider (this will cause type errors where we need to write code). Run the type checker to find the issues (if you need to use the terminal and can't see them automatically, run `uv run pyright .`).
- It should be modeled like Ollama: a custom OpenAI-compatible-endpoint based router. See adapter_registry.py. Use its OpenAI compatible endpoint `https://api.cerebras.ai/v1`.
- Run generate_schema.sh to update our frontend APIs to add the new provider. This will cause type errors you need to resolve on the front end.
- Add the necessary UI in connect_provider and other places.
```
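For reference, the first prompt step amounts to adding one member to the `ModelProviderName` enum that all of the diffs below match on. The sketch is hypothetical: the enum's base class and string values are not shown on this page, and only the member name `cerebras` is confirmed by the diffs. Adding the member is what triggers the pyright exhaustiveness errors the prompt anticipates.

```python
# Hypothetical sketch of the enum change; base class and values are assumptions.
from enum import Enum


class ModelProviderName(str, Enum):
    # ...existing providers elided...
    together_ai = "together_ai"
    cerebras = "cerebras"  # new provider added by this PR
```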
📊 Coverage Report
Overall Coverage: 91%
Diff: origin/main...HEAD
Summary
Line-by-line
View line-by-line diff coverage (lines marked `!` are flagged by the diff coverage check).

app/desktop/studio_server/provider_api.py

Lines 296-305

```
  296             parse_api_field(key_data, "Project Location"),
  297         )
  298     case ModelProviderName.together_ai:
  299         return await connect_together(parse_api_key(key_data))
! 300     case ModelProviderName.cerebras:
! 301         return await connect_cerebras(parse_api_key(key_data))
  302     case (
  303         ModelProviderName.kiln_custom_registry
  304         | ModelProviderName.kiln_fine_tune
  305         | ModelProviderName.openai_compatible
```

Lines 354-362

```
  354         Config.shared().vertex_location = None
  355     case ModelProviderName.together_ai:
  356         Config.shared().together_api_key = None
  357     case ModelProviderName.cerebras:
! 358         Config.shared().cerebras_api_key = None
  359     case (
  360         ModelProviderName.kiln_custom_registry
  361         | ModelProviderName.kiln_fine_tune
  362         | ModelProviderName.openai_compatible
```

Lines 792-825

```
  792 )
  793
  794
  795 async def connect_cerebras(key: str):
! 796     try:
! 797         headers = {
  798             "Authorization": f"Bearer {key}",
  799             "Content-Type": "application/json",
  800         }
! 801         response = requests.get("https://api.cerebras.ai/v1/models", headers=headers)
  802
! 803         if response.status_code == 401:
! 804             return JSONResponse(
  805                 status_code=401,
  806                 content={"message": "Failed to connect to Cerebras. Invalid API key."},
  807             )
! 808         elif response.status_code != 200:
! 809             return JSONResponse(
  810                 status_code=400,
  811                 content={
  812                     "message": f"Failed to connect to Cerebras. Error: [{response.status_code}]"
  813                 },
  814             )
  815         else:
! 816             Config.shared().cerebras_api_key = key
! 817             return JSONResponse(
  818                 status_code=200,
  819                 content={"message": "Connected to Cerebras"},
  820             )
! 821     except Exception as e:
! 822         return JSONResponse(
  823             status_code=400,
  824             content={"message": f"Failed to connect to Cerebras. Error: {str(e)}"},
  825         )
```
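Every `connect_cerebras` line above is flagged by the coverage diff. A test sketch along the following lines could exercise both the invalid-key and happy paths; it assumes pytest-asyncio is available, takes the module path from the file path above, and patches `requests.get` and `Config.shared` where `provider_api` uses them. Test names and mocks are illustrative, not part of this PR.

```python
# Illustrative tests for connect_cerebras (assumes pytest-asyncio; patch targets
# follow the imports visible in the provider_api.py excerpt above).
from unittest.mock import MagicMock, patch

import pytest

from app.desktop.studio_server.provider_api import connect_cerebras


@pytest.mark.asyncio
async def test_connect_cerebras_invalid_key_returns_401():
    mock_response = MagicMock(status_code=401)
    with patch(
        "app.desktop.studio_server.provider_api.requests.get",
        return_value=mock_response,
    ) as mock_get:
        result = await connect_cerebras("bad-key")

    mock_get.assert_called_once()
    assert result.status_code == 401


@pytest.mark.asyncio
async def test_connect_cerebras_valid_key_saves_config():
    mock_response = MagicMock(status_code=200)
    mock_config = MagicMock()
    with (
        patch(
            "app.desktop.studio_server.provider_api.requests.get",
            return_value=mock_response,
        ),
        patch(
            "app.desktop.studio_server.provider_api.Config.shared",
            return_value=mock_config,
        ),
    ):
        result = await connect_cerebras("good-key")

    assert result.status_code == 200
    assert mock_config.cerebras_api_key == "good-key"
```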
libs/core/kiln_ai/adapters/adapter_registry.py

Lines 185-194

```
  185                     "api_key": Config.shared().huggingface_api_key,
  186                 },
  187             ),
  188         )
! 189     case ModelProviderName.cerebras:
! 190         return LiteLlmAdapter(
  191             kiln_task=kiln_task,
  192             base_adapter_config=base_adapter_config,
  193             config=LiteLlmConfig(
  194                 run_config_properties=run_config_properties,
```
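The excerpt cuts off mid-`LiteLlmConfig`. Purely for illustration (the remaining lines are not shown in the coverage diff, so the PR's actual fields may differ), a Cerebras case that follows the HuggingFace pattern visible just above would pass the stored key through `additional_config`:

```python
# Illustrative completion of the cerebras case, modeled on the huggingface case
# above -- not a quote of the PR's actual lines.
case ModelProviderName.cerebras:
    return LiteLlmAdapter(
        kiln_task=kiln_task,
        base_adapter_config=base_adapter_config,
        config=LiteLlmConfig(
            run_config_properties=run_config_properties,
            additional_config={
                # Key saved by connect_cerebras in provider_api.py
                "api_key": Config.shared().cerebras_api_key,
            },
        ),
    )
```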
libs/core/kiln_ai/adapters/model_adapters/litellm_adapter.py

Lines 330-338

```
  330         litellm_provider_name = "vertex_ai"
  331     case ModelProviderName.together_ai:
  332         litellm_provider_name = "together_ai"
  333     case ModelProviderName.cerebras:
! 334         litellm_provider_name = "cerebras"
  335     case ModelProviderName.openai_compatible:
  336         is_custom = True
  337     case ModelProviderName.kiln_custom_registry:
  338         is_custom = True
```
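Since `is_custom` is not set for the Cerebras branch, the adapter leans on LiteLLM's built-in Cerebras routing (the `cerebras/` model prefix) rather than the custom OpenAI-compatible route the prompt asked for, which is worth confirming in review. A minimal standalone check of that routing might look like the following; the model id is illustrative and this assumes an installed LiteLLM version that ships the Cerebras provider.

```python
# Standalone sanity check of LiteLLM's "cerebras/" routing (model id illustrative).
import os

import litellm

response = litellm.completion(
    model="cerebras/llama3.1-8b",
    messages=[{"role": "user", "content": "Say hello in one word."}],
    api_key=os.environ["CEREBRAS_API_KEY"],
)
print(response.choices[0].message.content)
```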
libs/core/kiln_ai/adapters/provider_tools.py

Lines 383-392

```
  383     case ModelProviderName.vertex:
  384         return "Google Vertex AI"
  385     case ModelProviderName.together_ai:
  386         return "Together AI"
! 387     case ModelProviderName.cerebras:
! 388         return "Cerebras"
  389     case _:
  390         # triggers pyright warning if I miss a case
  391         raise_exhaustive_enum_error(enum_id)
```
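The `case _:` branch above is an exhaustiveness guard: `raise_exhaustive_enum_error` is Kiln's own helper (its signature isn't shown here), but the general pattern is a function whose parameter is typed as `NoReturn`, so pyright errors as soon as a new `ModelProviderName` member is left unhandled. A generic sketch with stand-in names:

```python
# Generic exhaustiveness-check pattern; Color and describe are stand-ins, not Kiln code.
from enum import Enum
from typing import NoReturn


class Color(str, Enum):
    RED = "red"
    BLUE = "blue"


def raise_exhaustive_enum_error(value: NoReturn) -> NoReturn:
    # Only fires at runtime if an unhandled member actually slips through.
    raise ValueError(f"Unhandled enum case: {value}")


def describe(color: Color) -> str:
    match color:
        case Color.RED:
            return "warm"
        case Color.BLUE:
            return "cool"
        case _:
            # When the match is exhaustive, pyright narrows `color` to Never here;
            # a new Color member without its own case turns this call into a type error.
            raise_exhaustive_enum_error(color)
```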
What does this PR do?
Related Issues
Contributor License Agreement
I, @, confirm that I have read and agree to the Contributors License Agreement.
Checklists