feat: Add Oracle Cloud Infrastructure (OCI) Generative AI client support#718
feat: Add Oracle Cloud Infrastructure (OCI) Generative AI client support#718fede-kamel wants to merge 19 commits intocohere-ai:mainfrom
Conversation
0e341c1 to
fdebc00
Compare
a284ea8 to
d7c7ef6
Compare
|
@walterbm-cohere @daniel-cohere @billytrend-cohere Hey maintainers, Friendly bump on this PR - would appreciate your feedback when you have a chance. Happy to address any concerns or make changes as needed. Thanks. |
Update: Thinking parameter support + test resultsAdded support for the Models tested against live OCI endpointsLUIGI_FRA_API profile (eu-frankfurt-1):
API_KEY_AUTH profile (us-chicago-1):
Thinking parameter (unit tests, no OCI credentials needed):
Test summaryNote: |
49f92cc to
8b9c63d
Compare
|
Addressed Bugbot feedback:
|
Implements full OCI Generative AI integration following the proven AWS client architecture pattern. Features: - OciClient (v1) and OciClientV2 (v2) for complete API coverage - All authentication methods: config file, direct credentials, instance principal, resource principal - Complete API support: embed, chat, generate, rerank (including streaming variants) - Automatic model name normalization (adds 'cohere.' prefix if needed) - Request/response transformation between Cohere and OCI formats - Comprehensive integration tests with multiple test suites - Full documentation with usage examples Implementation Details: - Uses httpx event hooks for clean request/response interception - Lazy loading of OCI SDK as optional dependency - Follows BedrockClient architecture pattern for consistency - Supports all OCI regions and compartment-based access control Testing: - 40+ integration tests across 5 test suites - Tests all authentication methods - Validates all APIs (embed, chat, generate, rerank, streaming) - Tests multiple Cohere models (embed-v3, light-v3, multilingual-v3, command-r-plus, rerank-v3) - Error handling and edge case coverage Documentation: - Comprehensive docstrings with usage examples - README section with authentication examples - Installation instructions for OCI optional dependency
Updates: - Fixed OCI signer integration to use requests.PreparedRequest - Fixed embed request transformation to only include provided optional fields - Fixed embed response transformation to include proper meta structure with usage/billing info - Fixed test configuration to use OCI_PROFILE environment variable - Updated input_type handling to match OCI API expectations (SEARCH_DOCUMENT vs DOCUMENT) Test Results: - 7/22 tests passing including basic embed functionality - Remaining work: chat, generate, rerank endpoint transformations
- Implemented automatic V1/V2 API detection based on request structure - Added V2 request transformation for messages format - Added V2 response transformation for Command A models - Removed hardcoded region-specific model OCIDs - Now uses display names (e.g., cohere.command-a-03-2025) that work across all OCI regions - V2 chat fully functional with command-a-03-2025 model - Updated tests to use command-a-03-2025 for V2 API testing Test Results: 14 PASSED, 8 SKIPPED, 0 FAILED
- Remove unused imports (base64, hashlib, io, construct_type) - Sort imports according to ruff standards
…issues - Fix OCI pip extras installation by moving from poetry groups to extras - Changed [tool.poetry.group.oci] to [tool.poetry.extras] - This enables 'pip install cohere[oci]' to work correctly - Fix streaming to stop properly after [DONE] signal - Changed 'break' to 'return' in transform_oci_stream_wrapper - Prevents continued chunk processing after stream completion
- Add support for OCI profiles using security_token_file - Load private key properly using oci.signer.load_private_key_from_file - Use SecurityTokenSigner for session-based authentication - This enables use of OCI CLI session tokens for authentication
This commit addresses all copilot feedback and fixes V2 API support: 1. Fixed V2 embed response format - V2 expects embeddings as dict with type keys (float, int8, etc.) - Added is_v2_client parameter to properly detect V2 mode - Updated transform_oci_response_to_cohere to preserve dict structure for V2 2. Fixed V2 streaming format - V2 SDK expects SSE format with "data: " prefix and double newline - Fixed text extraction from OCI V2 events (nested in message.content[0].text) - Added proper content-delta and content-end event types for V2 - Updated transform_oci_stream_wrapper to output correct format based on is_v2 3. Fixed stream [DONE] signal handling - Changed from break to return to stop generator completely - Prevents further chunk processing after [DONE] 4. Added skip decorators with clear explanations - OCI on-demand models don't support multiple embedding types - OCI TEXT_GENERATION models require fine-tuning (not available on-demand) - OCI TEXT_RERANK models require fine-tuning (not available on-demand) 5. Added comprehensive V2 tests - test_embed_v2 with embedding dimension validation - test_embed_with_model_prefix_v2 - test_chat_v2 - test_chat_stream_v2 with text extraction validation All 17 tests now pass with 7 properly documented skips.
- Add comprehensive limitations section to README explaining what's available on OCI on-demand inference vs. what requires fine-tuning - Improve OciClient and OciClientV2 docstrings with: - Clear list of supported APIs - Notes about generate/rerank limitations - V2-specific examples showing dict-based embedding responses - Add checkmarks and clear categorization of available vs. unavailable features - Link to official OCI Generative AI documentation for latest model info
…sion
This commit fixes two issues identified in PR review:
1. V2 response detection overriding passed parameter
- Previously: transform_oci_response_to_cohere() would re-detect V2 from
OCI response apiFormat field, overriding the is_v2 parameter
- Now: Uses the is_v2 parameter passed in (determined from client type)
- Why: The client type (OciClient vs OciClientV2) already determines the
API version, and re-detecting can cause inconsistency
2. Security token file path not expanded before opening
- Previously: Paths like ~/.oci/token would fail because Python's open()
doesn't expand tilde (~) characters
- Now: Uses os.path.expanduser() to expand ~ to user's home directory
- Why: OCI config files commonly use ~ notation for paths
Both fixes maintain backward compatibility and all 17 tests continue to pass.
- Fix authentication priority to prefer API key auth over session-based - Transform V2 content list items type field to uppercase for OCI format - Remove debug logging statements All tests passing (17 passed, 7 skipped as expected)
Support the thinking/reasoning feature for command-a-reasoning-08-2025 on OCI. Transforms Cohere's thinking parameter (type, token_budget) to OCI format and handles thinking content in both non-streaming and streaming responses.
- Remove unused response_mapping and stream_response_mapping dicts - Remove unused transform_oci_stream_response function - Remove unused imports (EmbedResponse, Generation, etc.) - Fix crash when thinking parameter is explicitly None - Fix V2 chat response role not lowercased (ASSISTANT -> assistant) - Fix V2 finish_reason incorrectly lowercased (should stay uppercase) - Add unit tests for thinking=None, role lowercase, and finish_reason
- Fix thinking token_budget → tokenBudget (camelCase for OCI API) - Add V2 response toolCalls → tool_calls conversion for SDK compatibility - Update test for tokenBudget casing - Add test for tool_calls conversion
OCI doesn't provide a generation ID in responses. Previously used modelId which is the model name (e.g. 'cohere.command-r-08-2024'), not a unique generation identifier. Now generates a proper UUID.
- Add validation for direct credentials (user_id requires fingerprint and tenancy_id) - Emit message-end event for V2 streaming before [DONE]
3bf4c54 to
b78c63e
Compare
|
@billytrend-cohere @mkozakov @sanderland @abdullahkady — would appreciate a review on this when you have a moment. This PR has been rebased on the latest What this adds: Oracle Cloud Infrastructure (OCI) Generative AI client support, following the same architectural pattern as the existing Testing: 14 integration tests passing against the OCI Generative AI service. Tested with Command R, Command A, and all embed v3 models. This would bring OCI to parity with the existing Bedrock integration and benefit enterprise customers running Cohere models on Oracle Cloud. Happy to address any feedback. Thank you. |
| signer = oci.auth.signers.SecurityTokenSigner( | ||
| token=security_token, | ||
| private_key=private_key, | ||
| ) |
There was a problem hiding this comment.
Session-based auth broken by wrong signer priority
High Severity
OCI session-based config profiles (created via oci session authenticate) contain both user and security_token_file fields. Since "user" in oci_config is checked before "security_token_file" in oci_config, session-based profiles are always incorrectly handled as standard API key auth. The SecurityTokenSigner branch is unreachable for config-file-loaded profiles, causing authentication failures for session-based users because the session key's fingerprint isn't registered as a permanent API key.
| message_end_event = {"type": "message-end"} | ||
| yield b"data: " + json.dumps(message_end_event).encode("utf-8") + b"\n\n" | ||
| # Return to stop the generator completely | ||
| return |
There was a problem hiding this comment.
V1 streaming never emits required stream-end event
Medium Severity
For V1 streaming, when [DONE] is received, the generator just returns without emitting a stream-end event. The V1 StreamedChatResponse protocol expects a final stream-end event containing finish_reason and the complete NonStreamedChatResponse. Any consumer code checking for event_type == "stream-end" to obtain the finish reason or full response will silently receive nothing.
| # Emit message-end event for V2 before stopping | ||
| if is_v2: | ||
| message_end_event = {"type": "message-end"} | ||
| yield b"data: " + json.dumps(message_end_event).encode("utf-8") + b"\n\n" |
There was a problem hiding this comment.
V2 streaming message-end lacks finish_reason and usage
Medium Severity
The V2 message-end event is emitted as {"type": "message-end"} without a delta field. The SDK's ChatMessageEndEventDelta type carries finish_reason and usage, which are the standard way for V2 streaming consumers to get completion metadata. The finishReason from the OCI event is silently discarded when it's transformed into a bare content-end event at line 1008.
| "content": delta_content, | ||
| } | ||
| }, | ||
| } |
There was a problem hiding this comment.
V2 streaming omits message-start and content-start events
Medium Severity
The V2 stream transformation only emits content-delta, content-end, and message-end events. The V2 protocol expects message-start (carrying the message id) and content-start events before any content-delta. Consumers relying on message-start to obtain the response ID or on content-start for content lifecycle tracking will not receive these events.
| } | ||
|
|
||
| action = action_map.get(endpoint, endpoint) | ||
| return f"{base}/{api_version}/actions/{action}" |
There was a problem hiding this comment.
Unused stream parameter in get_oci_url function
Low Severity
The get_oci_url function accepts a stream parameter that is never referenced in the function body. The caller computes and passes the stream state, but it has zero effect on the generated URL. This dead parameter is misleading — it suggests the URL varies by streaming mode when it doesn't.
|
@sanderland Thanks for the approvals on this PR and the others (#717, #698, #697)! What are the next steps to get these merged? |


Overview
I noticed that the Cohere Python SDK has excellent integration with AWS Bedrock through the
BedrockClientimplementation. I wanted to contribute a similar integration for Oracle Cloud Infrastructure (OCI) Generative AI service to provide our customers with the same seamless experience.Motivation
Oracle Cloud Infrastructure offers Cohere's models through our Generative AI service, and many of our enterprise customers use both platforms. This integration follows the same architectural pattern as the existing Bedrock client, ensuring consistency and maintainability.
Implementation
This PR adds comprehensive OCI support with:
Features
~/.oci/config)Architecture
Testing
Documentation
Files Changed
src/cohere/oci_client.py(910 lines) - Main OCI client implementationsrc/cohere/manually_maintained/lazy_oci_deps.py(30 lines) - Lazy OCI SDK loadingtests/test_oci_client.py(393 lines) - Comprehensive integration testsREADME.md- OCI usage documentationpyproject.toml- Optional OCI dependencysrc/cohere/__init__.py- Export OciClient and OciClientV2Test Results
Skipped tests are for OCI service limitations (base models not callable via on-demand inference).
Breaking Changes
None. This is a purely additive feature.
Checklist
Note
Medium Risk
Large additive surface area that includes request rewriting, signing, and streaming transformation logic; mistakes could cause subtle runtime/compatibility issues for OCI users, but existing non-OCI clients are largely untouched.
Overview
Adds first-class Oracle Cloud Infrastructure support via new
OciClient(V1) andOciClientV2(V2), wiring requests through httpx event hooks that rewrite Cohere API calls into OCI Generative AI endpoints, sign them with OCI auth, and map OCI responses/streams back into Cohere response shapes.Introduces optional
ocidependency loading (cohere[oci]) plus extensive tests (integration + transformation unit tests) covering auth modes, embed/chat/stream behavior, and V2-specific features likethinkingand tool-call field conversions. Updates the README with OCI installation, auth examples, supported APIs, and service limitations, and exports the new clients fromcohere.__init__.Written by Cursor Bugbot for commit b78c63e. This will update automatically on new commits. Configure here.