LLM-Rosetta: Zero-Dep API Translator for OpenAI, Anthropic, Google and Streaming
#anthropic
#gemini
#llama
#openai
#machine-learning/research
Source: hackernews · Summarized and analyzed by Genesis Park
Summary
LLM-Rosetta is a Python library that converts between the API formats of different LLM providers such as OpenAI and Anthropic, using a hub-and-spoke architecture with a central Intermediate Representation (IR). Each provider needs only a single converter to interoperate with every other supported API, and the library works out of the box with servers such as Ollama and HuggingFace TGI. It ships alongside ToolRegistry-Hub, an MCP tool server that provides web search, a calculator, and more. The project is MIT-licensed, with details available on GitHub.
Article
LLM-Rosetta is a Python library for converting between different LLM provider API formats using a hub-and-spoke architecture with a central IR (Intermediate Representation).

Full documentation is available at:

- English: https://llm-rosetta.readthedocs.io/en/latest/
- 中文: https://llm-rosetta.readthedocs.io/zh-cn/latest/

When building applications that work with multiple LLM providers, you face an N² conversion problem: every provider pair requires its own conversion logic. LLM-Rosetta solves this with a hub-and-spoke approach, where each provider needs only a single converter to and from the shared IR format.

```
Provider A ──→ IR ──→ Provider B
Provider C ──→ IR ──→ Provider D
... and so on
```

| Provider | API Standard | Request | Response | Streaming |
|---|---|---|---|---|
| OpenAI | Chat Completions | ✅ | ✅ | ✅ |
| OpenAI | Responses API | ✅ | ✅ | ✅ |
| Anthropic | Messages API | ✅ | ✅ | ✅ |
| Google | GenAI API | ✅ | ✅ | ✅ |

LLM-Rosetta works out of the box with any server that exposes OpenAI-compatible endpoints. Ollama (v0.13+) is a good example: it supports three of the four API formats that LLM-Rosetta converts between.

| Ollama Endpoint | LLM-Rosetta Converter | Since |
|---|---|---|
| /v1/chat/completions | openai_chat | Early versions |
| /v1/responses | openai_responses | v0.13.3 |
| /v1/messages | anthropic | v0.14.0 |

Other compatible servers include HuggingFace TGI, vLLM, and LM Studio.
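To see why the hub-and-spoke design reduces N² pairwise converters to 2N, here is a minimal toy sketch of the idea in plain Python. This is an illustration only, not LLM-Rosetta's actual API: the `IRMessage` class and both converter functions are hypothetical, though the OpenAI-style and Anthropic-style message shapes they translate are the providers' real formats.

```python
from dataclasses import dataclass

# Toy IR: one message with a role and plain-text content.
@dataclass
class IRMessage:
    role: str
    text: str

# Each provider contributes only two functions (to IR, from IR),
# so N providers need 2*N converters instead of N*(N-1) pairwise ones.

def openai_to_ir(msg: dict) -> IRMessage:
    # OpenAI chat messages look like {"role": ..., "content": "..."}
    return IRMessage(role=msg["role"], text=msg["content"])

def ir_to_anthropic(msg: IRMessage) -> dict:
    # Anthropic Messages carry content as a list of typed blocks.
    return {"role": msg.role, "content": [{"type": "text", "text": msg.text}]}

openai_msg = {"role": "user", "content": "Hello!"}
anthropic_msg = ir_to_anthropic(openai_to_ir(openai_msg))
print(anthropic_msg)
# {'role': 'user', 'content': [{'type': 'text', 'text': 'Hello!'}]}
```

Adding a new provider to a scheme like this means writing two small functions, with no changes to any existing converter.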
Features:

- Unified IR format for messages, tool calls, and content parts
- Bidirectional conversion: requests to provider format, responses from provider format
- Streaming support with typed stream events
- Auto-detection of provider from request/response objects
- Support for text, images, tool calls, and tool results
- Zero required dependencies (only typing_extensions); provider SDKs are optional

Install the core package (requires Python >= 3.8):

```shell
pip install llm-rosetta

# Individual providers
pip install llm-rosetta[openai]
pip install llm-rosetta[anthropic]
pip install llm-rosetta[google]

# All providers
pip install llm-rosetta[openai,anthropic,google]
```

| Extra | Packages | Description |
|---|---|---|
| openai | openai | OpenAI Chat Completions & Responses API |
| anthropic | anthropic | Anthropic Messages API |
| google | google-genai | Google GenAI API |

Basic conversion between providers:

```python
from llm_rosetta import OpenAIChatConverter, AnthropicConverter

# Create converters
openai_conv = OpenAIChatConverter()
anthropic_conv = AnthropicConverter()

# Convert an OpenAI response to IR, then to Anthropic format
ir_messages = openai_conv.response_from_provider(openai_response)
anthropic_request = anthropic_conv.request_to_provider(ir_messages)
```

Automatic provider detection:

```python
from llm_rosetta import convert, detect_provider

# Automatically detect provider and convert
provider = detect_provider(some_response)
ir_messages = convert(some_response, direction="from_provider")
```

Carrying one conversation across providers via the shared IR history:

```python
from llm_rosetta import OpenAIChatConverter, GoogleGenAIConverter
from llm_rosetta.types.ir import Message, ContentPart

openai_conv = OpenAIChatConverter()
google_conv = GoogleGenAIConverter()

# Shared IR message history
ir_messages = []

# Turn 1: Ask OpenAI (openai_client is an OpenAI SDK client)
ir_messages.append(Message(role="user", content=[ContentPart(type="text", text="Hello!")]))
openai_request = openai_conv.request_to_provider({"messages": ir_messages})
openai_response = openai_client.chat.completions.create(**openai_request)
ir_messages.extend(openai_conv.response_from_provider(openai_response))

# Turn 2: Continue with Google, with the full context preserved
google_request = google_conv.request_to_provider({"messages": ir_messages})
```

Related projects:

- ToolRegistry: a lightweight Python framework for managing and dynamically registering tools with LLM integration support.
- ToolRegistry-Hub: a ready-to-use MCP tool server built on ToolRegistry, providing web search, calculator, datetime, and more out of the box.

If you use LLM-Rosetta in your research, please cite our paper:

```bibtex
@article{ding2026llmrosetta,
  title={LLM-Rosetta: A Hub-and-Spoke Intermediate Representation for Cross-Provider LLM API Translation},
  author={Ding, Peng},
  journal={arXiv preprint arXiv:2604.09360},
  year={2026}
}
```

Contributions are welcome! Please visit the GitHub repository to get started.

This project is licensed under the MIT License; see the LICENSE file for details.
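As a closing note on the auto-detection feature mentioned above: each provider's response payload has distinctive top-level keys, so a structural detector can be sketched in a few lines. This is a toy heuristic, not how `llm_rosetta.detect_provider` is actually implemented, though the keys it checks (OpenAI `choices`, OpenAI Responses `output`, Anthropic `stop_reason`, Google `candidates`) are real fields in those providers' response formats.

```python
def detect_provider_shape(response: dict) -> str:
    """Guess the source API from well-known top-level response keys.

    Illustrative heuristic only; not the library's implementation.
    """
    if "choices" in response:        # OpenAI Chat Completions responses
        return "openai_chat"
    if "output" in response:         # OpenAI Responses API responses
        return "openai_responses"
    if "stop_reason" in response:    # Anthropic Messages API responses
        return "anthropic"
    if "candidates" in response:     # Google GenAI API responses
        return "google"
    return "unknown"

print(detect_provider_shape({"choices": [], "model": "gpt-4o"}))  # openai_chat
print(detect_provider_shape({"stop_reason": "end_turn"}))         # anthropic
```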
This analysis was written by the Genesis Park editorial team with the help of AI. The original can be found via the source link.