Show HN: Build queryable packs from videos, podcasts, and files for AI agents

hackernews | 📦 Open Source
#ai #ai-deal #ai-agent #claude #llama #mcp #data-processing #local-model
Source: hackernews · Summarized and analyzed by Genesis Park

Summary

beyin is an engine that builds searchable knowledge packs locally from the sources you already use: videos, podcasts, articles, and local files. All processing, embedding, and storage happen on the user's own machine, and it also supports direct querying through Ollama models in fully offline environments. Built packs can be reused at any time through MCP-compatible clients such as Claude Code and Cursor, or through the CLI, so your data is not tied to a particular agent or session. It also ships a multilingual embedding model covering 50+ languages and automatic query expansion, substantially improving an AI agent's retrieval quality and day-to-day productivity.

Full Text

beyin also means “brain” in Turkish. Build local, queryable packs from videos, articles, podcasts, and local files. Query them through MCP with your AI agent, or explore them directly with a local model.

beyin is a local-first engine for building reusable knowledge packs from the sources you already use. Your packs stay local and reusable across MCP clients, the CLI, and fully local workflows with Ollama, so your context is not trapped inside a single tool or session.

- Quick demo (beyin-main-demo.mp4): ask questions through MCP and retrieve answers from your local packs without leaving your normal workflow.
- Setup via CLI (beyin-setup-demo.mp4): choose the source types, enable local Ollama and local file workflows, and pick the Whisper model that fits your transcription needs.
- End-to-end workflow (beyin-flow-1.mp4): create a pack in the CLI, build it, connect it to your MCP agent, and query it naturally during your normal workflow.

Note: Some source material used in these demos is from Adam Lyttle’s YouTube playlist, processed as demo input, and is licensed under CC BY 4.0.

- 🔗 MCP compatible: works with Claude Code, Codex, Cursor, Windsurf, Zed, and more
- 🔄 Cross-agent packs: build once, then use the same local packs from any MCP-compatible agent or the CLI
- 📦 Local-first pipeline: processing, embedding, and storage all happen on your machine
- 🎬 Rich source support: YouTube videos and playlists, podcasts, PDFs, articles, local files
- 🌍 50+ languages: multilingual embedding model out of the box
- 🤖 Ollama support: run fully offline with a local model
- ⚡ Plug and play: one command to connect via MCP, then manage everything by just talking to your agent
- 🎯 Multi-query expansion: generates query variants automatically for better retrieval

The recommended way to use beyin is through MCP with the AI agent you already use. Your packs are local to your machine, not tied to one agent session or vendor.
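The pack-and-query model described above can be sketched in a few lines: chunks are embedded once at build time and stored locally, then every question is embedded and matched by similarity. This is an illustrative sketch, not beyin's actual code; the toy hashed bag-of-words embedding stands in for the real multilingual embedding model, and the `Pack` class is a hypothetical stand-in for beyin's pack storage.

```python
import hashlib
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hash each word into one of `dim` buckets and count.
    Stands in for the real multilingual embedding model."""
    vec = [0.0] * dim
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

class Pack:
    """A minimal 'knowledge pack': chunks embedded once, queried many times."""
    def __init__(self, chunks: list[str]):
        self.chunks = chunks
        self.vectors = [embed(c) for c in chunks]  # built once, kept local

    def query(self, question: str, k: int = 2) -> list[str]:
        qv = embed(question)
        scored = sorted(zip(self.chunks, self.vectors),
                        key=lambda cv: -cosine(qv, cv[1]))
        return [chunk for chunk, _ in scored[:k]]

pack = Pack([
    "ffmpeg is required for transcribing audio and video sources",
    "packs are stored locally and reused across MCP clients",
    "the embedding model supports more than fifty languages",
])
print(pack.query("how does audio and video transcription work", k=1)[0])
```

A real implementation would persist the vectors to disk and use a trained embedding model, but the build-once/query-many shape is the point: the expensive work happens at pack-build time, not per question.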
Connect beyin to Codex, Claude Code, Cursor, or another MCP client and access the same packs from whichever one you want. For audio and video sources, beyin transcribes the content before chunking and embedding the resulting text for retrieval.

- Install beyin and connect it to your agent once
- Tell your agent to create a pack, add your sources, and build it
- Ask questions naturally and let your agent handle retrieval

It becomes part of your agent’s natural workflow, bringing in the right context exactly when you need it while keeping you focused. See Example Usage with MCP for real prompts and workflows. You can also query packs directly with a local model, no external API or agent needed. See Query with a Local Model.

| Type | Examples |
|---|---|
| Web articles | Public URLs |
| YouTube | Videos and playlists |
| Podcasts | RSS feed URLs |
| Local documents | .pdf, .docx, .pptx, .epub, .xlsx, .csv |
| Local text | .txt, .md, .rst, .html |
| Local audio | .mp3, .m4a, .wav |
| Local video | .mp4, .mov, .mkv, .webm |

beyin is built for local processing on your own machine. Use it with content you are allowed to process, preferably public, permitted sources or material you own or have rights to use. Avoid copied, paywalled, private, restricted, or illegally shared content.

We recommend uv because it keeps the CLI install isolated and gives you a plain `beyin` command on your PATH. If you do not have uv yet, install it first:

```shell
# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

Package manager alternatives also work, for example `brew install uv` on macOS or `winget install --id=astral-sh.uv -e` on Windows. See the official uv installation guide for more options: docs.astral.sh/uv/getting-started/installation.
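The transcribe-then-chunk step mentioned above usually means cutting the transcript into overlapping windows so that sentences straddling a boundary remain retrievable from both neighbors. A minimal sketch of that idea, assuming fixed-size overlapping word windows (beyin's actual chunking strategy may differ):

```python
def chunk_words(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split a transcript into overlapping windows of `size` words.
    Consecutive chunks share `overlap` words so boundary sentences
    stay searchable from either side."""
    words = text.split()
    if not words:
        return []
    step = size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + size]))
        if start + size >= len(words):
            break  # last window already reaches the end of the transcript
    return chunks

transcript = " ".join(f"word{i}" for i in range(100))
print(len(chunk_words(transcript, size=40, overlap=10)))  # → 3
```

Larger overlap improves recall at the cost of index size; the embedding step then runs once per chunk, so chunk size also bounds how much context each retrieved hit carries.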
The recommended way to install beyin is with uv:

```shell
uv tool install beyin
```

If you later want optional document support such as .pdf, .docx, or .xlsx, install those Python packages in the same environment as beyin. For `uv tool install` users, rerun the install with all desired extras in one command, for example:

```shell
uv tool install --reinstall --with pdfplumber --with openpyxl beyin
```

If you already manage a dedicated Python environment and explicitly want beyin inside it, `pip install beyin` still works, but it is a fallback path rather than the recommended default.

ffmpeg is required for video and audio sources. Skip it if you only use articles and local files:

```shell
# macOS
brew install ffmpeg

# Linux
sudo apt install ffmpeg

# Windows
winget install ffmpeg
```

No Homebrew on macOS, or winget not working? Download directly from ffmpeg.org/download.html.

If you are on Windows and plan to use beyin through Codex MCP, prefer WSL over native Windows. See Windows notes for the recommended setup and details.

You can use beyin directly in your terminal without MCP:

```shell
beyin
```

If you want to verify runtime dependencies before building anything:

```shell
beyin check-deps
```

If this is a fresh install, your first text build may ask for
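A dependency check like the one `beyin check-deps` performs boils down to probing PATH for the external tools each source type needs. A hedged sketch of that idea; the source-type-to-tool mapping here is illustrative, not beyin's actual check list:

```python
import shutil

# Illustrative mapping of source types to required external tools;
# beyin's real check-deps may verify a different set.
REQUIRED = {
    "articles": [],             # pure-Python fetching, no external tools
    "audio/video": ["ffmpeg"],  # transcription needs ffmpeg on PATH
}

def check_deps() -> dict[str, bool]:
    """Return, per source type, whether every required tool is on PATH."""
    return {
        source: all(shutil.which(tool) is not None for tool in tools)
        for source, tools in REQUIRED.items()
    }

for source, ok in check_deps().items():
    print(f"{source}: {'ok' if ok else 'missing tools'}")
```

Checking up front, rather than failing mid-build, is why running the check before the first build is worthwhile: a missing ffmpeg only surfaces once an audio or video source enters the pipeline.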

This analysis was written by the Genesis Park editorial team with AI assistance. The original post is available via the source link.
