Show HN: iOS SimulatorCamera – Use your MacBook camera with the iOS Simulator

hackernews | 📦 Open source
#news
Original source: hackernews · Summarized and analyzed by Genesis Park

Summary

'SimulatorCamera' was developed to address a long-standing gap in the iOS Simulator, which still does not support a real camera, by letting you feed in the MacBook camera or a video file. The tool consists of a macOS companion app and an iOS Swift package, and streams frames over localhost at 25–30 frames per second. As a result, camera-based features such as AR, barcode scanning, and Core ML pipelines can be tested smoothly in the Simulator without a separate device or cable.

Full text

Plug a real camera, a video file, or your screen into the iOS Simulator. Finally.

The iOS Simulator has never supported a real camera. AVCaptureDevice is empty. Every app that touches the camera — QR scanners, barcode readers, document capture, ML pipelines, AR prototypes — either stubs out the camera path, runs only on device, or ships a brittle "use a photo instead" fallback.

SimulatorCamera is a tiny two-piece developer tool that fixes it:

- a macOS companion app that streams video frames over localhost:9876 using a compact binary protocol (SCMF — Simulator Camera Message Format), and
- an iOS Swift Package with an AVCaptureSession-shaped API. On device it compiles to a no-op.

Frames show up in your app. Vision, VisionKit, Core ML, barcode detection, custom pipelines — the SDK is designed to drive them in the Simulator at 25–30 FPS over localhost, no device, no cables, no private APIs. Because Apple never added a way to use the camera from the Simulator, this is a must-have for every developer. SimulatorCamera finally fills that gap.

- 🎥 Live video into the Simulator at 30 FPS via localhost TCP
- 🧩 Drop-in SDK — FrameSource mirrors AVCaptureSession semantics (start(), stop(), delegate, CVPixelBuffer callbacks)
- 🔌 Sources on the Mac: test pattern (built-in), webcam, video file, screen region (roadmap)
- 📦 One-line install via Swift Package Manager
- 🛡 No private APIs — Network.framework + CoreVideo + ImageIO
- 📵 Zero overhead on device — #if targetEnvironment(simulator)-guarded
- 🔐 Localhost-only by default
- 🧪 Vision / Core ML ready — frames land as CVPixelBuffer

In your Package.swift:

```swift
dependencies: [
    .package(url: "https://github.com/Akylas/SimulatorCamera.git", from: "1.0.0"),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(name: "SimulatorCameraClient", package: "SimulatorCamera"),
        ]
    ),
]
```

Or in Xcode: File → Add Package Dependencies… → paste the repo URL.

Homebrew (recommended):

```sh
brew tap akylas/simulatorcamera https://github.com/Akylas/SimulatorCamera
brew install simulatorcamera
open -a SimulatorCameraServer
```

Or grab the signed & notarized .dmg from Releases. Or build from source:

```sh
git clone https://github.com/dautovri/SimulatorCamera.git
cd SimulatorCamera
brew install xcodegen
xcodegen generate --spec apps/MacServer/project.yml
open apps/MacServer/SimulatorCameraServer.xcodeproj
```

- Launch SimCameraServer.app on your Mac. Pick a source and click Start.
- In your iOS code:

```swift
import SimulatorCameraClient

final class CameraController: NSObject, FrameSourceDelegate {
    private let source: FrameSource

    override init() {
        #if targetEnvironment(simulator)
        source = SimulatorCameraSession(host: "127.0.0.1", port: 9876)
        #else
        source = AVCaptureFrameSource() // your existing AVCapture wrapper
        #endif
        super.init()
        source.delegate = self
        source.start()
    }

    func frameSource(_ source: FrameSource, didOutput pixelBuffer: CVPixelBuffer, at time: CMTime) {
        // Feed to Vision, Core ML, preview layer, whatever.
    }
}
```
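The delegate callback above hands you one CVPixelBuffer per frame. As a minimal, illustrative sketch of the "feed it to Vision" step, the buffer could be run through VNDetectBarcodesRequest; the BarcodeScanner type below is hypothetical and not part of the SDK, only standard Vision APIs are used:

```swift
import Vision
import CoreVideo

// Hypothetical helper (not part of SimulatorCameraClient): runs Vision's
// barcode detector on each CVPixelBuffer delivered by the frame callback.
final class BarcodeScanner {
    func scan(_ pixelBuffer: CVPixelBuffer) {
        // One-shot barcode request; results arrive in the completion handler.
        let request = VNDetectBarcodesRequest { request, error in
            guard error == nil else { return }
            let payloads = (request.results as? [VNBarcodeObservation])?
                .compactMap { $0.payloadStringValue } ?? []
            if !payloads.isEmpty {
                print("Detected barcodes: \(payloads)")
            }
        }
        // The buffer comes straight from the frame callback, so no file I/O is needed.
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }
}
```

In CameraController, frameSource(_:didOutput:at:) would simply call something like scanner.scan(pixelBuffer); because the input is a plain CVPixelBuffer, the identical code runs against real AVCaptureSession frames on device.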
The shim now mirrors the whole AVCaptureSession → addInput → addOutput → startRunning dance. Your existing camera-setup code ports over by prefixing each type with Simulator:

```swift
import SimulatorCameraClient

SimulatorCamera.configure(host: "127.0.0.1", port: 9876)

let session = SimulatorCaptureSession()
session.sessionPreset = .hd1280x720

guard let device = SimulatorCaptureDevice.default(for: .video) else { return }
let input = try SimulatorCaptureDeviceInput(device: device)
session.addInput(input)

let output = SimulatorCameraOutput() // AVCaptureVideoDataOutput-shaped
output.setSampleBufferDelegate(self, queue: frameQueue)
session.addOutput(output)

session.startRunning() // kicks off the network session
```

Your existing captureOutput(_:didOutput:from:) delegate fires with a valid CMSampleBuffer wrapping a CVPixelBuffer — the same code path as the real device. If you already have an AVCaptureVideoDataOutputSampleBufferDelegate, swap the output for SimulatorCameraOutput inside a simulator guard and keep your delegate code unchanged. SimulatorCameraOutput is an AVCaptureVideoDataOutput subclass, so the first argument of captureOutput(_:didOutput:from:) is a genuine AV output, not a stand-in:

```swift
#if targetEnvironment(simulator)
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
#else
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
session.addOutput(output)
#endif
```

Or use the drop-in SwiftUI view:

```swift
import SwiftUI
import SimulatorCameraClient

struct ContentView: View {
    var body: some View {
        SimulatorCameraPreviewView()
    }
}
```

The SCMF wire format:

```
+--------+---------------+----------------+--------+--------+----------+
| magic  | payloadLength | timestamp      | width  | height | jpegData |
| 4 B    | 4 B uint32 LE | 8 B Float64 LE | 4 B LE | 4 B LE | N bytes  |
| "SCMF" |               |                |        |        |          |
+--------+---------------+----------------+--------+--------+----------+
```

Full spec: docs/PROTOCOL.md · architecture: docs/ARCHITECTURE.md · roadmap: docs/ROADMAP.md. (A small decoding sketch follows at the end of this article.)

Repository layout:

```
SimulatorCamera/
├── Package.swift                      # SwiftPM manifest (exposes SimulatorCameraClient)
├── Sources/SimulatorCameraClient/     # the iOS SDK
├── Tests/SimulatorCameraClientTests/  # unit tests for the SCMF codec
├── apps/
│   ├── MacServer/                     # SwiftUI macOS companion app
│   └── iOSDemo/                       # sample iOS app using the SDK
├── docs/
│   ├── PROTOCOL.md                    # wire format
│   ├── ARCHITECTURE.md                # threading, transport, failure modes
│   └── ROADMAP.md
├── Casks/simulatorcamera.rb           # Homebrew cask formula
├── scripts/
│   ├── bootstrap.sh                   # swift build + test
│   └── build-release.sh               # archive + codesign + notarize + .dmg/.zip
├── .github/
│   ├── FUNDING.yml                    # GitHub Sponsors / BMC
│   └── workflows/
│       ├── ci.yml                     # SwiftPM CI on macos-14
│       └── release.yml                # tag-driven signed release
└── RELEASING.md                       # release runbook
```

To build locally:

```sh
./scripts/bootstrap.sh   # swift build && swift test
brew install xcodegen
xcodegen generate --spec apps/MacServer/project.yml
xcodegen generate --spec apps/iOSDemo/project.yml
```

v0.2.0 — "Use my real camera." First stable release with a drop-in AVCaptureSession shim and a live Mac webcam source. See CHANGELOG.md and docs/RELEASE_NOTES_v0.2.0.md.

See CONTRIBUTING.md. Good first issues are labelled on the tracker. For release mechanics, see RELEASING.md.

SimulatorCamera is fully MIT-licensed and maintained on donations. If it saves you a device-build loop, consider sponsoring or buying a coffee. No paid tier, no license keys, no telemetry — just a tip jar.

MIT — see LICENSE.
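To make the SCMF diagram above concrete, here is a rough, unofficial decoding sketch. It assumes the little-endian field order shown (4-byte magic, 4-byte payload length, 8-byte Float64 timestamp, 4-byte width, 4-byte height, then the JPEG bytes) and that payloadLength counts only the JPEG data; the SCMFFrame and decodeSCMFFrame names are illustrative, and the authoritative layout is docs/PROTOCOL.md:

```swift
import Foundation

// Unofficial sketch: decode one SCMF message from a byte buffer, based only on
// the field layout shown in the diagram above. Names are illustrative, not the SDK's.
struct SCMFFrame {
    let timestamp: Double   // Float64 seconds in the wire format
    let width: UInt32
    let height: UInt32
    let jpegData: Data
}

enum SCMFDecodeError: Error { case truncated, badMagic }

func decodeSCMFFrame(_ data: Data) throws -> SCMFFrame {
    let bytes = [UInt8](data)
    var offset = 0

    // Pull `count` raw bytes, advancing the cursor.
    func read(_ count: Int) throws -> ArraySlice<UInt8> {
        guard bytes.count - offset >= count else { throw SCMFDecodeError.truncated }
        defer { offset += count }
        return bytes[offset ..< offset + count]
    }
    // Assemble a little-endian unsigned integer from `count` bytes.
    func readLE(_ count: Int) throws -> UInt64 {
        try read(count).reversed().reduce(UInt64(0)) { ($0 << 8) | UInt64($1) }
    }

    guard try read(4).elementsEqual("SCMF".utf8) else { throw SCMFDecodeError.badMagic }
    let payloadLength = try readLE(4)                  // assumed: JPEG byte count
    let timestamp = Double(bitPattern: try readLE(8))
    let width = UInt32(try readLE(4))
    let height = UInt32(try readLE(4))
    let jpeg = Data(try read(Int(payloadLength)))

    return SCMFFrame(timestamp: timestamp, width: width, height: height, jpegData: jpeg)
}
```

Reassembling messages from the TCP stream (read the fixed 24-byte header, then payloadLength more bytes, repeat) and decoding the JPEG into a CVPixelBuffer are what the SDK's client handles for you; this sketch only shows how the header fields fall out of the byte layout.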

This analysis was written by the Genesis Park editorial team with the help of AI. The original article can be found via the source link.
