Show HN: A/B test images with your eyes using ARKit face tracking
hackernews
🔬 Research
#a/b testing
#arkit
#review
#ux
#eye tracking
#gaze tracking
Original source: hackernews · Summarized and analyzed by Genesis Park
Summary
'Saccade', a newly released app built on the iPhone's ARKit TrueDepth camera, is an A/B testing tool that picks between images by tracking the user's unconscious gaze. It shows two images at a time and measures how long the gaze dwells on each; research shows gaze duration correlates strongly with preference. With no surveys or clicks, it can rank logos, thumbnails, photos, and other visuals in seconds, and because all data is processed on-device and discarded immediately, privacy concerns are minimized.
Full text
saccade · /səˈkäd/ · rhymes with "facade". A rapid movement of the eye between fixation points.

No surveys. No clicks. No opinions. Just look at two images and your unconscious gaze picks the winner.

Two logos. Four hero images. Eight headshots. Drop in 2–12 images and Saccade generates every possible A/B pair. Each pair appears for 3–8 seconds. Your iPhone's TrueDepth camera tracks where your gaze lingers. You don't choose; your eyes do. Instant ranked results with preference strength bars. Share the results card with your team, your client, or your group chat.

You can't fake where your eyes go. Research shows gaze duration correlates with preference, even when you know you're being tracked. It's called the gaze cascade effect, and it has been used in commercial eye-tracking studies for decades. (Shimojo, S., et al. (2003). Gaze bias both reflects and influences preference. Nature Neuroscience, 6(12), 1317–1322.)

Saccade brings lab-grade eye tracking to your pocket. Not 100% accurate, but consistently better than guessing, and infinitely faster than a survey.

Use cases:
- Branding: which version of your logo, icon, or wordmark actually draws the eye?
- Photo culling: 20 shots from a shoot. Your eyes pick the best ones faster than your brain.
- Marketing: which landing page hero or YouTube thumbnail gets the longest look?
- Group decisions: pass the phone around and aggregate unconscious preferences instead of subjective opinions.
- Travel: beach or mountains? Your eyes know where they want to go.
- Naming: can't decide? Show names as cards and let your gaze settle it.
- Color: which palette actually draws the eye? Find out in seconds.
- Typography: same phrase, different typefaces. Your eyes pick the one that reads best.

FAQ

How accurate is it?
Saccade uses ARKit's TrueDepth camera to measure gaze direction. It's not lab-grade precision, but research shows gaze duration reliably correlates with visual preference. Think of results as a strong directional signal, not a definitive answer. In testing, it's consistently better than guessing and infinitely faster than a survey.

Does it still work if I know I'm being tracked?
Yes. The gaze cascade effect (Shimojo et al., 2003) shows that gaze preference persists even when people are aware of the tracking. You can't consciously control where your eyes linger. That's the whole point.

Does it store photos or face data?
No. Saccade reads numerical eye-direction values from ARKit in real time. No photos, no video, no face data is ever stored. Everything is processed on-device and discarded immediately after each comparison.

Which devices are supported?
Any iPhone with a TrueDepth camera (iPhone X or later) running iOS 17 or later. The TrueDepth camera is required for ARKit face tracking.

How many images can I compare?
Up to 12 images per project. Saccade automatically generates every possible A/B pair. For example, 6 images = 15 pairs, which takes about 75 seconds to complete.

Can multiple people test the same project?
Yes. Pass the phone to colleagues, friends, or clients and have each person run the same project. Each session produces independent results. Collaborative features with shared results are planned for a future update.

Is it free?
Yes. No ads, no subscriptions, no in-app purchases. No account needed. The app is fully functional from the moment you install it.

What is a saccade?
A saccade (/səˈkäd/, rhymes with "facade") is a rapid movement of the eye between fixation points. It's the fundamental unit of visual attention, and it's what this app measures.
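The "numerical eye-direction values from ARKit" mentioned above are exposed through the public face-tracking API. A minimal Swift sketch of accumulating dwell time for a left/right image pair might look like the following; the class name, the left/right sign heuristic, and the overall structure are illustrative assumptions, not Saccade's actual implementation:

```swift
import ARKit
import QuartzCore

// Hypothetical sketch: accumulate gaze dwell time for a side-by-side
// A/B pair using ARKit face tracking. Only eye-direction numbers are
// read; no camera frames are retained.
final class GazeDwellTracker: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var leftDwell: TimeInterval = 0   // seconds spent on image A
    private(set) var rightDwell: TimeInterval = 0  // seconds spent on image B
    private var lastUpdate: TimeInterval?

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X or later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              face.isTracked else { return }
        let now = CACurrentMediaTime()
        defer { lastUpdate = now }
        guard let last = lastUpdate else { return }
        let dt = now - last
        // lookAtPoint is the point (in face-anchor space) the eyes converge on.
        // A production app would project it through the camera transform into
        // screen coordinates; the sign of x is only a crude left/right proxy.
        if face.lookAtPoint.x < 0 {
            leftDwell += dt
        } else {
            rightDwell += dt
        }
    }
}
```

After a 3–8 second exposure, comparing `leftDwell` and `rightDwell` gives the pair's winner and a rough preference strength.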
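The pair math in the FAQ follows from simple combinatorics: n images yield n·(n−1)/2 unordered pairs, so 6 images give 15 pairs, and at roughly 5 seconds per pair a session takes about 75 seconds. A sketch of pair generation and dwell-time ranking, with hypothetical function names:

```swift
// Every unordered pair of n images: C(n, 2) = n*(n-1)/2 comparisons.
func allPairs(imageCount n: Int) -> [(Int, Int)] {
    var pairs: [(Int, Int)] = []
    for a in 0..<n {
        for b in (a + 1)..<n {
            pairs.append((a, b))
        }
    }
    return pairs
}

// Rank images by total gaze dwell time accumulated across all pairs.
// Each share is the kind of "preference strength" a results bar could show.
func ranking(dwellSeconds: [Double]) -> [(image: Int, share: Double)] {
    let total = dwellSeconds.reduce(0, +)
    return dwellSeconds.indices
        .sorted { dwellSeconds[$0] > dwellSeconds[$1] }
        .map { (image: $0, share: total > 0 ? dwellSeconds[$0] / total : 0) }
}

// allPairs(imageCount: 6).count == 15, matching the FAQ's example.
```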
This analysis was written by the Genesis Park editorial team with the help of AI. The original post is available via the source link.