Elon Musk's xAI Sued for Allegedly Turning Real Photos of Three Girls into AI CSAM
Ars Technica
💼 Business
#ai
#csam
#elon musk
#grok
#tip
#xai
Summary
The case began when a tip from an anonymous Discord user surfaced strong evidence that Grok, the AI developed by Elon Musk's xAI, had turned real photos of girls into child sexual abuse material (CSAM). According to research by the Center for Countering Digital Hate (CCDH), Grok generated approximately 3 million sexualized images in January alone, at the height of the controversy, of which about 23,000 appeared to depict minors. Musk has denied the allegations all along, but the discovery exposes xAI's missteps in leaving the technical flaws unaddressed and merely restricting access behind a paywall, and a legal battle over liability is now expected.
Why It Matters
Developer Perspective
Under review
Researcher Perspective
Under review
Business Perspective
Under review
Full Text
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent.

As recently as January, Musk denied that Grok generated any CSAM during a scandal in which xAI refused to update filters to block the chatbot from nudifying images of real people. At the height of the controversy, researchers from the Center for Countering Digital Hate estimated that Grok generated approximately three million sexualized images, of which about 23,000 depicted apparent children. Rather than fix Grok, xAI limited access to the system to paying subscribers.

That kept the most shocking outputs from circulating on X, but the worst of it was not posted there, Wired reported. Instead, it was generated on Grok Imagine. Digging into the standalone app, a researcher in January found that a little less than 10 percent of about 800 Imagine outputs reviewed appeared to include CSAM. In an X post following that revelation, Musk continued rejecting the evidence and insisted that he was “not aware of any naked underage images generated by Grok,” emphasizing that he’d seen “literally zero.”

However, Musk may now be forced to finally confront Grok’s CSAM problem after a Discord user reached out to a victim, prompting law enforcement to get involved. In a proposed class-action lawsuit filed Monday, three young girls from Tennessee and their guardians accused Musk of intentionally designing Grok to “profit off the sexual predation of real people, including children.” They estimated that “at least thousands of minors” were victimized and have asked a US district court for an injunction to finally end Grok’s harmful outputs. They also seek damages, including punitive damages, for all minors harmed.