Uber Uses AI for Development: An Inside Look

hackernews | 🔬 Research
#ai #claude #review #uber #development #internal-tools #agents
Original source: hackernews · Summarized and analyzed by Genesis Park


Article

How Uber uses AI for development: inside look

How Uber built Minion, Shepherd, uReview, and other internal agentic AI tools. Also, new challenges in rolling out AI tools, like more platform investment and growing concern about token costs.

Before we start: all The Pragmatic Summit videos are now available to view. Paid newsletter subscribers also have access to each session with the Q&A session as well.

Update on 11 March: the Uber team shared updated numbers as of March 2026:

- 84% of devs at Uber are agentic coding users (either using CLI-based agents or making more agentic requests than tab-completions in their IDE)
- 65-72% of code is AI-generated inside IDE-based tools. This number is, naturally, 100% for AI command-line tools like Claude Code.
- Claude Code usage nearly doubled in 3 months – from 32% in December to 63% in February – while IDE-based tools (Cursor, IntelliJ) have plateaued.

The article is updated with these new and correct numbers.

I spent four years working at Uber until 2020 and experienced firsthand the company's standout engineering culture. Uber is a company that did the speed run of going from a small startup, through hypergrowth, to being a large company facing major risk during the pandemic, when the rideshare business briefly collapsed. Today, it's maturing as a publicly traded, profitable company, and employs almost 3,000 people in the tech function.

At the recent Pragmatic Summit in San Francisco, one of the most interesting behind-the-scenes sessions came from the ridesharing company's principal engineer, Ty Smith, and director of engineering Anshu Chada, who pulled back the curtain on what Uber has been doing with AI tools internally. They were candid about the amount of work it took to build up Uber's internal "AI stack," why all that work was necessary, and also discussed the drawbacks as well as benefits of this rapidly spreading technology. You can watch their presentation at The Pragmatic Summit here.
In today's issue, we cover:

- Agentic layers & systems. Four layers spanning an internal AI platform, context sources, industry tools, and specialized agents for testing and code review.
- Internal tooling: MCP Gateway, Uber Agent Builder, and the AIFX CLI. Uber built several internal tools to make it easier for devs to use AI tools, and to make internal AI agents more effective.
- How AI changes developer workflows. A move away from single-threaded coding in an IDE, toward orchestrating multiple parallel agents. Engineers naturally gravitate toward kicking off new agents, which starts to create resource and cost challenges.
- Minion: running background agents at scale. Uber built Minion, an internal background-agent platform with monorepo access and optimized defaults. It's a clever abstraction layer that works well in practice.
- New internal dev tools. More AI-generated code means more code reviews and more noise, so Uber built Code Inbox for smart PR routing, uReview for high-signal AI code review comments, Autocover for generating 5,000+ unit tests per month, and Shepherd for managing large-scale migrations end to end.
- Challenges. AI adoption is slower than expected, even at a forward-thinking company like Uber. Top-down mandates are less efficient than engineers sharing their wins with peers.
- Impact in numbers. 92% of Uber devs use agents monthly, 65-72% of code is AI-generated inside IDEs, and 11% of pull requests are opened by agents. At the same time, AI-related costs are up 6x since 2024, and token cost optimization is a growing priority.

Longtime readers might recall we've covered Uber's engineering culture over time:

- Developer Experience at Uber – with Uber's founding engineer on Developer Platform, Gautam Korlam (2025)

Let's get into it: AI is not new at Uber, but rolling it out companywide is.
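Uber's internal platforms like Minion are not public, so as a rough illustration only: the "orchestrating multiple parallel agents" workflow described above boils down to fanning independent tasks out to background workers and gathering their results. Here is a minimal Python sketch of that fan-out pattern; every name in it (run_agent, orchestrate, the task strings) is hypothetical and stands in for whatever a real agent platform would launch.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical stand-in for a background coding agent. In a real
# platform this would launch a CLI agent (e.g. Claude Code) against
# a repo checkout and return a summary or a link to an opened PR.
def run_agent(task: str) -> str:
    return f"done: {task}"

def orchestrate(tasks: list[str]) -> dict[str, str]:
    """Fan independent tasks out to parallel agents and gather results."""
    results: dict[str, str] = {}
    with ThreadPoolExecutor(max_workers=4) as pool:
        # Map each submitted future back to the task it was given.
        futures = {pool.submit(run_agent, t): t for t in tasks}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results

if __name__ == "__main__":
    out = orchestrate(["fix flaky test", "bump dependency", "add unit tests"])
    for task, result in sorted(out.items()):
        print(f"{task} -> {result}")
```

The point of the sketch is the shape of the workflow, not the implementation: the engineer's job shifts from writing each change to defining tasks, kicking off agents, and reviewing what comes back, which is exactly where the resource and cost pressures mentioned above appear.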
The company has used machine learning and AI technologies in many systems, including its Marketplace platform, which is responsible for routing and matching drivers with riders, forecasting demand, and more. What is relatively new at nearly all tech companies is the process of integrating AI across engineering and beyond. The official strategy at the ridesharing giant is to become a "GenAI-powered" company.

I appreciate Uber sharing this approach openly, because while most companies say that they want to be "AI-powered" – however cliche that claim might be – not all provide as much transparency. It's worthwhile for engineers to internalize how leadership views AI. These folks, in general, see a tool that can bring efficiency everywhere.

My take is that in some ways, AI is seen similarly to the cloud, which has been perceived as a means to reduce costs and improve the flexibility and elasticity of hardware resources. Today, AI is seen as the way to increase efficiency and lower costs in areas such as customer support, software development, the finance function – or any function.

Uber is not focusing on automating everything possible in engineering. Instead, it wants to:

- Eliminate toil: helping AI do "boring" work like upg…

This analysis was produced by the Genesis Park editorial team with the help of AI. The original article is available via the source link.
