We Let AI Redesign a Landing Page. It Outperformed the Human-Designed Version
🔬 Research
#ai design
#claude
#crazy egg
#review
#ui/ux
#landing page
#web analytics
Original source: hackernews · Summarized and analyzed by Genesis Park
Summary
An AI-powered redesign of a landing page outperformed the original human-designed version in key metrics. The AI-driven changes successfully increased conversion rates compared to the previous design developed by human designers. This result highlights the potential for AI tools to deliver superior optimization in web development tasks.
Full Text
I'll be upfront: I wasn't sure this would work. The idea was simple: take Crazy Egg's existing Web Analytics landing page (written and designed by an experienced human team) and build a competing version using AI, then A/B test them against each other and see which converts better.

The AI landing page had a 44.83% conversion lift, at 99% confidence. For conversion rate optimization, a 5-10% lift is considered a good test result. A 44% lift at that confidence level is the kind of number that makes you stop and rethink how you've been building pages. Here's everything I did, what the data showed, and why I think the AI version won.

Key Insights From the A/B Test

- The AI-generated page converted at 80.65% vs the human page's 55.68%
- That's a 44.83% relative lift, at 99% statistical confidence
- The AI page led with visitor outcomes instead of product features
- It included a competitive comparison table that the human page didn't have
- The workflow took a fraction of the time a traditional redesign would

What We Were Testing and Why

Crazy Egg's Web Analytics page was already live and performing. It wasn't broken. That's actually what made it an interesting candidate for a test. Our goal wasn't to fix a failing page; we wanted to see whether an AI-assisted redesign could beat a competently built page.

The conversion event we measured was signups. Visitors were split fairly evenly across both variants (just under 55,000 people per variant), and we tracked who landed on the web analytics page and who signed up from that group. Variant A was the existing human-designed page; Variant B was the AI version. We ran the test for about three weeks in total.

How I Built the AI Version

This is the part most "AI vs human" posts skip over, so let me be specific. The workflow had four stages and used only two AI tools: Claude and Base44.
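Before walking through the stages, the headline figures reported earlier can be sanity-checked with a quick two-proportion z-test. This is a minimal sketch, not the test Crazy Egg's tooling actually ran (the article doesn't say): the per-variant sample size of 55,000 is an assumption, approximated from "just under 55,000 per variant".

```python
from math import erf, sqrt

# Conversion rates as reported in the article; the per-variant
# sample size of 55,000 is an assumption ("just under 55,000").
n = 55_000
p_human = 0.5568  # Variant A (human-designed page)
p_ai = 0.8065     # Variant B (AI-generated page)

# Relative lift of the AI page over the human page (~44.8%)
lift = p_ai / p_human - 1

# Two-proportion z-test with a pooled standard error
pooled = (p_human + p_ai) / 2          # equal n per variant
se = sqrt(pooled * (1 - pooled) * (2 / n))
z = (p_ai - p_human) / se

# Two-sided p-value from the standard normal CDF
p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

print(f"relative lift = {lift:.2%}")
print(f"z = {z:.1f}, significant at 99%: {p_value < 0.01}")
```

With samples this large, even a far smaller gap would clear the 99% bar: the z-score here lands well beyond the 2.576 two-sided threshold, which is why the result is reported at such high confidence despite the unusually large effect.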
Stage 1: AI-generated page structure and copy

From a previous experiment on the best and worst UX prompts, I learned that one of the best ways to prompt AI website builders is to use another AI tool to generate a well-structured, detailed prompt. So I did exactly that.

I asked Claude to generate a detailed prompt outlining the page structure, section by section, along with the copy for each section. Claude asked some clarifying questions and then got to work. It generated over 1,000 words of content for the landing page before creating the prompt. The prompt itself was structured, specific, and long; it read more like a product requirements brief than a basic prompt and ended up being over 2,300 words.

I did not edit the content or the prompt. In fact, I didn't even read either before prompting Base44. I wanted to keep my involvement in this process as minimal as possible while still giving Claude enough information about the brand that it didn't completely go off the rails.

Stage 2: Base44 builds the design

I pasted the exact prompt Claude generated into Base44, a vibe-coding platform, which generated the full-page design. If you've read our earlier piece on AI website builders, you'll know Base44 was the top performer in our UX test. Given the results of this experiment, it continues to prove its quality and value for AI website design.

The Crazy Egg team made a few minor stylistic changes (e.g., using a custom font) and adapted the design for desktop, mobile, and tablet views. 99% of the design was kept true to Base44's version.

Stage 3: AI critique and second iteration

I took a full-page screenshot of Base44's first design and asked Claude whether it had any follow-up edits to request. I also requested two things to ensure branding consistency and accurate information: images of the actual product dashboard and factually correct information for the social proof sections.
Claude took care of the rest, flagging several areas for improvement and implementing the changes I requested for brand accuracy. Then it used its own feedback to generate a second prompt, which I entered into Base44 to get the final landing page we used in the A/B test.

Stage 4: Brand-safety review by Crazy Egg

The Crazy Egg team reviewed the final design and its content for accuracy and brand compliance. They made minor adjustments (like swapping in correct customer logos or fixing a couple of product details), but nothing that would affect conversion behavior. The page structure, copy, and layout were left intact. From first prompt to final page, the whole process took a fraction of the time a traditional landing page redesign would require.

Possible Reasons Why the AI Page Won

Looking at both pages side by side, the performance gap isn't hard to explain. The AI version made several structural and copy choices that align closely with established conversion principles, choices the human-designed page had either skipped or handled more loosely. Here's what stood out.

1. Above the Fold: The AI Version Does a Lot More Work Before the First Scroll

Above the fold is prime real estate. It's what every visitor sees
This analysis was written by the Genesis Park editorial team with the help of AI. The original article is available via the source link.