How do you deal with AI?

hackernews | 🔬 Research
#github #review #unicode #security #file-management
Original source: hackernews · Summarized and analyzed by Genesis Park

Summary

A Computer Science student graduating in 2027 pushes back on the prevailing enthusiasm for AI-assisted programming. The post questions the claim "Use AI or you'll be left behind", argues that taking responsibility for AI-generated code makes review the real bottleneck — potentially as slow as writing the code by hand — and asks who will maintain foundational tools like curl, ffmpeg, and Linux if the industry shifts wholesale to rapid, AI-driven iteration.

Body

(This post is written entirely by a human, without any AI assistance, just my thoughts and questions.)

This will be cross-posted to other platforms, to get more opinions.

Background:

I'm currently a university student pursuing a degree in Computer Science, bound to graduate in 2027. Also note that I do not have any industry experience; the closest thing I have to that is a few open source contributions and hackathon wins, so I imagine a lot of my views and thoughts might be faulty. Please correct me if that's the case.

I have been programming since high school and I have really enjoyed this field a lot. I've tried out multiple different domains and am currently interested in low-level programming, systems programming, embedded systems, graphics programming, etc. — you get the gist. I have also tried the SOTA models, and they truly are impressive for building quick prototypes in a field you don't know at all, where you don't want to invest time learning it thoroughly before implementing anything, without even knowing if the idea is viable. But for familiar fields, where you really want to learn and understand what you're doing, it sucks the fun out. So far I've obviously been programming by hand, and I really enjoyed the entire process of it and didn't feel frustrated doing any part of it, even something as mundane as setting up the build system for a project. But overnight, AI (by "AI" I am referring specifically to LLMs throughout this post) came along and drastically changed everything.

Now writing code by hand is seen almost as a "bad" thing if you want to get into the industry, and everything is just about how fast you can ship things. While I agree that software engineering is far more than just "programming"/"coding", I feel that this part of the process brought me great joy and allowed me to think deeply about every single thing I was doing to bring my projects to fruition. But now everyone is shilling AI, and especially this phrase: "Use AI or you'll be left behind" — said even by people I deeply respect, like antirez and a few others who I thought would actually be against AI-assisted programming. I will come back to this phrase later. It feels like engineering is undervalued, maybe even just dead, and the industry is shifting from core engineering principles to rapid iteration on new ideas and rooting heavily for startups and such. This entire shift in programming is really draining my motivation for software engineering, and I have some questions for which I am unable to find satisfactory answers so far.

Questions:

1. Regarding the phrase "Use AI or you'll be left behind": how would this realistically be true? For the foreseeable future, the whole point of AI is to eliminate writing code entirely and make the tasks of producing and maintaining software much easier. Wouldn't the idea be contradictory, then? If I have strong fundamentals and leverage AI tools, wouldn't I be far more productive in the future — as these tools are only getting better and making the whole job easier — compared to someone with little to no experience with computer science?

2. Also, how does AI make a developer more productive? From what I've read and heard, when trying to contribute meaningfully to any codebase, you take responsibility for your code, whether written by hand or generated using AI. That means you need to understand whatever you're adding to the codebase, and in my experience, reading and reasoning about code you wrote yourself is far easier than reading and understanding code you didn't write. So wouldn't the actual bottleneck be reviewing the code, which would take practically the same amount of time as just writing it by hand?

3. As I see it, there are two classes of "software engineers". One rapidly iterates on features and ideas, uses AI most of the time, and keeps the company and middle/upper management happy. The other maintains tools like curl, ffmpeg, Linux, etc. If the world moves towards the former class of software engineers, who will maintain the aforementioned tools? Mass-produced AI-written code is only viable because these tools are rock-solid and built with high-quality engineering, so how will software enginee

This analysis was written by the Genesis Park editorial team using AI. The original post can be found via the source link.
