
Posts 12 · Comments 318 · Joined 3 yr. ago

  • AI is an extremely useful tool. I’m on the other end of the career track, but it seems to me that it’s almost like having a personal tutor. And as with any other teacher, if you use it as an aid to figure things out yourself, I imagine it would help immensely, but if you use it as a crutch to do your work for you, your skills will be as weak as someone who cheats off their friends in school. I attribute a large part of my skills to spending lots of time reading other people’s code and understanding why they wrote it the way they did (usually because some library didn’t do what I wanted, so I figured out how to beat it into submission out of pure stubbornness). If you use the AI as an aid and spend the time to really understand the code it’s producing (and the flaws in that code), I think you’ll build up your skills well.

    My rant about code monkeys was inspired by people I’ve interviewed and worked with who had to be told exactly how to solve a problem since they apparently had zero problem solving skill themselves. The “programming is just writing code” attitude drives me up the fucking wall and “LLMs are going to make programmers obsolete” is just the latest iteration of that bullshit.

  • Knowing how to write code has only ever been half (or less) of the job. A real programmer solves problems with code, especially problems that aren’t like any they’ve seen before. Someone who can write code but can’t solve problems or can only ‘solve’ problems they’ve seen before is just a code monkey. AI can regurgitate code it’s seen before (that is, code it was trained on) and it can do some problem solving but it falls on its face quickly if you ask it to do anything complex (at least for my metric of what is complex).

  • The core bug was that they were reading from a map without checking whether the map entry existed. Given var m map[K]V, m[key] always succeeds (reads never panic, even on a nil map; only writes to a nil map panic). If the given entry doesn’t exist, it returns the zero value for the value type, i.e. var v V. If V is a pointer type, accessing a field or method will panic (because the zero value is nil). If V is a struct or other value type, it can be used normally. That bug is on them. Any Go developer who isn’t a novice should know how maps and value types behave.
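    A minimal sketch of that pitfall and the comma-ok idiom that avoids it (type names here are illustrative, not from the bug in question):

    ```go
    package main

    import "fmt"

    // Illustrative value type for the map-access pitfall.
    type User struct{ Name string }

    func main() {
    	byID := map[int]*User{}    // value type is a pointer
    	counts := map[string]int{} // value type is a plain int

    	// Missing key, value type: you get the zero value, which is usable.
    	fmt.Println(counts["alice"]) // prints 0, no panic

    	// Missing key, pointer type: you get nil, and dereferencing it panics.
    	u := byID[42] // u is nil
    	_ = u         // u.Name here would panic: nil pointer dereference

    	// The comma-ok idiom makes the missing-entry case explicit.
    	if u, ok := byID[42]; ok {
    		fmt.Println(u.Name)
    	} else {
    		fmt.Println("no such user") // this branch runs
    	}
    }
    ```

    The comma-ok form is the idiomatic fix: it distinguishes "entry exists with the zero value" from "entry doesn’t exist at all".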

  • oof

  • Seniors should know their shit. If a junior doesn’t need help they’re either not doing their job or not a junior.

    I think you haven’t met “problem solvers” as creative as the ones I’ve met. My first job out of college I built an inventory system for a small engineering firm. One of the engineers tried to solve his problem instead of asking for help. Once he gave up and called us, it took us an entire day just to figure out how he had managed to screw things up as badly as he did.

  • oof

  • That’s preferable to people who don’t ask for help until everything is hopelessly fucked because they kept trying to solve their problem with different git commands, none of which they understood.

  • Flash was awful. I was contracted to un-fuck a custom video player and that experience convinced me that Flash was a dumpster fire that needed to die. Fortunately it did.

  • Almost any language is ok but some ecosystems make me want to turn into a murder hobo (looking at you, JavaScript).

  • Keep your Rust to yourself. I don’t care what language someone else uses for their projects but Rust is an unreadable mess that I don’t want anywhere near my projects.

  • Any function can be written in any Turing-complete programming language. That doesn’t mean a sane person would use Malbolge or Brainfuck for a production system. Language choice can have a huge impact on productivity and maintainability, and time is money.

  • Indentation-driven control flow is one of the most cursed things ever invented, excluding things explicitly designed to inflict pain or death.

  • If you own the company, no one can force you to sell shares.

  • money

  • Devs who are devs for no other reason than money and who don’t give a shit about the quality of their work are a problem.

  • In my experience VSCode on Windows runs like dogshit. I blame Windows for that. VSCode on Linux runs like a dream. I can have four different sessions open and it still runs great (I haven’t tested more than that because I’ve never had a reason to).

  • I make my code open source and public so people can use it if they find it useful, not because I expect anyone to contribute.

    And there’s a big fucking difference between actively hostile and “I’m not interested in accepting this change”.

  • This is the only one I subscribe to that has memes. I was not being precise but I know I have seen this before, recently, more than once.

  • He can know about it as a concept without really understanding it, and if he treats the dev team as a code-producing machine then he could be ignorant of how much technical debt there is. Or maybe there’s an asshat on the dev team telling him there’s no technical debt.

  • The boss probably isn’t lying about no technical debt. He’s probably just too dumb or ignorant to know about it.

  • That would be a reasonable take if this hadn’t been reposted twice in the last month.

  • LLVM

  • Honestly I didn’t really follow OP’s meme or care enough to understand it, I’m just here to provide some context and nuance. I opened the comments to see if there was an explanation of the meme and saw something I felt like responding to.

    Edit: Actually, I can’t see the meme. I was thinking of a different post. The image on this one doesn’t load for me.

    “The answer we’ve all been waiting for” is a flawed premise. There will never be one language to rule them all. Even completely ignoring preferences, languages are targeted at different use cases. Data scientists and systems programmers have very different needs. And preferences are huge. Some people love the magic of Ruby and hate the simplicity of Go. I love the simplicity of Go and hate the magic of Ruby. Expecting the same language to satisfy both groups is unrealistic because we have fundamentally different views of what makes a good language.

  • LLVM

  • It is being used. Objective-C (used for macOS and iOS apps) has used reference counting since the language was created. Originally it was manual, but since 2011 it’s been automatic by default. And Swift (which basically replaced Objective-C) only supports ARC; it has no manual reference counting. The downside is that reference counting doesn’t handle cycles, so the programmer has to be careful to prevent those. Also, the compiler has to insert the reference increment and decrement calls, and that’s a significant engineering challenge for the compiler designers.

    Rust tracks ownership instead of references, but that means its compiler is even more complicated. Rust’s system is a little bit like compile-time reference counting, but that’s not really accurate. Python, Perl, and PHP also use reference counting, plus a tracing GC (aka ‘normal’ GC) in Python and PHP to handle cycles.

    So your implicit assumption that reference counting is not widely used is false. Based on what I can find online, Python and JavaScript are by far the most used languages today and are roughly equal, so in that respect reference-counting GC is equally or possibly more popular than pure tracing GC.
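    A toy sketch of why cycles defeat reference counting, written in Go purely for illustration (real ARC inserts the retain/release calls in the compiler; this hand-rolled Node type and its retain/release helpers are made up for the example):

    ```go
    package main

    import "fmt"

    // Node carries a manual reference count and one outgoing reference.
    type Node struct {
    	refs  int
    	name  string
    	other *Node
    }

    func retain(n *Node) { n.refs++ }

    func release(n *Node) {
    	n.refs--
    	if n.refs == 0 {
    		fmt.Println("freeing", n.name)
    		if n.other != nil {
    			release(n.other) // drop the reference this node held
    		}
    	}
    }

    func main() {
    	a := &Node{refs: 1, name: "a"} // refs: 1 from the "owner"
    	b := &Node{refs: 1, name: "b"}

    	// a and b reference each other: a cycle.
    	a.other = b
    	retain(b)
    	b.other = a
    	retain(a)

    	// The owners drop their references...
    	release(a)
    	release(b)

    	// ...but each node still holds the other, so neither count
    	// reaches zero and nothing is ever freed: a leak.
    	fmt.Println("a.refs =", a.refs, "b.refs =", b.refs)
    }
    ```

    This is the hole Swift plugs with weak/unowned references and that Python and PHP plug with a backup tracing collector.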