Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

Don’t learn to code, advises Jensen Huang of Nvidia. Thanks to AI, everybody will soon become a capable programmer simply by using human language.

  • Blackmist@feddit.uk · 4 months ago

    I don’t think he’s seen the absolute fucking drivel that most developers have been given as software specs before now.

    Most people don’t even know what they want, let alone how to describe it. I’ve often been given a mountain of stuff, only to go back and forth with the customer to figure out what problem they’re actually trying to solve, and then do it in like 3 lines of code, in a way that doesn’t break everything else or hang a maintenance albatross around my neck for the next ten years.

    • I Cast Fist@programming.dev · 4 months ago

      Yesterday, I had to deal with a client who literally contradicted himself 3 times in 20 minutes about whether a specific Date field should be mandatory or not. My boss and a colleague who were nearby started laughing once the client went away, partly because I was visibly annoyed at the indecision.

    • curiousaur@reddthat.com · 4 months ago

      That’s how this statement and the state of the industry feel. The AI tools are empowering senior engineers to be as productive as a small team, so even my company laid off all the junior engineers.

      So who’s coming up behind the senior engineers? Is the AI they use going to take the reins when they retire? Nope, the companies will be fucked.

  • fidodo@lemmy.world · 4 months ago

    As a developer building on top of LLMs, my advice is to learn programming architecture. There’s a shit ton of work that needs to be done to get this unpredictable, non-deterministic tech to work safely and accurately. This is like saying to get out of tech right before the Internet boom. The hardest part of programming isn’t writing low-level functions; it’s architecting complex systems while keeping them robust, maintainable, and expandable. By the time an AI can do that, all office jobs are obsolete. AIs will be able to replace CEOs before they can replace system architects. Programmers won’t go away; they’ll just have less busywork to do and will instead need to work at a higher level. But the complexity of those higher-level requirements is about to explode, and we will need LLMs to do the simpler tasks, with our oversight, to make sure everything gets integrated correctly.
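
    To give a flavor of that oversight work, here is a minimal sketch in Python of one common pattern: validating the model’s non-deterministic output against the shape you asked for and retrying until it complies. The `call_llm` stub and the field names are invented for illustration; this isn’t any particular provider’s API.

    ```python
    import json

    def call_llm(prompt: str) -> str:
        """Stand-in for a real model API call; returns raw text."""
        raise NotImplementedError("wire up your provider here")

    # The JSON shape we prompt the model to produce (hypothetical fields).
    REQUIRED_FIELDS = {"name": str, "priority": int}

    def parse_and_validate(raw: str) -> dict:
        """Reject anything that isn't exactly the shape we asked for."""
        data = json.loads(raw)  # malformed JSON raises a ValueError subclass
        if not isinstance(data, dict):
            raise ValueError("expected a JSON object")
        for field, expected_type in REQUIRED_FIELDS.items():
            if not isinstance(data.get(field), expected_type):
                raise ValueError(f"bad or missing field: {field}")
        return data

    def ask_llm_safely(prompt: str, max_retries: int = 3) -> dict:
        """Retry until the output is something the rest of the system can trust."""
        for _ in range(max_retries):
            try:
                return parse_and_validate(call_llm(prompt))
            except ValueError:
                continue  # non-deterministic output: just ask again
        raise RuntimeError(f"no valid response after {max_retries} attempts")
    ```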

    I also recommend still learning the fundamentals, just maybe not as deeply as you used to need to. Knowing how things work under the hood still helps immensely with debugging and with creating better, more efficient architectures, even at a high level.

    I will say, I do know developers who specialized in algorithms and are feeling pretty lost right now. They’re perfectly capable of adapting their skills to the new paradigm; their issue is more the personal one of deciding what they want to do, since algorithms were what they were passionate about.

    • gazter@aussie.zone · 4 months ago

      In my comment elsewhere in the thread I talk about how, as a complete software noob, I like to design programs by making a flowchart first, and how I wish the flowchart itself was the code.

      It sounds like what I’m doing might be (super basic) programming architecture? Where can I go to learn more about this?

      • fidodo@lemmy.world · 4 months ago

        Look up visual programming languages. When you apply a visual metaphor to programming, it really is just very detailed, complex flowcharts.

  • rottingleaf@lemmy.zip · 4 months ago

    I think this is bullshit as far as LLMs go, but making generative tools more and more high-level and understandable for their users is a good thing.

    Like the various visual programming tools where you sketch something that works by connecting blocks (like PureData for sound), or LabVIEW; and in MATLAB, I think, you can use such constructors to generate code for the specific controllers involved in the scheme.

    Or like HyperCard.

    Not that anybody should stop learning anything. There’s a niche for every way to do things.

    I just like that class of programs.

    • gazter@aussie.zone · 4 months ago

      As someone who’s had a bit of exposure to PLCs and ladder logic, and dabbled in some more ‘programming’ type languages, I would love to find some sort of ‘language’ that fits together like ladder logic, but for more computery type applications.

      I like systems, not programs. Most of my software design is done by building a flowchart, then stumbling around trying to figure out how to turn that flowchart into code. I feel it would be so much easier if I could just make the flowchart be the code (something like the sketch below).

      I want a grown-up Scratch.
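
      For what it’s worth, you can get a crude version of “the flowchart is the code” in an ordinary language by storing the chart as plain data and walking it. Here’s a toy Python sketch of that idea; every name in it is invented for illustration, and real visual tools do far more:

      ```python
      # Each node is an action box, a decision diamond, or an end point.
      # The whole program is a data structure you could render as a diagram.
      flowchart = {
          "start":     {"kind": "action", "do": lambda ctx: ctx.update(count=0),
                        "next": "check"},
          "check":     {"kind": "decision", "test": lambda ctx: ctx["count"] < 3,
                        "yes": "increment", "no": "done"},
          "increment": {"kind": "action", "do": lambda ctx: ctx.update(count=ctx["count"] + 1),
                        "next": "check"},
          "done":      {"kind": "end"},
      }

      def run(chart, node="start"):
          """Walk the chart from `node` until an end node is reached."""
          ctx = {}  # shared state the action boxes read and write
          while chart[node]["kind"] != "end":
              step = chart[node]
              if step["kind"] == "action":
                  step["do"](ctx)
                  node = step["next"]
              else:  # decision: follow the yes/no branch
                  node = step["yes"] if step["test"](ctx) else step["no"]
          return ctx

      print(run(flowchart))  # {'count': 3}
      ```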

      • rottingleaf@lemmy.zip · 4 months ago

        In some sense this is regressive, but I agree that ladder logic is more intuitive.

        I hated drawing flowcharts in university, but at this point I’ve learned that if you understand what you’re doing, you can draw a flowchart. If you don’t, you shouldn’t have written that program.

        So yeah.

        I think the environment used to program the “Buran” spacecraft had such a language (that is, they’d draw flowcharts instead of writing code).

  • Sibbo@sopuli.xyz · 4 months ago

    Founder of a company that makes major revenue by selling GPUs for machine learning says machine learning is good.

    • Murvel@lemm.ee · 4 months ago

      Yes, but Nvidia itself relies heavily on programmers. Without them, Nvidia wouldn’t have a single product. The fact that he makes these claims despite this is worth taking note of.

  • LainTrain@lemmy.dbzer0.com · 4 months ago

    I’m so sick of the hyper-push for specialization; it may be efficient, but it’s soul-crushing. IDK, maybe it’s ADHD, but I’d rather not do any one thing for more than 2 weeks.

    • wahming@monyet.cc · 4 months ago

      What hyper-push? I can’t think of a time in history when somebody with two weeks of experience was ever in demand.

      • LainTrain@lemmy.dbzer0.com · 4 months ago

        Less than 40 years ago, most people got jobs with no experience, or even no education, as long as they showed up and acted confident. Nowadays entry-level internships want MScs and years of work experience with something that was invented yesterday.

  • 3volver@lemmy.world · 4 months ago

    Don’t tell me what to do. Going to spend more time learning to code from now on, thanks.

  • Modern_medicine_isnt@lemmy.world · 4 months ago

    It’s not really about the coding; it’s about the process of solving the problem. And AI is very far away from being able to do that. The language you learn to code in is probably not the one you will use for much of your life. That choice will just be replaced by the choice of which AI you use to code.

  • Wooki@lemmy.world · 4 months ago

    This overglorified snake oil salesman is scared.

    Anyone who understands how these models work can see plain as day that we have reached peak LLM. It’s enshittifying on itself, and we are seeing its decline in real time in the quality of generated content. Don’t believe me? Go follow some senior engineers.

      • thirteene@lemmy.world · 4 months ago

        There is a reason they didn’t offer specific examples. LLMs can still scale by size, logical optimization, training optimization, and, more importantly, integration. The current implementation is reaching its limits, but the pace of growth is also very quick. AI reduces workload, but it is likely going to require designers and validators for a long time.

        • Wooki@lemmy.world · 4 months ago

          For sure, evidence is mounting that increased model size is not returning the expected quality. It has also had the larger net effect of enshittifying itself, with negative feedback loops running from training data through humans and back into training, quantified as a large declining trend in quality. It can only get worse as privacy, IP laws, and other regulations start coming into place. The growth this hype master is selling is pure fiction.

          • msage@programming.dev · 4 months ago

            But he has a lot of product to sell.

            And companies will gobble it all up.

            On an unrelated note, I will never own a new graphics card.

            • Wooki@lemmy.world · 4 months ago

              Secondhand is better value; still, the cost of new cards right now is nothing short of price fixing. You only need to look at the reduction in memory size since the A100 was released to know what’s happening to GPUs.

              We need serious competition. Hopefully Intel is able to provide it, but foreign competition would be best.

              • msage@programming.dev · 4 months ago

                I doubt that any serious competitor will bring any change to this space. Why would they? Everyone will scream ‘shut up and take my money’.

      • Wooki@lemmy.world · 4 months ago

        The Fediverse is sadly not as popular as we would like, so sorry, I can’t help here. That said, I follow some researchers’ blogs, and a quick search should land you some good sources, depending on your field of interest.

      • Wooki@lemmy.world · 4 months ago

        You asked a question that has already been answered. Pick your platform and you will find a lot of public research on the topic, even more so for programming specifically.