• elgordino@fedia.io · 59 points · 2 days ago

    “Our North Star is ‘1 engineer, 1 month, 1 million lines of code.’

    What could possibly go wrong

    • ashughes@feddit.uk · 10 points · 1 day ago (edited)

      Assuming a full-time engineer works 20 days a month, that’s 50,000 lines of code a day.

      Assuming an 8 hour work day, that’s 6,250 lines of code per hour, or 104 lines of code per minute.
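
      A quick sanity check of that arithmetic, as a minimal sketch (the 20-day month and 8-hour day are just the assumptions above):

      fn main() {
          // Assumed figures: 1,000,000 lines per engineer per month,
          // 20 working days per month, 8 working hours per day.
          let lines_per_month = 1_000_000.0_f64;
          let lines_per_day = lines_per_month / 20.0;   // 50,000
          let lines_per_hour = lines_per_day / 8.0;     // 6,250
          let lines_per_minute = lines_per_hour / 60.0; // ~104
          println!("{} per day, {} per hour, {:.0} per minute",
              lines_per_day, lines_per_hour, lines_per_minute);
      }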

      This is humanly impossible without using AI and automation at every stage of the process. Good luck with that.

      I’m guessing where we’re headed is software “engineers” becoming AI prompt “engineers” for design, development, review, testing, and shipping.

      Buckle up, shit’s gonna get wild.

    • msage@programming.dev · 3 points · 1 day ago

      I mean, I don’t see the issue.

      Producing such amounts of code with AI should be as easy as pie.

      Obviously, judging by the many outages and bugs present in Windows and other Microsoft products, they could not give two shits about quality. And quantity is easily measured, and easily achieved with the Enormous Bullshit Spewer.

      I always wished for Microsoft to die, and if this is the way, I will clap for their efforts.

  • kibiz0r@midwest.social · 47 points, 1 down · 2 days ago

    Tech bosses have well and truly lost it.

    Consider that:

    • Code is primarily to communicate from human-to-human, and only incidentally for computers to execute
    • A codebase that is 30+ years old has an absolute shitload of learnings incorporated into it, much of it very subtle
    • Languages are, in fact, different, so some things cannot be directly translated with exactly the same semantics; devs will need to fully understand the intent and resolve ambiguities
    • A million lines per month is a lot of text for someone to successfully interpret and translate without losing any subtleties
    • SpicyLizards@reddthat.com · 30 points · 2 days ago (edited)

      And 1 million lines to truly review. Reviewers rely on their own heuristics, built around common and critical mistakes, to find errors. I reckon AI errors won’t follow familiar patterns, making reviews even more tedious.

        • SpicyLizards@reddthat.com · 3 points · 1 day ago

          Well, it being the job and all, yes. But it also sets expectations, and there can’t be any based on that nothing of a job ad. Wouldn’t touch that with a 10ft pole.

      • kibiz0r@midwest.social · 23 points · 2 days ago

        Yeah…

        Human mistakes tend to 1) look like mistakes, and 2) be surrounded by lots of hints that the author had trouble with that section of code.

        AI mistakes tend to 1) look like regular code, and 2) look just as confident and effort-ful as the rest of the code.

    • FizzyOrange@programming.dev · 2 points, 7 down · 1 day ago

      Code is primarily to communicate from human-to-human, and only incidentally for computers to execute

      Uhm what? No. That is a stupid thing to say. It is primarily intended for computers to execute, but in a way that humans can understand.

      • Mr. Satan@lemmy.zip · 10 points · 1 day ago

        It’s definitely for humans first and computers second. Compiled machine code is for computers; everything else is a tool so that humans don’t have to deal with machine code. An abstraction made by humans, for humans to use.

        This is one of the issues I see with LLMs for code: instead of engineering and leveraging machine learning for optimizing specific problems, we’re now forcing text prediction engines to write human-oriented text that happens to be a programming language.

        • FizzyOrange@programming.dev · 2 points, 4 down · 1 day ago

          This is stupid. The whole point of programming is to make computers do things. Before computers, “code” was just hand-wavy equations. Sum from 1 to n stuff.

          Yes it is designed so that humans can understand it, but the point is to make computers do stuff. Very obviously.

          • Mr. Satan@lemmy.zip · 3 points · 22 hours ago

            You wouldn’t mind writing machine code then? Ok, I’ll give you assembly. It’s all that’s needed to tell a computer what to do.

            • FizzyOrange@programming.dev · 1 point, 2 down · 21 hours ago

              Of course I wouldn’t write in raw machine code, or even assembly. We invented higher level languages that are more powerful and easier for humans to use…

              But the purpose is still to make machines do stuff!!! I’m not just writing code so that other humans can marvel at my algorithms.

              This is so freaking dumb.

              • Mr. Satan@lemmy.zip · 3 points, 1 down · 21 hours ago

                easier for humans to use…

                And that is my point. The primary purpose for all these abstractions is for humans to use. It’s first and foremost designed to be read and understood by humans, to make programming easier for humans.

                • FizzyOrange@programming.dev · 2 points · 10 hours ago

                  I mean yeah I guess that’s its primary purpose if you totally ignore the fundamental thing it’s meant to be doing.

                  It’s like saying the primary purpose of a seatbelt is to be easy to fasten and unfasten.

          • kibiz0r@midwest.social · 1 point · 1 day ago (edited)

            Make computers do stuff for what purpose?

            I joke to my family that I just name things for a living. When you take away all the incidental stuff like files and pointers and ports, that’s really all it is. “This sequence of events with these properties is called <this>, and when you ask our system what to do about it, it does this other sequence of events with these properties which we call <this other name>.”

            It’s kinda like those ancient stone tablets that are the first example of writing, and they’re just like “Ramses owes Jeremiah 5 chickens” or whatever. It’s just how we manage abstract concepts moving around our civilization. Yeah there’s math involved, but every endpoint is a human being in one way or another.

            • FizzyOrange@programming.dev · 1 point · 23 hours ago

              Make computers do stuff for what purpose?

              For whatever task you’re trying to get them to do. Predict the weather, solve an equation, format a document, etc. Computers can do useful things. We program them so that they do those things.

              This is the most ELI5 thing I’ve ever written. If you actually understand programming and you don’t realise that it exists to make computers do things then you’re surely just being deliberately obtuse.

  • Aleko Treko@lemmygrad.ml · 2 points · 1 day ago

    Glad I bailed out of Microshit when they announced the EoL of Win10. I’d rather be a hobo than use the un-operating system called Win11.

  • HexesofVexes@lemmy.world · 17 points, 2 down · 2 days ago

    The rust part, if done well, would be a good step.

    Then again, coding in rust is a pain, and given how young the language is, AI is unlikely to handle it well; nor is there enough rust expertise around to fix what breaks.

    • leftzero@lemmy.dbzer0.com · 7 points · 1 day ago

      The rust part, if done well, would be a good step.

      We’re talking about machine translating the whole codebase from C to Rust the way one would translate a book from one language to another, here (i.e., without the machine actually understanding a single word, much less long groups of them).

      The result will basically be something like this:

      unsafe {
       //insert the entire codebase here,
       //mangled beyond recognition
      }
      
      • reabsorbthelight@lemmy.world · 2 points · 1 day ago

        They should honestly just wrap the majority of the codebase in Rust unsafe blocks and then slowly, very slowly, migrate sections of the code to Rust. This is the right way to do it imo
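
        A minimal sketch of what that incremental approach could look like, assuming the legacy C code is reachable through a C ABI (the function name and signature here are made up for illustration):

        // Keep calling the existing C implementation through FFI, behind a
        // safe Rust wrapper; the body can later be replaced with pure Rust
        // without touching any callers.
        use std::os::raw::c_int;

        extern "C" {
            // Hypothetical legacy C function, left as-is for now.
            fn legacy_checksum(data: *const u8, len: usize) -> c_int;
        }

        pub fn checksum(data: &[u8]) -> i32 {
            // unsafe is confined to this single call site; the rest of the
            // codebase only ever sees the safe Rust API.
            unsafe { legacy_checksum(data.as_ptr(), data.len()) as i32 }
        }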

        Will they do that? Nope.

    • addie@feddit.uk · 15 points, 2 down · 2 days ago

      Indeed.

      In some ways, this kind of thing is ideal for Rust. It’s at its best when you’ve a good idea of what your data looks like, and you know where it’s coming from and going to, and what you really want is a clean implementation that you know has no mistakes. Reimplementing ‘core code’ that hasn’t changed much in twenty years to get rid of any foolish overflows or use-after-free bugs is perfect for it.
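
      For instance, the textbook use-after-free shape gets caught at compile time rather than at runtime; a minimal sketch that deliberately fails to build:

      fn main() {
          let data = vec![1, 2, 3];
          let first = &data[0];  // borrow a reference into the vector
          drop(data);            // free it while the borrow is still alive...
          println!("{}", first); // ...and the borrow checker rejects the program
      }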

      Using Rust for exploratory coding, or when the requirements keep changing? I think you’ve picked the wrong tool for the job: invalidate a major assumption and you have to rewrite the whole damn thing. And like you say, an important consideration for big projects is choosing a tool that a lot of people will be able to use. And Windows is very big.

      They’re smoking crack, anyway. A million lines per dev per month? When I’m doing major refactoring, a couple thousand lines per week in the same language, mostly moving existing stuff into a new home, is a substantial change. Over a hundred times that, with a major language conversion on top? Get out of here.

    • UnfortunateShort@lemmy.world · 7 points, 1 down · 2 days ago

      I mean, Rust has the massive upside that, in many cases, it won’t compile if you fuck things up. Then again, embedded or generally low-level driver-y stuff is still in its infancy in Rust. Relative to C/C++ that is.

      There is stuff that you need that has no official Rust support. There is poor documentation and there are half-baked frameworks. There are examples that are silently outdated, breaking changes between framework versions, and nightly versions from GitHub mixed in to fix them. And then of course there are plenty of timing- and hardware-dependent things you will need to do yourself.

      I do this for a living and personally tried to use AI here and there to help me out, but oftentimes it fails miserably. Not always, but very often.

      • MashedTech@lemmy.world · 10 points · 2 days ago

        Rust helps you resolve memory bugs, not logic bugs. Yeah, it’s going to be new memory-safe code… but it won’t be bug-free code.
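
        A minimal illustration of that distinction (the function and its bug are made up):

        // Compiles cleanly and is perfectly memory safe, but the logic is
        // wrong: the loop bound silently skips the last element.
        fn sum_prices(prices: &[u32]) -> u32 {
            let mut total = 0;
            for i in 0..prices.len().saturating_sub(1) { // off-by-one logic bug
                total += prices[i];
            }
            total
        }

        fn main() {
            assert_eq!(sum_prices(&[1, 2, 3]), 6); // fails at runtime: returns 3
        }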

          • MashedTech@lemmy.world · 3 points · 1 day ago

            Oh wait… It’s AI. Just to pass the compiler and the unit tests, it will either cast to whatever it desires or just force the tests to pass.

        • The_Decryptor@aussie.zone · 2 points · 1 day ago

          It can help with logic bugs (e.g. by encoding the state machine logic directly in the type system, so an invalid transition won’t compile), and things like data sharing issues (Again, type system, tracks sharable objects vs. those that aren’t), but none of those are as “impervious” as the memory safety stuff.
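
          A minimal sketch of that typestate idea (the states and methods here are invented for illustration):

          // Each state is a distinct type, so an invalid transition such as
          // Closed -> Active has no method and simply will not compile.
          struct Closed;
          struct Open;
          struct Active;

          impl Closed {
              fn open(self) -> Open { Open }
          }
          impl Open {
              fn activate(self) -> Active { Active }
          }
          impl Active {
              fn close(self) -> Closed { Closed }
          }

          fn main() {
              let conn = Closed;
              let conn = conn.open().activate(); // valid path: Closed -> Open -> Active
              // let bad = Closed.activate();    // compile error: no `activate` on `Closed`
              let _ = conn.close();
          }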

          But all of that still requires rearchitecting, because if the existing code already follows those rules, it probably doesn’t suffer from those issues anyway (e.g. I know you can do the state machine type stuff in C# at least)

  • Avicenna@programming.dev · 11 points · 2 days ago (edited)

    This seems like a research project rather than a mainline policy change:

    "Just to clarify… Windows is NOT being rewritten in Rust with AI.

    My team’s project is a research project. We are building tech to make migration from language to language possible."

    Of course the end goal remains the same: try to produce systems that can be maintained by a significantly reduced number of programmers/software engineers. That can only work if AGI is actually achieved in the near future, so that the senior-coder gap produced by this approach can be filled by AGI coders. If not, I think we will enter an era of tech where software engineering first becomes highly undesirable (due to the reduced number of entry-level jobs) and then attains a god-like status, because there won’t be enough senior software engineers to support all the tech infrastructure built on AI coding agents, which are not fully autonomous because, unlike entry-level coders, they lack, and can’t learn, critical reasoning skills and software experience.