My thinking was that bees are the ones that pollinate, so male.
But they certainly don’t pollinate birds, so I don’t know where that was going either. 🫠


I feel like this isn’t really a new development. Back when LAN parties and local multiplayer were still a thing, games like TeeWorlds, Worms etc. were popular, because they ran on potatoes and you could often get them for free.
The actual fun then came from dicking around with or competing against your friends. The game itself does not need to be ground-breaking for that.
Hell, it technically started even earlier than that, with physical card games and board games and such. Just play them with friends and it’s fun.


I have caught myself genuinely thinking that management needs to unearth more budget, if they so desperately want us to use these AI tools, so that we can onboard another person to compensate for the productivity hit.
Unfortunately, they believe the opposite to be true, that we just need to use AI tools and then our productivity will be through the roof…


I have not looked into these myself yet, but Apertus is supposed to be fully open: https://programming.dev/post/36791696
And I recently heard of StarCoder, which is also said to be fully open and which is optimized for coding assistance: https://github.com/bigcode-project/starcoder


I also always find that outsourcing is risky, whether it’s to other devs or to some AI, because it requires that you understand the whole problem upfront. In 99% of cases, when I’m implementing something myself, I will run into some edge case I had not considered before, where an important decision has to be made. And well, a junior or LLM is unlikely to catch all these edge cases or to make larger decisions that might affect the whole codebase.
I can try to spend more time upfront coming up with all these corner cases without starting on the implementation, but that quickly stops being economical, because it takes me more time than when I can look at the code.


Yeah, but still not in agreement with the people who live/d there…


You know how the US likes to obliterate countries for oil? Yeah, our power companies like to do that for coal.


I mean, I don’t have a ton of skin in the game here, as I don’t care much for horror games either way.
But yeah, I just assume that they say they’re cautious to calm the fans, but they actually can’t be cautious, since well, they can only really delay by a whole year at a time, and if they do that, then they have two games in the year afterwards.
They only planned out a handful of years in advance, so maybe they can just delay the following games by a year each, too.
But yeah, it still just sounds like the decision-making here isn’t driven by logic or what allows publishing good games, but rather by
[image]


Oh man, and they’re gonna want to release in autumn, too, to be in time for spooky season. So, if it isn’t done at that point, they’re likely to release in an unfinished state rather than delay by a whole year…


I mean, for me, it’s also mostly a matter of us doing embedded(-adjacent) software dev. So far, my company would hardly ever choose one stack over another for performance/efficiency reasons. But yeah, maybe that is going to change in the future.


> Large shared codebases never reflect a single design, but are always in some intermediate state between different software designs. How the codebase will hang together after an individual change is thus way more important than what ideal “north star” you’re driving towards.
Yeah, learned this the hard way. Came up with an architecture to strive for 1½ years ago. We shipped the last remaining refactorings two weeks ago. It has been a ride. Mostly a ride of perpetually being low-priority, because refactorings always are.
In retrospect, it would’ve likely been better to go for a half-assed architecture that requires less of a diff, while still enabling us to ship similar features. It’s not like the new architecture is a flawless fit either, after 1½ years of evolving requirements.
And ultimately, architecture needs to serve the team. What does not serve the team is 1½ years of architectural limbo.


I mean, don’t get me wrong, I also find startup time important, particularly with CLIs. But high memory usage slows down your application in other ways, too (not just other applications on the system). You will have more L1, L2 etc. cache misses. And the OS is more likely to page/swap out more of your memory onto the hard drive.
Of course, I can’t sit in front of an application and tell that a non-local NUMA memory access caused a particular slowness either, so I can understand not really being able to care about iterative improvements. But yeah, that is also why I quite like using an efficient stack outright. It just makes computers feel as fast as they should be, without me having to worry about it.
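To make the cache point a bit more concrete, here’s a contrived Rust sketch (N and the two layouts are made up purely for illustration; exact numbers will vary by machine):

```rust
use std::time::Instant;

fn main() {
    const N: usize = 10_000_000;

    // Compact layout: all values sit contiguously in one allocation,
    // so sequential iteration mostly hits the CPU caches.
    let compact: Vec<u64> = (0..N as u64).collect();

    // Pointer-heavy layout: every value gets its own heap allocation.
    // Same data, several times the memory, scattered across the heap,
    // so the same iteration suffers far more cache misses.
    let boxed: Vec<Box<u64>> = (0..N as u64).map(Box::new).collect();

    let t = Instant::now();
    let sum: u64 = compact.iter().sum();
    println!("compact: {:?} (sum {sum})", t.elapsed());

    let t = Instant::now();
    let sum: u64 = boxed.iter().map(|b| **b).sum();
    println!("boxed:   {:?} (sum {sum})", t.elapsed());
}
```

Both loops do the same arithmetic; the boxed version is slower purely because its data is bigger and scattered across the heap.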
I heavily considered ending this comment with this dumbass meme:
[image]
Then I realized, I’m responding to someone called “Caveman”. Might’ve been subconscious influence there. 😅


I don’t know what part of that is supposed to be an insult.
And the article may have talked of such stark differences, but I didn’t. I’m just saying that the resource usage is noticeably lower.


Yeah, you need to do tree-shaking with JavaScript to get rid of unused library code: https://developer.mozilla.org/en-US/docs/Glossary/Tree_shaking
I would expect larger corporate projects to do so. It is something that one needs to know about and configure, but if one senior webdev works on a project, they’ll set it up pretty quickly.


This isn’t Reddit. You don’t need to talk in absolutes.
Similar to WittyShizard, my experience is very different. Said Rust application uses 1200 dependencies and I think around 50 MB RAM. We had a Kotlin application beforehand, which used around 300 dependencies and 1 GB RAM, I believe. I would expect a JavaScript application of similar complexity to use a similar amount or more RAM.
And more efficient languages do have an effect on RAM usage. For example, a lazy Rust iterator chain (.iter() + .collect()) only allocates the final collection, whereas every step of a JavaScript .map()/.filter() chain allocates a whole new intermediate array.
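Here’s a minimal, contrived sketch of what I mean (not from our actual codebase):

```rust
fn main() {
    let data: Vec<u32> = (0..1_000_000).collect();

    // The chain below is lazy: no intermediate Vec is materialized for
    // the .filter() or .map() steps. Only the final .collect() allocates,
    // once, for exactly the surviving elements.
    let halved_evens: Vec<u32> = data
        .iter()
        .filter(|&&x| x % 2 == 0)
        .map(|&x| x / 2)
        .collect();

    println!("{} elements left", halved_evens.len());
}
```

The filtering and halving happen in one pass, and only one Vec ever exists besides the input.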


Yeah, gonna be interesting. Software companies working on consumer software often don’t need to care, because the resource costs land on their customers’ hardware rather than on their own.
I can see somewhat of a shift happening for software that companies develop for themselves, though. At $DAYJOB, we have an application written in Rust and you can practically see the dollar signs lighting up in the eyes of management when you tell them “just get the cheapest device to run it on” and “it’s hardly going to incur cloud hosting costs”.
Obviously this alone rarely leads to management deciding to rewrite an application/service in a more efficient language, but it certainly makes them more open to devs wanting to use these languages. Well, and who knows what happens, if the prices for Raspberry Pis and cloud hosting and such end up skyrocketing similarly.


The problem is that it sounds like a riddle. In a riddle, you’re traditionally supposed to work within the rules that you’ve been told. So, not thinking outside the box here is not an indication that the person isn’t capable of doing so.
Of course, if I encountered this problem in real life, I’d ask Carol from accounting to check the other room, while I flip the switches. But my instinctive answer was that it is not possible, because I assumed it to be a riddle and the provided rules did not allow a solution.



I almost expected someone to learn that just from me posting. 😅
Basically, OpenOffice used to be organized by Sun Microsystems. Then Sun got bought by Oracle back in 2010.
Oracle does not have a good reputation at all, so the OpenOffice devs from back then figured they’d need to take things into their own hands and set up The Document Foundation to organize further development. But the OpenOffice trademark was owned by Sun/Oracle, so they had to rename and get a new homepage and everything. The name they chose is LibreOffice: https://www.libreoffice.org/
After the OpenOffice project was effectively dead, Oracle handed it and its trademark over to the Apache Foundation, where it’s seeing occasional bug fixes. But to my knowledge, they don’t even have the capacity to fix all the security problems.
All the actual feature development happens over on the LibreOffice side.
So, in practice, if you want OpenOffice, what you really want is LibreOffice.


Yeah, not great. You always hope that projects under a larger foundation, like GNOME, have a higher bus factor, but unless that foundation has disposable income to pay someone, you’re ultimately still reliant on volunteers, and not many people volunteer for maintenance.
What the foundation can do, though, which is also really important, is to hand over the keys to a new maintainer, should you disappear overnight.
Like, yeah, forking is great, but some people will never learn of the fork. It happens about once a year that I find someone online who’s still using OpenOffice and that project has been practically dead since 2011.
So, I do hope we can get more open-source projects under some sort of umbrella. No idea how to actually do that, though. I also have open-source projects where I would not even know where to start to get them under some organization…


This post made me realize I’ve only ever heard “the birds and the bees” referenced, but never actually heard how it’s applied during sex ed.
But uh, turns out it doesn’t make much sense in that context either. It’s just two separate examples used to explain sexuality: bees pollinating flowers and birds laying eggs. They were picked because they’re visible in nature and somewhat resemble the mechanics of sex.
https://en.wikipedia.org/wiki/The_birds_and_the_bees