



If you build for a containerised environment, standing up your service in Kubernetes with HPA gives you all the scalability (and potentially cost) benefits of serverless without all the drawbacks.
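For what it’s worth, the simplest version of that setup is just a Deployment with a HorizontalPodAutoscaler attached. A minimal sketch using the kubectl shorthand (the deployment name and the CPU/replica thresholds here are made-up placeholders):
$ kubectl autoscale deployment my-service --cpu-percent=70 --min=2 --max=20
$ kubectl get hpa
(Kubernetes then scales the replica count up and down with load, which is where the serverless-style elasticity comes from.)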
“Oh hi! Here’s some code. I didn’t write it and don’t understand it, but you should totally run it on your machine.”


Oh noes! Whatever will we do if we can’t keep building ridiculous emotional support trucks like Land Rovers??
I fucking haaaaaaaate cars, but I’ll take a fleet of smaller, cheaper EVs over the filthy, dangerous, antiquated, artificially propped-up monstrosities we’re dealing with now.
The bit of information you’re missing is that du aggregates the size of all subfolders, so when you say du /, you’re saying: “how much stuff is in / and everything under it?”
If you’re sticking with du, then you’ll need to traverse your folders, working downward until you find the culprit folder:
$ du /*
(Note which folder looks the biggest)
$ du /home/*
(If /home looks the biggest)
… and so on.
The trouble with this method, however, is that * won’t match folders with a . in front, which are often the culprits: .cache, .local/share, etc. For those, you can do:
$ du /home/.*
That should do the job, with one caveat: the .* glob also matches . and .. (the parent directory), so ignore those entries in the output.
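For what it’s worth, here’s a sketch of a single command that covers both the normal and the dot-prefixed folders (assuming GNU du and sort: -x stays on one filesystem, -s summarises each argument, -h prints human-readable sizes, and the .[!.]* glob skips . and ..):
$ du -xsh /home/* /home/.[!.]* 2>/dev/null | sort -h
(The biggest offenders end up at the bottom of the list.)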
If you’ve got a GUI though, things get a lot easier 'cause you have access to GNOME Disk Usage Analyzer, which will draw you a fancy graphical map of your filesystem all the way down to the smallest folder. It’s pretty handy.


Plus the FF extension is really full-featured. I can clip in different formats or even take a screenshot if the webpage makes clipping hard.
I didn’t even know there was a Firefox extension! I might give it a look.
I was a Windows user as a kid in the 80s & 90s doing pirate installs of 3.11 and later 95 for friends and family. I got into “computers” early and was pretty dedicated to the “Windows is the best!” camp from a young age. I had a friend who was a dedicated Mac user though, and she was bringing me around. The idea of a more-stable, virus-free desktop experience was pretty compelling.
That all changed when I went to school and had access to a proper “Mac lab” though. Those motherfuckers crashed multiple times an hour, and took the whole OS with them when they did it. What really got to me though was the little “DAAAAAAAAAAA!” noise it would make when you had to hard reboot it. It was as if it was celebrating its inadequacy and expected you to participate… every time it fucked you over and erased your work.
So yeah, Macs were out.
I hadn’t even heard of Linux in 2000 when I first discovered the GPL, which (for some reason) I conflated with GNOME. I guess I thought that GNOME was a new OS based on what I could only describe as communist licensing. I loved the idea, but was intimidated by the “ix” in the name. “Ix” meant “Unix” to me, and Unix was using Pine to check email, so not a real computer as far as I was concerned.
It wasn’t until I joined a video game company called “Moshpit Entertainment” in 2000 that I actually tried it. You see, the CEO, CTO, and the majority of the tech people at Moshpit were huge Linux nerds, and they indoctrinated me into their cult. I started with SuSE (their favourite), then Red Hat, then used Gentoo for 10 years before switching to Arch for another 10+.
TL;DR: Anticapitalism and FOSS cultists led me into the light.
What exactly is an external drive case? Are you just talking about a USB enclosure for a single drive or something that can somehow hold multiple drives and interface over something more stable than USB?


Joplin will do this for you. It comes ready to sync with all sorts of cloud options, as well as “local folder” which works well with Syncthing. It’s offline-first, cross-platform, and FOSS.


It’s the expectation.
Boris and Truss were abject morons and Sunak was an insulated, rich Tory. They were expected to be terrible and so we weren’t surprised.
Starmer won in a landslide victory for Labour and went about screwing the poor, arresting old ladies, and presiding over genocide. Conservatives hate him because he’s on the Red Team, and the Left hate him because he acts like a Tory.
If he’d run as a Tory, he’d be polling higher than anyone since Cameron, but he was supposed to fix the mess, not make it worse.


The opposition to ID cards themselves in this country is very strange to me. Most civilised countries have some sort of national identification number/card that can be used to access government services, and having worked inside the UK government, I can tell you what a total nightmare it is to develop services for Britons without such a unique ID.
What kills me is that the ID card debate here inevitably seems to focus on the existence of a card rather than on what the government wants to attach to it, like biometric data, or pairing it with an app with invasive permissions. You need an ID number for me so that I can be identified when accessing government services; you don’t need to keep a record of every time I boarded a train, or any of that other surveillance nonsense.
This country is so used to government surveillance that people automatically assume “ID card means more tracking” and end up opposing an ID that’d save the country mountains of cash and hassle if used properly, rather than objecting to the tracking that already exists (have you seen Oxford Street?).


Fuck yes. More of this please.


…or contribute to Mozilla’s work while getting something in return.
That’s exactly the reasoning Google has followed with its development and promotion of webp. Unfortunately, whether the website cares or not, CO₂ emissions are markedly higher due to increased client energy consumption, and that does directly affect you, so it’s worth considering the implications of using webp in a popular site.
Webp is pretty great actually. Support for a proper alpha channel (32-bit RGBA) means I’ve actually managed to reduce the file sizes of what were formerly PNGs by something like 80%, which drastically improved performance (and the size of my project). I don’t get where the complaint about image quality came from either, as it seems to perform better than JPEG at the same file size.
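For anyone wanting to try the same thing, the conversion can be done with Google’s cwebp tool; a rough sketch (the file names are placeholders, -q sets lossy quality, and -lossless keeps pixels and alpha exactly as they were):
$ cwebp -q 80 sprite.png -o sprite.webp
$ cwebp -lossless sprite.png -o sprite-lossless.webp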
The worst part is that you missed the real problem with the format: the CPU overhead (and therefore the energy cost) of decoding it. On a high-traffic site, that extra work is multiplied across the thousands/millions of clients processing images every day, which places a drain on the grid and bumps up CO₂ (yes, this is really a thing that people measure now).
Basically Google invented the format to externalise their costs. Now, rather than footing the bill for bigger datacentres and greater bandwidth, they made everyone else pay for decompression.


Serious question: could we not just fork the project under the GPL and use that?


I downvoted this, largely because, while you may have identified the photo correctly, the way you’ve done so here makes it sound like Berry has made a mistake, when there’s no evidence of that in the article.
From her X post (why the fuck are these people still on X?):
Confirmed by staff, and removed swiftly by event organisers.


Fuck that. Roll out the red carpet for him and then chuck him in prison.
“Oh no! Not Sandringham!” - Andrew, probably.