

AI AI blah blah AI.
Also why is HCL supposedly the 9th most popular “programming language” (which it isn’t anyway)?




There are some examples in the very first list I found googling for “cancel culture examples”.
Not all of them are political (e.g. cancelling someone for sexual assault is clearly not, and that Heineken one… how??), but a decent number are, e.g. number 6 is about as partisan as you can get.


It’s a fairly inevitable reaction to cancel culture. This was predicted and warned against when left-wing cancel culture was at its height, but people didn’t listen. Now we have right-wing cancel culture instead.


I wouldn’t recommend the Gang of Four book. Many of the design patterns it espouses are overcomplicated relics from the days of peak OOP. You know, FactoryFactoryVisitor stuff. Usually best avoided.


Yeah, I use Claude/ChatGPT sometimes.
I haven’t got around to setting up any of that agentic stuff yet. Based on my experience of the chat stuff I’m a bit skeptical it will be good enough to be useful on anything of the complexity I work on. Fine for CRUD apps, but it’s not going to understand niche compiler internals or do stuff with WASM runtimes that nobody has ever done before.


He’s right: zstd is incredibly popular, widely used, and generally considered the best general-purpose compression algorithm overall.


They use QAM and similar because it’s the best way to transmit data over a small number of long wires. Exactly the opposite of wires inside a CPU.
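To make the trade-off concrete: the point of QAM is to pack more bits into every symbol sent down a wire, which pays off when wires are scarce and long. A rough sketch of the arithmetic (plain math, nothing protocol-specific):

```python
import math

# A square M-QAM constellation has M distinct amplitude/phase points,
# so each transmitted symbol encodes log2(M) bits on a single wire pair.
def bits_per_symbol(m: int) -> int:
    return int(math.log2(m))

for m in (4, 16, 64, 256):
    print(f"{m}-QAM: {bits_per_symbol(m)} bits/symbol")
```

Inside a CPU you have thousands of short cheap wires, so simple binary signaling per wire wins instead.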


This video confuses at least three different concepts - quantum uncertainty, ternary computers, and “unknown” values.
Ternary computers are just not as good as binary computers. The way silicon works, they’re always going to be much, much slower.
“Unknown” values can be useful - they’re common in SystemVerilog, for example. But you rarely have just true, false and unknown, so it makes zero sense to bake that into the hardware. Verilog has 4 values - 0, 1, X (unknown) and Z (disconnected). VHDL’s std_logic has 9!
And even then, the “unknown” isn’t as great as you might think. It’s basically a poor man’s symbolic execution and can’t cope with things like let foo = some_unknown_value ? true : true. Yes, that does happen, and you won’t like the “solution”.
High level programming concepts like option will always map more cleanly onto binary numbers.
Overall, a very confused video that tries to make it sound like there’s some secret forgotten architecture or alternative history, when there definitely isn’t.
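To show the mux problem above, here’s a toy sketch of Kleene-style three-valued logic. This is my own illustration, not any real simulator’s exact semantics, and the names mux_naive/mux_merging are made up:

```python
X = "x"  # the "unknown" value

def mux_naive(sel, a, b):
    # Pessimistic: an unknown select yields unknown,
    # even when both branches agree ("X-pessimism").
    return X if sel is X else (a if sel else b)

def mux_merging(sel, a, b):
    # Verilog-style merge: on an unknown select, keep the
    # value only if both branches already agree.
    if sel is X:
        return a if a == b else X
    return a if sel else b

print(mux_naive(X, True, True))    # x    <- the annoying case
print(mux_merging(X, True, True))  # True <- the "solution"
print(mux_merging(X, True, False)) # x    <- genuinely unknown
```

The merge rule rescues the some_unknown_value ? true : true case but only by comparing both branches bit by bit, which stops working the moment the branches differ in bits you didn’t care about.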


Yeah I’m watching Ty. Pytype and Pyre are not serious options. Nobody really uses them, and Pytype is discontinued. Facebook have a new project called Pyrefly that’s also worth watching.
But for now, use Pyright. No argument. If you’re really worried about Microsoft (and not Facebook or Google for some reason) then use BasedPyright.
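For anyone who hasn’t used a checker, this hypothetical example shows the kind of bug Pyright (or any of the others) flags before you run anything:

```python
# The annotations tell the checker what this function accepts.
def mean(values: list[float]) -> float:
    return sum(values) / len(values)

print(mean([1.0, 2.0, 3.0]))  # fine: 2.0

# Pyright rejects this call statically, because a str is not a
# list[float]. At runtime it would crash inside sum() instead:
# mean("123")
```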


I would say:
Just practice and do projects. Also, work on projects with other people if you can, because you’ll read a lot of bad code and learn how not to do things (hopefully).
Learn lots of programming languages. They often have different and interesting ways of doing things that can teach you lessons that you can bring to any language. For example Haskell will teach you the benefit of keeping functions pure (and also the costs!).
If you only know Python I would recommend:
Learn Python with type hints. Run Pyright (don’t use mypy; it sucks) on your project and get it to pass.
Go is probably a sensible next step. Very quick to learn but you’ll start to learn about proper static typing, multithreading, build tools (Go has the best tooling too so unfortunately it’s all downhill from here…), and you can easily build native executables that aren’t dog slow.
C++ or Rust. Big step up but these languages (especially C++) will teach you about how computers actually work. Pointers, memory layouts, segfaults (in C++). They also let you write what we’re now calling “foundational software” (formerly “systems software” but that was too vague a term).
Optionally, if you want to go a bit niche, one of the functional programming languages like Haskell or OCaml. I’d probably say OCaml because it’s way easier (it doesn’t force everything to be pure). I don’t really like OCaml so I wouldn’t spend too much time on this but it has lots of interesting ideas.
Final boss is probably a dependently typed language like Lean or Idris. Pretty hardcore and not really of much practical use if you aren’t writing software that Must Not Fail Ever. You’ll learn loads about type systems though.
Also read programming articles on Hacker News.


Clean Code was pretty effectively debunked in this widely shared article from 2020. We probably don’t need to talk about it anymore.
Frankly I’m surprised it was ever recommended. Some of the things it says are so obviously insane, why would anyone think it was good?
My only guess is the title? “Your code sucks; maybe read this book that I haven’t vetted about clean code.” sort of thing?
I’d say it would be good to have a modern replacement with good advice to recommend… But in my experience you can’t really learn these things by reading about them. You have to experience it (and have good natural taste).
This list of code smells is pretty decent at least: https://luzkan.github.io/smells/


I use Google Slides - you get better control over the layout.


All three of those languages have library ecosystems at least as good as Python’s. Typescript is just as easy to learn and as fast to write as Python. I don’t see why you’d think Python is faster. If I add up all the time I’ve lost to Python’s terrible tooling it’s quite a lot slower!
Rust is definitely harder to learn - I’ll give you that. But once you’ve learnt it, it’s just as fast to write as Typescript and Python, especially if your “fast to write” metric measures the time until your program is correct.


I think Python is superficially easier since you don’t have to declare variables, printing is a little easier, etc. And in some ways it is actually easier, e.g. arbitrary precision integers, no undefined, less implicit type coercion.
But I agree JavaScript is generally a better choice. And it is actually more popular than Python so…
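On the arbitrary-precision point: Python ints never silently lose precision, whereas a JavaScript Number is an IEEE 754 double. A quick check of where that bites:

```python
# Python ints are arbitrary precision - no overflow, no rounding.
big = 2**53 + 1
print(big)           # 9007199254740993, exact
print(big == 2**53)  # False, as it should be

# In JavaScript, 2**53 + 1 === 2**53 evaluates to true, because a
# double has only 53 bits of mantissa and can't represent 2**53 + 1.
```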


I think it’s just because it is always recommended as an “easy” language that’s good for beginners.
The only other thing it has going for it is that it has a REPL (and even that was shit until very recently), which I think is why it became popular for research.
It doesn’t have anything else going for it really.
uv is a lifesaver there, but even with uv it’s a bit of a mess. The actual syntax is not too bad really, but everything around it is.


I mean… C is a low bar. You can write Typescript, Rust and Go code 5x faster than C too.


pip is easily the worst thing about Python. But now that we have uv I would say the worst thing is the package/import system. I’m pretty sure only 1% of developers understand it, and it only really works properly if your Python code is a Python package.
If you treat Python as a scripting language and just scatter loose files around your project and run them directly, it doesn’t work at all. Pain everywhere. Which is dumb as fuck because that’s like 80% of how people use Python.
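A minimal sketch of the failure mode, with a hypothetical layout: running a file inside a package directly breaks its relative imports, while running it as a module works.

```
myproject/
  mypkg/
    __init__.py
    util.py    # def helper(): ...
    main.py    # from .util import helper

$ python mypkg/main.py
ImportError: attempted relative import with no known parent package

$ python -m mypkg.main    # run as a module from myproject/ - works
```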


“Very easy to install”
This has to be a joke.


Indeed, but there’s no need to shit on people using floats because in almost all cases they are fine too.
Thanks for highlighting your username - made me notice that you post a lot of nonsense here so I can easily block it!
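On the floats point above, the classic gotcha and the reason it’s still usually fine (compare with a tolerance instead of exact equality):

```python
import math

print(0.1 + 0.2)                     # 0.30000000000000004
print(0.1 + 0.2 == 0.3)              # False - exact comparison bites
print(math.isclose(0.1 + 0.2, 0.3))  # True - tolerance-based comparison
```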