• 11 Posts
  • 315 Comments
Joined 2 years ago
Cake day: October 20th, 2023



  • A lot of people in graphic design and similar fields are contractors. They get hired for a job, do it with their own resources, and then move on. Those folks tend to need to provide their own software.

    Aside from that? Companies DO provide software. But, at least in my experience, early career staff decide they actually NEED matlab or some other super proprietary nonsense and take it upon themselves to get the tools they “need”. Which results in their manager having to have The Talk about why you don’t do that in an actual company, and how lucky they are that it was their manager who caught them, because that is a fireable offense.


  • Let’s say you are a graphics designer. You use Adobe Illustrator and you pirate it. You work for Innertrode either as a contractor or a full time employee. You make their new logo.

    Adobe’s legal team are bored. They see that new logo. They know it was made with Illustrator because of some of the visual quirks/tools (or, you know, because it is anything graphical so of course it uses Adobe). They know that Innertrode doesn’t have a license. So they call up Lumberg and say “what the fuck?”.

    Lumberg then calls the person who was in charge of the new logo and they point at you.

    If you are staff? You were given training not to pirate anything. It is all your fault. Innertrode buys a few years of a license and apologizes and fires your ass and makes sure to tell everyone they know about you. Or you are a contractor and you signed an agreement saying you had valid licenses for everything and they just give your contact info to Adobe and move on.

    And Adobe MIGHT just want to shake you down. Or they might want to make an example and sue the fuck out of some people.

    Also… it is a lot of hearsay for obvious reasons, but there are very strong rumors that some of the more prominent cracks tend to add digital watermarks for the purpose of automating this.


  • There are two layers to this (actually a lot more, but two will do).

    What you are describing is mostly supply chain. It is the idea that the package manager’s inventory should be safe. And that is already a nigh impossible task simply because so many of the packages themselves can be compromised. It seems like every other year there is a story of bad actors infiltrating a project either as an attack or as a “research paper”. But the end result is you have core libraries that may be compromised.

    But the other side is what impacted OP and will still be an issue even if said supply chain is somehow 100% vetted. People are inherently going to need things that aren’t in a package manager. Sometimes that is for nefarious reasons and sometimes it is just because the project they are interested in isn’t at the point where it is using a massive build farm to deploy everywhere. Maybe it involves running blind scripts as root (don’t fucking do that… even though we all do at some point) and sometimes it involves questionable code.

    And THAT is very much an unsolved problem no matter the distro. Historically, you would run an anti-virus scan on that sort of thing. How many people even know what the options are for linux? And how many have even a single nice thing to say about the ones that exist?



  • For a (first) NAS, I generally discourage this.

    Office liquidation desktops are great for home servers (if you aren’t paying for power). But they generally are very limited on storage: limited bays to install hard drives and limited SATA ports. So you rapidly end up with drives just sitting loose on the bottom of the case and janky PCIe cards to extend your storage.

    Which then becomes a HUGE issue when you have a drive failure. Because now you need to actually identify which drive failed, which involves reading off serial numbers and, depending on the setup/OS, making sure you get the order right when you plug them back in.

    Whereas a 4-bay NAS generally has dedicated hardware and hot-swap bays. You might never actually use the hot-swap capability, but it makes identifying the bad drive trivial.
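On Linux boxes the serial-number hunt can at least be scripted. A sketch, assuming the usual `/dev/disk/by-id` symlinks most distros provide (the lookup is factored into a function so it also works on sample input; the serial below is made up):

```shell
# Map a failing drive's serial number to its /dev/sdX node.
# stdin: "name -> ../../sdX" lines, as produced by `ls -l /dev/disk/by-id`
find_dev_by_serial() {
  # $1 = serial (or any unique substring of the by-id name)
  awk -v serial="$1" '$0 ~ serial { sub(".*/", "", $NF); print $NF }'
}

# Real-world usage (hypothetical serial):
#   ls -l /dev/disk/by-id | find_dev_by_serial WD-WCC4E1234567
```

That at least tells you which device node the dying serial belongs to before you start pulling drives.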

    Also, a good 4 bay NAS is REAL easy to unplug and put in the trunk of your car during a disaster. Don’t ask me how I know.



  • NuXCOM_90Percent@lemmy.zip to Selfhosted@lemmy.world: Mini pc for home server?

    Raspberry pi: No. Or, at least, not without doing something to make sure you have a real storage backend and aren’t just running it off an SD card. The wear on SD cards is exaggerated and largely mitigated if you use an OS that is configured to be aware of it, but you are still relying on a ticking time bomb.
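If you do run off an SD card, one common mitigation is keeping high-churn writes in RAM. An illustrative `/etc/fstab` fragment (sizes and paths are just examples; logs vanish on reboot, so something like log2ram is the more complete answer):

```
# Keep high-write paths on tmpfs instead of the SD card
tmpfs  /var/log  tmpfs  defaults,noatime,nosuid,mode=0755,size=64m  0  0
tmpfs  /tmp      tmpfs  defaults,noatime,nosuid,size=128m           0  0
```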

    Mini PC/NUC? I am a huge fan of these and think they are what most people actually need for stuff like home assistant, adguard, etc. Just understand you are going to be storage limited sooner than you expect and you can oversubscribe that CPU and memory a lot faster than you would expect.

    My general suggestion? Install proxmox on the mini PC and deploy on top of that. If/when you decide you want something more, migration is usually pretty easy.

    And if you just want a NAS? It is really hard to go wrong with a 4 bay NAS from one of the reputable vendors (which may just be ugreen at this point?) as those tend to still come out cheaper than building it yourself and 4 disks means you can either play with fire with RAID5 or not be stupid and do RAID1.


  • Presumably most of those services on the same physical host are running in containers? So just add tailscale as a sidecar to each of them. Each container will be its own host as far as your tailnet is concerned and have its own internal IP. The official tailscale YouTube channel has tutorials on this; note that their approach maps best onto a portainer based setup and more or less requires clients to keep the tailnet connection running constantly (which, in my opinion, defeats the purpose of selfhosting, but you do you).
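A minimal sketch of that sidecar pattern with plain docker (the image and env names come from tailscale’s published container docs; the auth key and the nginx app are placeholders, and the wrapper defaults to printing the commands instead of running them):

```shell
# DRY_RUN defaults to 1: echo the commands instead of executing them.
do_cmd() { if [ "${DRY_RUN:-1}" = 1 ]; then echo "+ $*"; else "$@"; fi; }

sidecar_up() {
  # 1) The tailscale container joins the tailnet as its own "host".
  do_cmd docker run -d --name myapp-ts \
    --cap-add NET_ADMIN --device /dev/net/tun \
    -v ts-state:/var/lib/tailscale \
    -e TS_AUTHKEY="tskey-auth-REPLACE_ME" \
    -e TS_STATE_DIR=/var/lib/tailscale \
    tailscale/tailscale

  # 2) The real service shares the sidecar's network namespace, so it is
  #    reachable at myapp-ts's tailnet IP/FQDN with no ports published.
  do_cmd docker run -d --name myapp --network container:myapp-ts nginx
}
```

Repeat the pair per service and each one shows up as its own machine on the tailnet.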

    Or make a mess with SRV records and… good luck with that.



  • This is one of the big problems with tailscale for home users. For people who only access a system remotely (e.g. a corporate VPN) it is amazing. For people who are both on and off network… yeah.

    What I actually settled on was NOT using one of my domains and instead just using the tailscale FQDNs in all situations. Mostly because they added more human-readable names, so it is now like foo.happy-panda.ts.net instead of foo.tb12415161613616161616.ts.net

    • Externally? I just activate the tailscale app and I can see foo.sad-hamster.ts.net with zero additional config. Which is good if I am using an app on my phone or helping someone I trust set up their own machine without needing to drive/fly out there with a laptop.
    • Internally? I actually just added a simple DNS override locally (I use unbound via opnsense for this but you can also do it with a pihole if you really want to). So foo.sad-hamster.ts.net goes to foo.localdomain which goes to a 192.x IP seamlessly
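For reference, the override described above is only a couple of lines in raw unbound syntax (the zone name and address are placeholders; on opnsense you would do the equivalent through host overrides in the GUI):

```
server:
  # Resolve the tailnet name to the LAN address when on the local network
  local-zone: "sad-hamster.ts.net." transparent
  local-data: "foo.sad-hamster.ts.net. IN A 192.168.1.50"
```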

    End result is that I don’t need any special config in any devices or apps and everything just uses the tailscale FQDN regardless of whether it is a “client” connected to the tailnet itself. Which ALSO avoids issues where things stop working during an internet outage.

    I’ve seen alternative setups that specify their own DNS server in their tailnet and… that is a lot of effort if you ask me. Also it seems to be the leading cause of “When I connect to my tailnet I can’t see the outside internet anymore”.


    The big drawbacks to this are that it makes assigning actual certs rather messy, since the same FQDN resolves to multiple very different IPs… at least one of which is a potential security vulnerability, since it is assigned by whoever controls the LAN you are on at any given moment. Not the end of the world and, truth be told, I am less likely to bother with proper certs for fully internal resources (unless I am getting paid to do it). So no NEW risk vectors.

    The other is that you are kind of at the mercy of tailscale corp changing their business model entirely and suddenly having to deal with the fqdn that points to your plex server now actually being used for the latest dating app and everything catching on fire until you remember you did this. But that is a problem that is multiple years down the road…

    Also, depending on what DNS/network shenanigans you do, this could cause other issues. But that is why you always test things yourself.


  • Yeah. There are a few useful websites I end up at that serve similar purposes.

    My usual workflow is that I need to be able to work in an airgapped environment where it is a lot easier to get “my dotfiles” approved than to ask for utility packages like that. Especially since there will inevitably be some jackass who says “You don’t know how to work without google? What are we paying you for?” because they mostly do the same task every day of their life.

    And I do find that writing the cheat sheet myself goes a long way towards me actually learning them so I don’t always need it. But I know that is very much how my brain works (I write probably hundreds of pages of notes a year… I look at maybe two pages a year).


  • One trick that one of my students taught me a decade or so ago is to actually make an alias to list the useful flags.

    Yes, a lot of us think we are smart and set up aliases/functions and have a huge list of them that we never remember or, even worse, ONLY remember. What I noticed her doing was having something like dumbman rsync that would just echo out a list of the most useful flags and what they actually do.

    So nine times out of 10 I just want rsync -azvh --progress ${SRC} ${DEST} but when I am doing something funky and am thinking “I vaguely recall how to do this”? dumbman rsync and I get a quick cheat sheet of what flags I have found REALLY useful in the past or even just explaining what azvh actually does without grepping past all the crap I don’t care about in the man page. And I just keep that in the repo of dotfiles I copy to machines I work on regularly.
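A minimal version of that `dumbman` idea, assuming a directory of plain-text cheat sheets (one file per command) kept alongside your dotfiles; the directory path is just an example:

```shell
# Where the cheat sheets live; override with the DUMBMAN_DIR env var.
DUMBMAN_DIR="${DUMBMAN_DIR:-$HOME/.dotfiles/cheatsheets}"

# dumbman <command>: print your personal cheat sheet for that command.
dumbman() {
  if [ -f "$DUMBMAN_DIR/$1" ]; then
    cat "$DUMBMAN_DIR/$1"
  else
    echo "dumbman: no cheat sheet for '$1'" >&2
    return 1
  fi
}
```

Populating it is just writing notes to a file, e.g. `echo 'rsync -azvh --progress SRC DEST  # archive, compress, verbose, human-readable' > ~/.dotfiles/cheatsheets/rsync`.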


  • I would generally argue that rsync is not a backup solution. But it is one of the best transfer/archiving solutions.

    Yes, it is INCREDIBLY powerful and is often 90% of what people actually want/need. But to be an actual backup solution you still need infrastructure around that. Bare minimum is a crontab. But if you are actually backing something up (not just copying it to a local directory) then you need some logging/retry logic on top of that.

    At which point you are building your own borg, as it were. Which, to be clear, is a great thing to do. But… backups are incredibly important and it is very much important to understand what a backup actually needs to be.
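A sketch of that minimal infrastructure: a generic retry wrapper plus an rsync call with logging, driven by cron. Paths, host, and schedule are all placeholders, and real-world use would add backoff and alerting:

```shell
# retry N CMD...: run CMD up to N times, returning 0 on first success.
retry() {
  tries="$1"; shift
  n=1
  while [ "$n" -le "$tries" ]; do
    if "$@"; then return 0; fi
    n=$((n + 1))
  done
  return 1
}

# The backup itself: rsync under the retry wrapper, output logged so a
# failure leaves evidence. A nonzero exit is what your monitoring hooks on.
backup() {
  retry 3 rsync -azvh --delete /home/me/ backup-host:/backups/me/ \
    >> /var/log/mybackup.log 2>&1
}

# crontab entry (nightly at 03:00):
#   0 3 * * * /usr/local/bin/backup.sh
```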



  • Homie? I want you to know that while I am going to be inflammatory, I am not insulting you. In a slightly sane world, that should be fine.

    NEVER work with children. “Hey kids. You can go home or you can stay with me and a few others and learn how to use a computer!”. At best you are setting yourself up for some awkward phone calls when Little Jimmy gets caught looking at something his parents don’t approve of.

    If you are a close family friend and the parents understand what you are going to be teaching their kid (and obviously want you to teach it), go for it. If you are just watching them while they eat orange slices? Don’t fucking go anywhere near that. Let the teachers who actually train in how to handle these situations do it.

    And the other aspect: Kids (and most adults) are not rational or intelligent. They aren’t going to take “Hey, if Susie sends you nudes don’t put them on this server because it will get me sent to prison as a diddler” as education on why they should not fucking do that.


    If you ever want to get scared straight as it were? Take a teacher out for drinks (and you better pay for them!). You’ll hear LOTS of horror stories and get even a glimpse into the kind of hell they have to put up with.

    The show Black-ish (like a lot of Kenya Barris’s work) has a LOT of problems. But the number of times teacher friends have shared https://www.youtube.com/watch?v=6jqmj0ILwfM says a lot. And it is not at all exclusive to black people (or even men).




  • If you are thinking in terms of building a widget or making an industrial process, it makes perfect sense. Something like a wristwatch is the kind of innovation a LOT of people more or less simultaneously made and it is just impossible to definitively prove what country the first watch was made in. Even figuring out who was the first to file becomes a mess. Same with factory processes where the players who would even have the ability to iterate are often counted on fingers and toes.

    But software (and research) in a global society is a real mother fucker. Because now the entire world can more or less see everything and reproducing things is fairly trivial. And… it isn’t like the patents actually matter all that much when so much gets done overseas. China Don’t Care but also the EU doesn’t really either and so forth. Sure there are avenues to try to pursue a studio using the patented Nemesis System but… at best you are going to be tied up in courts for years trying to get a judge to insist that a company in Germany needs to send you a check.


  • For the exact same reason that “fair use” is still so incredibly nebulous and twitch streaming/let’s plays still exist.

    NOBODY is crazy enough to want to take that to the courts. Because maybe you get a judge who has “common sense”. Maybe you get an old white guy who thinks Pong was too complicated and decides that you are wrong. At which point you have now made a bunch of legal precedent for REALLY stupid stuff.

    It’s also kind of why so much stuff about video games actually never gets patented. It is playing with fire.