• 0 Posts
  • 17 Comments
Joined 2 years ago
Cake day: June 12th, 2023

  • I think this might be the right solution for OP, especially since they don’t seem to have had the “experience” I’m sure many of us have had with this “simple” operation. If you are going to do it, then it should be done using a live OS and a full offline system backup; otherwise it’s very easy to lose a lot of data this way.

    However, symlinks like that can make things confusing really quickly. I would encourage anyone using them in this way to establish some easily checked rules and abide by them. For example, maybe you only use symlinks like this in a specific folder such as /home/expanded. You can still have multiple links there like /home/expanded/on5TBdrive or /home/expanded/onPrimarySSD, but it makes it easier to remember, find, and check those locations later.
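
    If it helps to picture it, here is a rough sketch of that layout; the mount points are hypothetical, so adjust the paths to wherever the drives actually live:

    mkdir -p /home/expanded
    ln -s /mnt/5tb-drive /home/expanded/on5TBdrive        # big folders live on the 5TB drive
    ln -s /mnt/primary-ssd /home/expanded/onPrimarySSD    # fast stuff stays on the SSD
    ls -l /home/expanded                                  # one place to see where everything really points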

    When you need to know exactly where something is stored, verify a backup, or find data without the symlink, then you will appreciate a set of rules that helps you.


  • twack@lemmy.world to Selfhosted@lemmy.world · On the importance of backups

    ZFS is fantastic and it can indeed restore files that have been encrypted as long as you have an earlier snapshot.

    However, it would not have helped in this scenario. In fact, it might have actually made recovery efforts much more difficult.

    It could have helped by automatically sending incremental snapshots to a secondary drive, which you could then have restored the original drive from. However, this would have required the foresight to set that up in the first place. This process also would not have been quick; you would need to copy all of the data back just like any other complete drive restoration.
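
    For anyone curious what that setup looks like, this is roughly the workflow; the pool and dataset names are made up, and “backup” is assumed to be a pool living on the secondary drive:

    zfs snapshot tank/data@monday
    zfs send tank/data@monday | zfs recv backup/data                          # first copy is a full send
    zfs snapshot tank/data@tuesday
    zfs send -i tank/data@monday tank/data@tuesday | zfs recv backup/data     # later sends only ship the changes
    zfs send backup/data@tuesday | zfs recv -F tank/data                      # restoring copies everything back the other way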



  • I was about to ask why this is better than the docker installation, but I see step one is to install docker haha.

    I’ve been running the docker container for a long time, and it works very well. It gets a bit more involved if you use extensions that require separate containers (like setting up Collabora), but that can be done as well.
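
    For reference, running Collabora next to it looks roughly like this; the image name and port are the usual ones, but the accepted environment variables have changed between CODE releases, so treat it as a sketch rather than a recipe (the hostname is made up):

    sudo docker run -d --name collabora --restart always -p 9980:9980 \
      -e 'domain=nextcloud\.example\.com' \
      collabora/code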

    I do remember needing to know how to access the internal terminal a few times, but I don’t remember why. If I think of it I’ll come back and add instructions.

    Edit: It’s to be able to run occ commands:

    sudo docker exec -u www-data nextcloud-app php occ <command goes here>

    sudo docker exec -u www-data nextcloud-app php occ files:scan --all
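
    And if you need a full interactive shell inside the container instead of a one-off occ command, the same idea works (assuming the container is still named nextcloud-app):

    sudo docker exec -it -u www-data nextcloud-app bash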



  • I’m not sure if this is helpful to you or not, because it’s not quite what you asked: I just don’t mount them on boot.

    I have a script that prompts for a unique password and decrypts everything I actually care about. If it hasn’t been run, the server starts emailing me every 15 minutes until I run it.

    The server is not set up to reboot unless I manually tell it to or there is a power outage, so logging in to run the script has never really been an issue. At most, I’ve had to SSH in from my phone a handful of times.
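
    The moving parts are small. A minimal sketch of the idea, with hypothetical device, mount point, and email address:

    # unlock script: prompts for the passphrase, opens the LUKS volume, mounts it
    cryptsetup open /dev/sdb1 vault
    mount /dev/mapper/vault /srv/vault

    # /etc/cron.d/unlock-reminder: nag every 15 minutes until the data is mounted
    */15 * * * * root mountpoint -q /srv/vault || echo "encrypted data is not mounted" | mail -s "server needs unlocking" me@example.com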



  • This is probably a package that gets transferred to USPS for final delivery, and it got lost at either FedEx or USPS, which is why the delivery date keeps bouncing around.

    Packages like that are delivered to USPS in bags, and the contents are “logically scanned”: if someone scanned it going into the bag, then you can usually assume that it’s still in the bag at the other end. However, that’s not always true.

    They might find it later, but just showing up at FedEx is likely to be a waste of your time until that delivery date updates (indicating that someone has found and physically scanned the package).