A long long time ago, I bought a domain or two, and a shared hosting plan from Dreamhost w/ unlimited bandwidth/storage. I don’t have root access, and can’t do containers on this. It’s been useful for a Piwigo instance to share scanned family photos. The problem I have is the limited resources really limit Piwigo’s ability to handle the large TIF files involved in the archival scans. There are ways around this, but they all add time to the workflow that already eats into my free time enough. I’m looking at moving Piwigo to my local server that has plenty of available resources. That leaves me with little reason to keep the Dreamhost space. So what’s a decent use case for cheap, shared hosting space anymore?

To be clear, I’m not looking for suggestions to move to a cheap VPS. I’ve looked into them, and might use one in the future, but I don’t need one right now. The shared hosting costs about $10.99/month at the moment. If there were a way I could leverage the unlimited bandwidth/storage as an offsite backup, that would be amazing, but I’m not sure it would be a great idea to back stuff up to a webserver where the best security I can add is via an .htaccess file.

  • catloaf@lemm.ee · 2 months ago

    Yes, through Namecheap. Right now it’s just hosting my personal site on WordPress, but I’m going to switch that soon due to Matt Mullenweg’s drama or just take it down entirely.

    • dugmeup@lemmynsfw.com · 2 months ago

      I read the details. He has a decent reason to ask WP to pay: they got bought by private equity and significantly dropped their contributions to WordPress, which is open source and hence relies on the community feeding back into the product.

      • catloaf@lemm.ee · edited · 2 months ago

        Ask, sure. Sue, maybe. Commandeer extensions, absolutely not.

        If they don’t like people using their open source project, they shouldn’t offer that license.

  • poVoq@slrpnk.net · 2 months ago

    You can always encrypt the backups you upload there.

    Depending on the specs of the shared webspace, it may be possible to install some PHP-based WebDAV software to easily sync files with it. KaraDAV, for example.

  • Passerby6497@lemmy.world · 2 months ago

    If there were a way I could leverage the unlimited bandwidth/storage as an offsite backup, that would be amazing, but I’m not sure it would be a great idea to back stuff up to a webserver where the best security I can add is via an .htaccess file.

    Your off-site backup solution shouldn’t have to care about that level of security because you should be encrypting your backups before they leave your network. Even if you have a solid backup host in the cloud, you still want to encrypt your backup data before you send it to their hosted repo.

    Unless your vendor has a reason to read your backups, they shouldn’t be able to.
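    The encrypt-before-upload idea above can be sketched with standard tools. This is a minimal example, not a full backup pipeline: the file names and passphrase are placeholders, and in practice you’d read the passphrase from a file or key manager rather than the command line.

```shell
# Create a small demo payload (placeholder for your real backup source).
mkdir -p demo && printf 'family photo metadata\n' > demo/notes.txt

# Bundle and encrypt locally, so only ciphertext ever leaves the network.
tar -czf backup.tar.gz demo
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in backup.tar.gz -out backup.tar.gz.enc -pass pass:correct-horse

# Restore path: decrypt, then unpack.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in backup.tar.gz.enc -out restored.tar.gz -pass pass:correct-horse
tar -xzf restored.tar.gz
```

    Whatever lands on the shared host (`backup.tar.gz.enc`) is opaque without the passphrase, so the weak .htaccess-level security matters much less.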

    • d00phy@lemmy.world (OP) · 2 months ago

      Fair point. Currently, everything that requires off-site backup is sent to my father’s Synology using Hyper Backup, so off-site is sorta self-hosted already. I was thinking in terms of a second fallback option.

  • lorentz@feddit.it · 2 months ago

    I did some experiments in the past. The nicest option I found was enabling the WebDAV API on the hosting side (it was an option in cPanel, if I recall correctly, but there are likely other ways to do it). This lets you use the webserver as a remote read/write filesystem. Then you can use rclone to transfer files; the nice part is that rclone supports client-side encryption, so you don’t have to worry too much about other people accessing your files.
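    A minimal rclone config sketching this setup (remote names, the URL, and the `backups` path are placeholders; the obscured passwords come from `rclone obscure`):

```ini
; ~/.config/rclone/rclone.conf

[host-webdav]
type = webdav
url = https://example.com/webdav
vendor = other
user = myuser
pass = OBSCURED_PASSWORD

; crypt remote layered on top: everything written through it is
; encrypted client-side before it reaches the WebDAV server.
[host-crypt]
type = crypt
remote = host-webdav:backups
password = OBSCURED_PASSWORD
```

    After that, `rclone sync /local/backups host-crypt:` uploads everything encrypted; by default the crypt remote encrypts file names as well as contents.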

  • Atherel@lemmy.dbzer0.com · 2 months ago

    Oh, and regarding the large TIF files: what limits are you hitting? Most hosters let you change PHP settings like the memory limit or max execution time.
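    A sketch of the usual overrides, assuming the host reads a custom php.ini or phprc file (the exact file name and location vary by host, and the values here are examples, not tuned recommendations):

```ini
; Raise PHP limits for handling large image files.
memory_limit = 512M
max_execution_time = 300
upload_max_filesize = 256M
post_max_size = 256M
```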

    • d00phy@lemmy.world (OP) · 2 months ago

      I haven’t played w/ memory limits, but when I tried messing w/ bulk download of raw TIF files, it ran out of memory pretty quickly. I may look into what I can do about the limits, though.

  • Strit@lemmy.linuxuserspace.show · 2 months ago

    I technically still have a hosted website, but it’s rarely updated anymore. It’s very low priority compared to my self-hosted stuff.

    • d00phy@lemmy.world (OP) · 2 months ago

      Same. I still have a MediaWiki install on the shared hosting, but I haven’t updated it in forever. For the $10.99/month I’m paying for shared hosting, I could save a little and get a more powerful VPS to host similar stuff… or just keep doing what I’m doing w/ my S12 pro & Synology. Might look at some kind of failover down the road.

  • Atherel@lemmy.dbzer0.com · 2 months ago

    As others said, encrypt your backup before sending it to your server. And can’t you upload the files to a folder outside of the document root, or better, outside of the www folder, so there’s no way to access them through the web server at all?
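    If the files do end up under the docroot anyway, a one-line .htaccess in the backup directory can at least block all web access — a sketch assuming Apache 2.4 syntax (older 2.2-era hosts use `Deny from all` instead):

```apache
# .htaccess in the backup directory: refuse every web request,
# while SFTP/shell access to the files keeps working.
Require all denied
```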

    • d00phy@lemmy.world (OP) · 2 months ago

      I can upload files outside of the docroot, but if they stay there too long, I get a nasty email from Dreamhost reminding me that this is web space, not offsite storage (something they also sell). I haven’t tried uploading something inside the docroot and just setting permissions to 400 or something!