• dwindling7373@feddit.it
    link
    fedilink
    arrow-up
    85
    arrow-down
    2
    ·
    1 year ago

    Noob answer? No, because the other party will likely store them in an unsafe manner and send them through Facebook Messenger to that Aunt of theirs.

  • Björn Tantau@swg-empire.de
    link
    fedilink
    arrow-up
    62
    arrow-down
    2
    ·
    1 year ago

    As far as anyone knows, WhatsApp uses secure end-to-end encryption, so only your device and the other person’s device have access to the picture.

    The only downside is that WhatsApp is a closed-source program, so it isn’t verifiable that the encryption is correctly implemented.
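
    To illustrate what end-to-end actually means, here is a minimal toy sketch using the PyNaCl library. This is not what WhatsApp ships (WhatsApp uses the Signal protocol, which is far more involved), and every name below is just a placeholder; the point is only that a relay server in the middle never holds a key that can open the message.

```python
# Toy illustration of end-to-end encryption with public-key "boxes" (PyNaCl).
# NOT WhatsApp's actual protocol; it only shows the basic idea that the two
# endpoints hold the keys, and the server in between sees only ciphertext.
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; only the public halves are ever shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key + the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"baby_photo.jpg bytes go here")

# The relay server only ever handles `ciphertext`, which it cannot decrypt.

# The recipient decrypts with their private key + the sender's public key.
receiving_box = Box(recipient_key, sender_key.public_key)
photo_bytes = receiving_box.decrypt(ciphertext)
assert photo_bytes == b"baby_photo.jpg bytes go here"
```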

    • Hagarashi8@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      12
      ·
      1 year ago

      Also, even now, your message could be encrypted a thousand times over - Google Drive backups are not, at least by default. I don’t know anything about iOS, but it’s probably the same.

  • StrawberryPigtails@lemmy.sdf.org
    link
    fedilink
    arrow-up
    67
    arrow-down
    10
    ·
    1 year ago

    First off, it sounds like congratulations are in order! A new life is always cause for celebration! I hope you, your spouse and your new child are doing well.

    Short answer to your question: NO! DO NOT SEND ANY SENSITIVE DATA (INCLUDING PHOTOS) VIA ANY PATH OR SERVICE YOU DO NOT FULLY CONTROL!!!

    Long answer: While WhatsApp, Meta and the like are not known to be quite as… proactive as Google in cracking down on child pornography, there is a very real risk that any data you send via any service may be scanned by an ML algorithm and flagged. What happens next depends on the particular service. I’m not sure about WhatsApp, but in the case of Google, once your account is flagged, your entire account is forwarded to law enforcement. As you are just sending pictures of your new arrival (congrats again!), odds are that the officer assigned will take one look and clear you. All good so far, right? Yeah, not so much. You might not be going to jail, but when Google locks down an account, they do not reactivate it, regardless of what law enforcement might decide, and as they are a private company, suing them to get your accounts reactivated is a lost cause. They are allowed to decide whom they want as a customer so long as their standard is applied evenly and doesn’t target certain protected groups.

    No service you use should ever be allowed to see anything important to you. Ever.

    If you can, I would self-host a cloud service like NextCloud out of your own home to share files freely, although a GPG-encrypted email would work too. Your current email provider is fine, as long as you use a third-party email client that supports encryption, like Thunderbird. And as much as I like ProtonMail’s stance on privacy, I would still use a separate encryption method for anything truly sensitive.
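
    To give a rough idea of what the GPG route looks like, here is a minimal sketch that shells out to the standard gpg command line. The recipient address and file names are placeholders, and it assumes the other person has already generated a keypair and you have imported their public key.

```python
# Minimal sketch: encrypt a photo to a recipient's public key using the gpg CLI.
# Assumes gpg is installed and the recipient's public key has been imported
# (e.g. with `gpg --import their_public_key.asc`). Key ID and paths are placeholders.
import subprocess

recipient = "aunt@example.org"     # hypothetical key ID / email of the recipient
infile = "baby_photo.jpg"          # photo to protect
outfile = "baby_photo.jpg.gpg"     # encrypted output, safe to send as an attachment

subprocess.run(
    ["gpg", "--encrypt", "--recipient", recipient, "--output", outfile, infile],
    check=True,
)

# The recipient decrypts on their own machine with their private key:
#   gpg --output baby_photo.jpg --decrypt baby_photo.jpg.gpg
```

    The obvious catch, as others note below, is that the other party has to be able (and willing) to deal with keys at all.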

    I know I sound like a privacy nutjob, but seriously: when the consequences of a false allegation are that high, you should recognize the threat and act accordingly. I use Google, TikTok, iCloud and others, but if the subject matter is anything much more consequential than the weather, it doesn’t touch their servers. It’s not so much paranoia as it is threat mitigation. Google’s and Apple’s services are incredibly useful, but if you depend on them too much, losing them could hurt, a lot.

    Like I said, most of the other services don’t have quite the same reputation for uncalled-for lockouts, but here are a few news articles I came up with on a quick search:

    If you’re interested in learning more about self-hosting services out of your home, you might check these out as a starting point:

    • vext01@lemmy.sdf.org
      link
      fedilink
      arrow-up
      9
      ·
      1 year ago

      Yeah, I hear ya, but good luck getting your whole family, including elderly relatives, to use something you do control…

      It’s a losing battle. You can share pics via a secure channel and they’ll just repost them all over the place anyway.

      • StrawberryPigtails@lemmy.sdf.org
        link
        fedilink
        arrow-up
        3
        ·
        1 year ago

        but good luck getting your whole family, including elderly relatives, to use something you do control…

        That’s one of the things I like about NextCloud. Even the most non-technical person knows how to follow a link in an email (to many an IT tech’s lament). All I do is share the file in NextCloud, maybe password-protect it with a simple password, and copy the share link into the email or text message. Bob’s your uncle. Grandma Nosy-Britches gets to see the files but her email or messaging provider (Google, Microsoft, or whoever) does not… at least until she shares the file directly. More likely, though, she will share the link. But that’s probably not something I’m too concerned about.
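
        For anyone who would rather script that than click through the web UI, here is a rough sketch against what I believe is Nextcloud’s OCS Share API. The server URL, credentials, file path, and even the exact endpoint and parameters are assumptions you should verify against your own instance’s documentation.

```python
# Rough sketch: create a password-protected public share link on a self-hosted
# Nextcloud instance via its OCS Share API. Server URL, credentials, file path
# and the endpoint/parameters are assumptions to check against your own setup.
import requests

NEXTCLOUD_URL = "https://cloud.example.home"   # hypothetical self-hosted instance
USER = "me"
APP_PASSWORD = "app-password-from-settings"    # use an app password, not your login password

resp = requests.post(
    f"{NEXTCLOUD_URL}/ocs/v2.php/apps/files_sharing/api/v1/shares",
    auth=(USER, APP_PASSWORD),
    headers={"OCS-APIRequest": "true"},
    data={
        "path": "/Photos/baby_photo.jpg",  # file already uploaded to Nextcloud
        "shareType": 3,                    # 3 = public link
        "password": "simple-password-for-grandma",
    },
    params={"format": "json"},
)
resp.raise_for_status()
print(resp.json()["ocs"]["data"]["url"])       # the share link to paste into an email
```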

        You can share pics via a secure channel and they’ll just repost them all over the place anyway.

        “A secret shared is no secret.” If you have a problem with something being shared, don’t share it. With anyone.

        I’m more concerned about accidentally tripping safety or security systems I don’t fully understand, and having something I depend on suddenly cut off, than I am about the vagaries of dear Aunt Noisy. I can pretty well guess what Aunt Noisy or Grandma Nosy-Britches will do, and it’s either not a problem or I’ve taken steps to avoid any problems.

    • CanadaPlus@futurology.today
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      1 year ago

      I’d add the caveat that a lot of the common options are even worse. With WhatsApp there’s at least encryption most of the time in standard app operation.

      Via Element or Signal would be the best answer (or was, last I checked), if there was anybody else on there.

  • Elise@beehaw.org
    link
    fedilink
    arrow-up
    30
    ·
    1 year ago

    I feel out of the loop here. What’s so secretive about a photo of a newborn?

    • JCreazy@midwest.social
      link
      fedilink
      English
      arrow-up
      22
      ·
      1 year ago

      Some people just don’t want pictures of their kids all over the internet. It could be seen as borderline paranoia by some people, but I think everyone has the right to the level of privacy that they want.

      • IninewCrow@lemmy.ca
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        1
        ·
        1 year ago

        The most secure way to take photos of your family is with a dedicated camera that has its own memory card. Develop the photo into a hard copy, keep the image stored digitally on your own systems, and never share it online.

    • Neshura@bookwormstory.social
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      1
      ·
      edit-2
      1 year ago

      Also, CSAM detection algorithms are known to misfire on occasion (it’s hard to impossible to tell apart a picture of a naked child sent for porn purposes and one not sent for that), and people want to avoid any false allegations of that if at all possible.

    • grandel@lemmy.ml
      link
      fedilink
      arrow-up
      9
      arrow-down
      3
      ·
      1 year ago

      Because the child might not want pictures of themselves on the internet. It’s a right-to-privacy thing, at least here in the EU.

    • Xtallll@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      1 year ago

      Yeah, babies are cute and all, but they all look the same for the first few weeks. If everyone just had a standard library of 20 or so pictures of newborn babies, and everyone picked one and shared it instead of pictures of their own kid, no one would notice.

    • LukeSky@lemmy.mlOP
      link
      fedilink
      arrow-up
      12
      arrow-down
      2
      ·
      1 year ago

      By safe I mean in terms of privacy, i.e. whether there’s a possibility that someone can “intercept” the photos of the child. Sorry if I didn’t explain it well.

      • rufus@discuss.tchncs.de
        link
        fedilink
        arrow-up
        18
        arrow-down
        1
        ·
        edit-2
        1 year ago

        In computer security it always depends on your threat model. WhatsApp is supposed to be end-to-end encrypted, so nobody can intercept your messages. However: once someone flags a message as inappropriate, this gets circumvented and the messages get forwarded to Meta. This is only supposed to happen if something is flagged, so it’s unlikely in a family group. I trust this actually works the way Meta tells us, though I can’t be sure because I haven’t dissected the app, and it may change in the future. And there is lawful intercept.

        Mind that people can download or screenshot messages and forward them or do whatever they like with the pictures.

        And another thing: if you have sync enabled, Google Photos will sync the pictures you take to their cloud servers, and they’ll end up there. Apple does the same with iCloud. As far as I know both platforms automatically scan pictures to help fight crime and child exploitation. We aren’t allowed to know how those algorithms work in detail. I doubt a toddler in clothes or wrapped in a blanket will trigger the automated scanning; they claim a ‘high level of accuracy’. But people generally advise not to take pictures of children without clothes with a smartphone. Bad incidents have already happened.

        Edit: Apple seems to have pushed for cloud scanning initially, but currently that doesn’t happen any more. They have some on-device filters, as far as I understand.

        • kirklennon@kbin.social
          link
          fedilink
          arrow-up
          6
          arrow-down
          1
          ·
          1 year ago

          As far as I know both platforms automatically scan pictures to help fight crime and child exploitation.

          Apple doesn’t. They should but they don’t. They came up with a really clever system that would do the actual scanning on your device immediately before uploading to iCloud, so their servers would never need to analyze your photos, but people went insane after they announced the plan.

          • rufus@discuss.tchncs.de
            link
            fedilink
            arrow-up
            4
            ·
            edit-2
            1 year ago

            Oh. I didn’t know that. I don’t use Apple products and just read the news; I must have missed how the story turned out, so thanks for the info.

            Technically I suppose it doesn’t make a huge difference. It still gets scanned by Apple software, and sent to them if it’s deemed conspicuous. And the algorithm on a device is probably limited by processing power and energy budget, so it might even be less accurate. But this is just my speculation. I think all of that is more of a marketing stunt. This way the provider reduces cost: they don’t need additional servers to filter the messages, and in the end it doesn’t really matter where exactly the content is processed if it’s a continuous chain like the Apple ecosystem.

            The last story I linked about the dad being incriminated for sending the doctor a picture would play out the same way, regardless.

            Edit: I googled it and it seems the story with Apple has changed multiple times. The last article I read says they don’t even do on-device scanning, just a ‘nude filter’, whatever that is. I’m cautious around cloud services anyway. And all of that might change and also affect old pictures. We just avoided mandatory content filtering in the EU, and upload filters and the like are debated regularly. Also, the US has updated its laws regarding internet crime and prevention of child exploitation in recent years. I’m generally unsure where we’re headed with this.

            • kirklennon@kbin.social
              link
              fedilink
              arrow-up
              2
              arrow-down
              1
              ·
              1 year ago

              The proposal was only for photos stored on iCloud. Apple has a legitimate interest in not wanting to actually host abuse material on their servers. The plan was also calibrated for one in one trillion false positives (it would require multiple matches before an account could be flagged), followed by a manual review by an employee before reporting to authorities. It was so very carefully designed.

              • rufus@discuss.tchncs.de
                link
                fedilink
                arrow-up
                2
                ·
                edit-2
                1 year ago

                Do you happen to know a good source for information on this? I don’t want to hijack this discussion, since it’s not that closely related to the original subject… but I’d be interested in more technical information. Most news articles seem to be a bit biased, and I get it: both privacy and protection of children are sensitive topics and there are feelings involved.

                One in a trillion sounds like a probability of a hash collision. So that would just be checking whether they already have the specific image in their database. It would trigger if someone downloaded an already existing image, and not detect new images taken with a camera. I’m somewhat fine with that.

                And I was under the impression that iPhones connected to iCloud sync the pictures by default? So “only for photos stored on iCloud” would practically mean every image you take, unless you deliberately changed the settings on your iPhone?

                • kirklennon@kbin.social
                  link
                  fedilink
                  arrow-up
                  2
                  arrow-down
                  1
                  ·
                  1 year ago

                  Do you happen to know a good source for information on this?

                  Apple released detailed whitepapers and information about it when it was originally proposed, but they shelved it, so I don’t think they’re still readily available.

                  One in a trillion sounds like a probability of a hash collision.

                  Basically yes, but they’re assuming a much greater likelihood of a single hash collision. The system would upload a receipt of the on-device scan along with each photo. A threshold number of matches would be set to achieve the one in a trillion confidence level. I believe the initial estimate was roughly 30 images. In other words, you’d need to be uploading literally dozens of CSAM images for your account to get flagged. And these accompanying receipts use advanced cryptography so it’s not like they’re seeing “oh this account has 5 potential matches and this one has 10”; anything below the threshold would have zero flags. Only when enough “bad” receipts showed up for the same account would they collectively flag it.
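
                  Purely to illustrate the counting logic, here is a toy sketch. It is emphatically not Apple’s actual design, which used a perceptual hash (NeuralHash) plus private set intersection and threshold secret sharing so the server couldn’t even see per-image matches below the threshold; the database contents, threshold value and helper names below are all stand-ins.

```python
# Toy illustration of the threshold logic described above. NOT Apple's actual
# design: the real proposal used a perceptual hash (NeuralHash) plus private set
# intersection and threshold secret sharing, so the server could not even count
# matches below the threshold. This only shows the counting idea.
import hashlib

def toy_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash tolerates resizing
    # and re-encoding, which is also where rare false positives come from.
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for the database of hashes of known abuse images.
KNOWN_BAD_HASHES = {toy_hash(b"known_bad_image_1"), toy_hash(b"known_bad_image_2")}

THRESHOLD = 30  # roughly the match count cited above before an account is flagged

def account_flagged(uploaded_images: list[bytes]) -> bool:
    # Each upload yields a "receipt" recording whether its hash is in the database.
    matches = sum(1 for img in uploaded_images if toy_hash(img) in KNOWN_BAD_HASHES)
    # Only when enough matching receipts accumulate is the account flagged for
    # human review; below the threshold nothing is reported at all.
    return matches >= THRESHOLD

# A family photo library produces zero matches, so it stays well under the threshold.
assert account_flagged([b"baby_photo_1", b"baby_photo_2"]) is False
```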

                  And I was under the impression that iPhones connected to iCloud sync the pictures by default?

                  This is for people who use iCloud Photo Library, which you have to turn on.

      • shadowintheday2@lemmy.world
        link
        fedilink
        English
        arrow-up
        16
        ·
        edit-2
        1 year ago

        Interception by a third party is highly unlikely, as the transport layer of basically everything is encrypted nowadays. What is left unknown is what Meta can do once the file is on their servers, as you’ll have to trust Zuckk’s word and Zuckk’s encryption.

          • Luffy879@lemmy.ml
            link
            fedilink
            arrow-up
            3
            ·
            1 year ago

            If Meta really didn’t know your messages and encryption keys, they would not be able to recover every single one of your messages even if you forgot your password.

            • Carighan Maconar@lemmy.world
              link
              fedilink
              arrow-up
              2
              ·
              1 year ago

              Last time I needed that, they could not. They needed either the backup (which is less secure and private, but it’s your choice whether to use it or not; I think it uploads to Google Drive or so?) or another working device that is linked to the same account.

        • Lmaydev@programming.dev
          link
          fedilink
          arrow-up
          4
          arrow-down
          1
          ·
          1 year ago

          It’s end-to-end encrypted, so they can’t see it in transit. What they could potentially do is access it once it’s on your device and unencrypted.

          • Hagarashi8@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            1 year ago

            Or through the backup, which is unencrypted by default. It goes to Google Drive, and there’s no guarantee that it doesn’t go to Meta.

      • LoveSausage@lemmy.ml
        link
        fedilink
        arrow-up
        6
        arrow-down
        1
        ·
        1 year ago

        No, but they will get the metadata. The image itself should be secure. But then your recipient downloads it, uploads it to Google’s cloud, and so on.

      • Big P@feddit.uk
        link
        fedilink
        English
        arrow-up
        4
        ·
        1 year ago

        If someone is able to intercept WhatsApp messages, they aren’t using that to look at photos of your baby; they’re using it to spy on government officials.

      • atro_city@fedia.io
        link
        fedilink
        arrow-up
        2
        ·
        1 year ago

        You have to trust that Meta doesn’t do anything with your pictures before they are sent, and that the person you’re sending them to doesn’t back up their WhatsApp data to Google.

        It’s more secure to use Signal

  • Beardedsausag3@kbin.social
    link
    fedilink
    arrow-up
    21
    arrow-down
    1
    ·
    1 year ago

    You could hand-deliver them in a sealed envelope, but that won’t stop the recipient scanning them and then sharing them on Messenger, via text, etc.

    You’d need to consider where and how they get shared beyond the person you send them to, and then decide which level of privacy is appropriate. Ultimately, even though others don’t recommend WhatsApp (nor would I), it’s maybe the best option in this case: accessibility and ease of sharing, just no guarantees on the encryption because the source is behind closed doors.

  • tweeks@feddit.nl
    link
    fedilink
    arrow-up
    13
    arrow-down
    1
    ·
    edit-2
    1 year ago

    It depends on whether you trust Meta. Generally speaking there is end-to-end encryption in WhatsApp, which means only you and the person you chat with can decrypt your messages / media (source). I believe there are some weak spots in group chats, mostly caused by users themselves. I’m not sure about the new Community function, but I’d be careful with what I share there.

    Some parties like Apple have decided to scan photos from your device for illegal material (edit: after backlash they dropped this for now, my bad). If using an app like WhatsApp, I’d personally be aware that something like that might happen in the future as well. And I’d not be surprised if some employees were (temporarily) able to access more data than widely assumed, for debugging reasons.

    Personally I take the risk for pragmatic reasons, but it doesn’t hurt to be a bit cautious / aware.

    • Neshura@bookwormstory.social
      link
      fedilink
      English
      arrow-up
      2
      ·
      1 year ago

      IIRC Microsoft is doing it; I read of a case where a parent sent a picture of his son to the doctor via a OneDrive share and his entire account got suspended over it.

    • kirklennon@kbin.social
      link
      fedilink
      arrow-up
      1
      arrow-down
      6
      ·
      1 year ago

      Some parties like Apple have decided to scan photos from your device for illegal material.

      No they haven’t, they aren’t, and they never even discussed scanning your messages like that. There’s a communication safety feature you can enable in parental controls so that if a child’s phone locally recognizes (using machine learning) that they received or are about to send a nude photo, the photo is blurred and they’re given information about making safe choices, and then allowed to continue or not.

      • sylver_dragon@lemmy.world
        link
        fedilink
        English
        arrow-up
        8
        ·
        1 year ago

        No they haven’t, they aren’t, and they never even discussed scanning your messages like that.

        They discussed it (source) but the backlash was enough to kill the project for now. Instead, they implemented the “opt-in” system you are talking about.

        • kirklennon@kbin.social
          link
          fedilink
          arrow-up
          1
          arrow-down
          1
          ·
          1 year ago

          They discussed something adjacent, not anything that would scan and disclose your encrypted messages.

      • tweeks@feddit.nl
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        1 year ago

        Thanks for correcting me, you are right about the image scanning. Added an edit to my statement.

  • Dyskolos@lemmy.zip
    link
    fedilink
    arrow-up
    15
    arrow-down
    7
    ·
    edit-2
    1 year ago

    No. Never use a messenger for those things that is legally allowed to operate in most nations. They advertise e2e encryption and such, but they also need to comply with governments.

    Remember, e.g., how Telegram’s owner was kicked out of his own country because he didn’t comply with leaving a backdoor for the government? And how, in some nations, it’s one of the few messengers left that can be used to express a free opinion without “disappearing” afterwards?

    At best, your pics will just train AI models for free.

    I would never ever share personal stuff over some mega-corpo’s “free” thing.

    • Luffy879@lemmy.ml
      link
      fedilink
      arrow-up
      14
      ·
      edit-2
      1 year ago

      but they also need to comply with governments.

      I don’t know about every country, but as far as I know, in Germany at least you only need to disclose the information you actually have. E.g.: if you don’t have the encryption keys, the government can’t do shit with the encrypted messages they have.

      So, correct me if I’m wrong, but a messenger could be privacy-respecting and legal at the same time.

      • Dyskolos@lemmy.zip
        link
        fedilink
        arrow-up
        1
        ·
        1 year ago

        Well, Telegram, e.g., is still problematic in Germany due to not complying. We in Germany might be relatively safe for now, that’s true. But don’t forget what’s on the horizon for us. Then Telegram & co. will be banned and WhatsCrap will thrive even more.

        I might come across as bitter, but that’s only because i am 😁

  • geoma@lemmy.ml
    link
    fedilink
    arrow-up
    6
    ·
    1 year ago

    Signal with a view-once message is better than a lot of the available options. Also maybe Threema, SimpleX or Session, but Signal is more popular nowadays.

  • TexMexBazooka@lemm.ee
    link
    fedilink
    arrow-up
    1
    ·
    1 year ago

    Safe?

    Yeah, safe in the sense that it would be difficult for someone to intercept and steal/alter it. That doesn’t prevent whomever you’re sending it to from saving it in an insecure manner.

    But there’s one REALLY important variable here: nobody wants to steal a picture of your kid. Nobody cares.