• potatopotato@sh.itjust.works

      Intrinsically/semantically, no, but the expectation is that the texts are encrypted at rest and the keys are password- and/or TPM+biometric-protected. That’s just how this works at this point. It’s also the government standard for literally everything from handheld devices to satellites (yes, actually).

      At this point one of the most likely threat vectors is someone just taking your shit: border crossings, rubber-stamped search warrants, cops raiding your house because your roommate pissed them off, protests, needing to get home from work near a protest, on and on.

      • 9tr6gyp3@lemmy.world

        If your device is turned on and you are logged in, your data is no longer at rest.

        Signal data will be encrypted if your disk is also encrypted.

        If your device’s storage is not encrypted, and you don’t have any type of verified boot process, then that’s on you, not Signal.

        • douglasg14b@lemmy.world

          That’s not how this works.

          If the data Signal stores is encrypted but the keys are not protected, then that is a security risk, and one that can be mitigated using common tools that every operating system provides.

          You’re defending Signal from a position of ignorance. This is a textbook risk just waiting for a series of latent failures to allow leaks of, or access to, your “private” messages.

          There are many ways attackers can dump files without actually having privileged access to write to or read from memory. However, that’s a moot point, as neither you nor I is capable of enumerating all potential attack vectors and risks. So instead of waiting for a known failure to happen because you are personally “confident” in your level of technological omniscience, we should not be so blatantly arrogant, and should fill the hole before it gets used.


          Also, this is a common problem with framework-provided solutions:

          https://www.electronjs.org/docs/latest/api/safe-storage

          This is such a common problem that it has been abstracted into APIs for most major desktop frameworks, and every major operating system provides a keyring-like service for this purpose.

          Because this is a common hole in your security model.
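
          For illustration only (this is not Signal’s actual code; the file name and variable names are made up), a minimal sketch of what using Electron’s safeStorage for the database key could look like in the main process:

            // Minimal sketch (call after the app's "ready" event): wrap the database key
            // with the OS credential store via Electron's safeStorage instead of writing
            // it out as plaintext JSON. safeStorage uses DPAPI on Windows, the Keychain
            // on macOS, and kwallet/gnome-libsecret on Linux when available.
            import { safeStorage } from 'electron';
            import { promises as fs } from 'fs';

            const KEY_FILE = 'db-key.bin'; // placeholder path inside the user data dir

            async function storeDbKey(hexKey: string): Promise<void> {
              if (!safeStorage.isEncryptionAvailable()) {
                throw new Error('No OS-level credential store available');
              }
              // encryptString() returns a Buffer wrapped by the OS keystore
              await fs.writeFile(KEY_FILE, safeStorage.encryptString(hexKey));
            }

            async function loadDbKey(): Promise<string> {
              return safeStorage.decryptString(await fs.readFile(KEY_FILE));
            }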

          • 9tr6gyp3@lemmy.world

            Having Signal fill in gaps for what the OS should be protecting is just going to stretch Signal thinner than it already is. I would agree that if Signal can properly support that kind of protection on EVERY OS that it’s built for, go for it. But this should be an OS-level protection that can be offered to Signal as an app, not the other way around.

            • douglasg14b@lemmy.world

              Having Signal fill in gaps for what the OS should be protecting is just going to stretch Signal thinner than it already is. I would agree that if Signal can properly support that kind of protection on EVERY OS that it’s built for, go for it. But this should be an OS-level protection that can be offered to Signal as an app, not the other way around.

              Damn reading literacy has gone downhill these days.

              Please reread my post.

              But this should be an OS-level protection that can be offered to Signal as an app, not the other way around.

              1. OSes already provide keyring features
              2. The framework Signal uses (Electron) has a built-in API for this EXACT NEED

              Cmon, you can do better than this, this is just embarrassing.

        • uis@lemm.ee

          Signal data will be encrypted if your disk is also encrypted.

          True.

          and you don’t have any type of verified boot process

          How would the motherboard refusing to boot from another drive protect anything?

      • Redjard@lemmy.dbzer0.com

        The TPM isn’t all that reliable. You will have people upgrading their PC, or Windows Update updating their BIOS, or any number of other things resetting their TPM keys, and today nothing much happens when they do. If Signal’s keys were TPM-bound, people would see Signal completely break and lose all their data, often seemingly for no reason.

        Talking to Windows, or through it to the TPM, also seems sketchy.

        In the current state of Windows, the sensible choice is to leave hardware-based encryption to the OS in the form of disk encryption, unfortunate as it is. The great number of people who lose data, or have to recover their backup disk encryption key from their Microsoft account, shows how easily that system is disturbed (and that Microsoft holds the decryption keys for your encrypted data).

    • AlexWIWA@lemmy.ml

      Mfw end to end can be compromised at the end.

      That said, they should fix this anyway

    • uis@lemm.ee

      Indeed, end-to-end encryption protects data between the ends, not the ends themselves. If an end is compromised, no math will help you.

      • poVoq@slrpnk.net

        If your system is compromised to such an extent, it really doesn’t make much difference how the keys are stored at rest.

        • phoneymouse@lemmy.world

          If the keys are accessible to any process, your system doesn’t need to be compromised. All it takes is an app that you “trust” to break that trust and snatch everything up. Meta has already been caught fucking around with other social media apps on-device. They even intercepted Snapchat traffic on some users’ devices in order to collect that data. It could be as simple as: you installed WhatsApp, and it went and pillaged your Signal files.

          • NekuSoul@lemmy.nekusoul.de

            All it takes is an app that you “trust” to break that trust

            I get what you’re trying to say, but that’s something I’d classify as “compromised” as well.

            • phoneymouse@lemmy.world

              For sure, just suggesting that “compromised” doesn’t necessarily mean you got hacked because someone tricked you into giving up a password, or scraped it from another website, or you installed something sketchy. It could be as simple as Microsoft scanning all your files with AI, or Meta snooping on other social media (which it has been caught doing).

              • Zpiritual@lemm.ee

                So you’re saying that the OS itself is compromised? Gee, good luck protecting your processes from the fucking OS, no matter how you do it.

  • x1gma@lemmy.world

    How in the fuck are people actually defending Signal for this, and with stupid arguments such as “Windows is compromised out of the box”?

    You. Don’t. Store. Secrets. In. Plaintext.

    There is no circumstance where an app should store its secrets in plaintext, and there is no secret which should be stored in plaintext. Especially since this is not some random dude’s random project, but a messenger claiming to be secure.

    Edit: “If you got malware then this is a problem anyway and not only for Signal” - no, because if secure means of storing secrets are used, then they are encrypted or not easily accessible to the malware, and require far more resources to obtain. As it stands, someone would only need to start a process on your machine. No further exploits, no malicious signatures, no privilege escalations.

    “You need device access to exploit this” - there is no exploiting, just reading a file.

    • lemmyvore@feddit.nl

      You. Don’t. Store. Secrets. In. Plaintext.

      SSH stores its secret keys in plaintext too, in a home dir accessible only by the owning user.

      I won’t speak about Windows, but on Linux and other Unix systems the presumption is that if your home dir is compromised, you’re fucked anyway. Effort should be spent on actually protecting access to your personal files, not on security theater.

        • Mubelotix

          Come on, 95% of users don’t set passwords on their ssh keys

        • dave@programming.dev

          Well yes, but also how would users react if they had to type in their passphrase every time they open the app? This is also exactly what we’re giving up everywhere else by clicking ‘remember this device’.

        • lemmyvore@feddit.nl

          If someone gets access they can delete your keys, or set up something that can intercept your keys in other ways.

          The security of data at rest is just one piece of the puzzle. In many systems the access to the data is considered much more important than whether the data itself is encrypted in one particular scenario.

      • x1gma@lemmy.world

        Kinda expected the SSH key argument. The difference is the average user group.

        The average dude with an SSH key that’s used for more than their RPi knows a bit about security, encryption and opsec. They would have a passphrase and/or hardening mechanisms for their system and network in place. They know their risks and potential attack vectors.

        The average dude who downloads a desktop app for a messenger that advertises itself as secure and E2EE encrypted probably won’t assume that any process might just wiretap their whole “encrypted” communications.

        Let’s not forget that the threat model has changed a lot in recent years, and a lot of effort has gone into providing additional security measures and best practices. Using a secure credential store, adding encryption, and not storing plaintext secrets are a few simple ones. And sure, on Linux the SSH key is still a plaintext file. But it’s your deliberate decision to keep it as plaintext. You can at least encrypt it with a passphrase. You can rely on the file permission model of Linux, and SSH will refuse to use your key if the permissions are too loose. On Windows and Mac you would do the same and use a credential store and an agent to securely store and use your keys.

        Even though your SSH key is a plaintext file and there is a presumption of a secure home dir, you still wouldn’t keep a ~/passwords.txt.

    • Possibly linux@lemmy.zip

      If someone has access to your machine, you are screwed anyway. You need to store the encryption key somewhere.

      • x1gma@lemmy.world

        Yes: in your head, and in your second factor if possible, keeping derived secrets always encrypted at rest, decrypting them at the latest possible moment, and not keeping decrypted secrets in memory for longer than absolutely necessary at the point of use.

    • refalo@programming.dev

      How in the fuck are people actually defending Signal for this

      Probably because Android (at least) already uses file-based encryption, and the files stored by apps are not readable by other apps anyways.

      And if people had to type in a password every time they started the app, they just wouldn’t use it.

      • Liz@midwest.social

        Popular encrypted messaging app Signal is facing criticism over a security issue in its desktop application.

        Emphasis mine.

        • ChapulinColorado@lemmy.world

          I think the point is that the developers might have just migrated the code without adjustments, since that is how it was implemented before. Similar to how PC game ports sometimes run like shit because they are a close 1:1 of the original, which is not always the most optimized or ideal, but is the quickest to ship.

          • x1gma@lemmy.world

            It’s been a few days since I used Electron, but AFAIK Electron can’t be used as a wrapper for Android apps, or can it? Or is their Android app a web app wrapped into a “native” Android app too?

            Also, since this has apparently been an issue since 2018, six years should have been plenty of time to rewrite it using native secure storage…

      • uis@lemm.ee

        AFAIK Android encrypts the entire fs with one key. And ACLs are not encryption.

    • uis@lemm.ee

      You. Don’t. Store. Secrets. In. Plaintext.

      Ok. Enter password at every launch.

      • x1gma@lemmy.world

        Chrome cookies are encrypted, for exactly the reasons stated. If malware gains access to your system and compromises it to the point that it can replicate DPAPI calls the way Chrome makes them, then your sessions will also be compromised. But this is much harder to do, and it at least prevents trivial data exfiltration.

  • Mubelotix

    Sure, I was aware. You have the same problem with SSH keys, GPG keys, and many other things.

  • Borna Punda@lemmy.zip

    The backlash is extremely idiotic. The only two options are to store it in plaintext or to have the user enter the decryption key every time they open it. They opted for the more user-friendly option, and that is perfectly okay.

    If you are worried about an outsider extracting it from your computer, then just use full disk encryption. If you are worried about malware, they can just keylog you when you enter the decryption key anyways.

    • x1gma@lemmy.world

      The third option is to use the native secret vault. macOS has its Keychain, Windows has DPAPI, and Linux has non-standardized options available depending on your distro and setup.

      Full disk encryption does not help against data exfiltration; it only helps if an attacker gains physical access to your drive without your decryption key (e.g. a stolen device, or an attempt to access it in your absence).

      Even assuming that your device is compromised by an attacker, using safer storage mechanisms at least gives you time to react to the attack.
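
      As a rough illustration of that third option (not Signal’s code; the service and account names are placeholders), a desktop app built on Node/Electron could talk to the native vault via a library like keytar, which uses the macOS Keychain, the Windows Credential Manager, and libsecret on Linux:

        // Rough sketch: keep the database key in the OS secret vault instead of a file.
        // 'org.example.messenger' and 'db-key' are placeholder identifiers.
        import * as keytar from 'keytar';

        async function saveKey(hexKey: string): Promise<void> {
          await keytar.setPassword('org.example.messenger', 'db-key', hexKey);
        }

        async function readKey(): Promise<string | null> {
          return keytar.getPassword('org.example.messenger', 'db-key');
        }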

    • Zak@lemmy.world

      The alternative is safeStorage, which uses the operating system’s credential management facility if available. On macOS, and sometimes on Linux, this means another process running in the user’s account is prevented from accessing it. Windows doesn’t have protection against that, but all three systems do protect the credentials if someone copies the data offline.

      Signal should change this, but it isn’t a major security flaw. If an attacker can copy your home directory or run arbitrary code on your device, you’re already in big trouble.

    • refalo@programming.dev

      A better thing to be worried about IMO is that Signal contains proprietary code. Also to my knowledge nobody is publicly verifying the supposed “reproducible builds” if they even still exist.

  • HappyTimeHarry@lemm.ee

    That applies to pretty much all desktop apps; your browser profile can be copied to get access to all your already-logged-in cookie sessions, for example.

    • kryllic@programming.dev

      IIRC this is how those Elon Musk crypto livestream hacks worked on YouTube back in the day. I think the bad actors got hold of cached session tokens and gave themselves access to whatever account they were targeting. Linus Tech Tips had a good bit about it in a WAN Show episode.

    • douglasg14b@lemmy.world

      And there are ways to mitigate this attack (it’s essentially the same as AiTM or pass-the-cookie attacks, so look those up), thus rendering your argument invalid.

      Just because “something else might be insecure” doesn’t in any way imply that “everything else should be insecure as well”.

        • douglasg14b@lemmy.world

          That all hinges on the assumption that your computer is pwned. Which is wrong

          You don’t necessarily have to have privileged access to read files or exfiltrate information.

          That point doesn’t matter anyways though because you’re completely ignoring the risk here. Please Google “Swiss cheese model”. Your comment is a classic example of non-security thinking… It’s the same comment made 100x in this thread with different words

          Unless you can list out all possible risks and exploits that may affect this issue, you are not in a position to make judgement calls on the risk itself.

          • Possibly linux@lemmy.zip

            You act as though you somehow have more knowledge than everyone else. The problem is that you don’t understand encryption and permissions. You can’t just magically make something unreadable by programs with the same permission level. If you encrypt it, there needs to be a key to decrypt it. That key could conceivably be encrypted with a password, but that would require someone to enter a password; if they don’t, the key will be stored in plaintext, so anyone could easily decrypt your messages. Programs running as a user have the same permissions as that user. Does that make sense? You can’t just make something selectively unreadable with the current security model. I guess you could try to implement some sort of privileged daemon, but that would open up more issues than it solved.

            I would have a problem if Signal claimed that the desktop messages were encrypted at rest. However, they don’t make any such claim. If you are concerned about security, I would recommend running everything in virtual machines and Flatpaks. That way the chances of something misbehaving in a way that causes harm are minimized.

            • douglasg14b@lemmy.world

              I’m not claiming some grand level of knowledge here. I also cannot enumerate all risks. The difference is that I know that I don’t know, and I know the danger that poses in terms of cognitive biases, false confidence, and a lack of effective risk management. I’m a professional in an adjacent field, midway through pivoting into cybersecurity, and I used to think the same way; that’s why I’m so passionate here. It’s painful to see arguments and thought processes that run counter to the fundamentals of security and safety I’ve been learning these past few years. So, yeah, I’m going to call it out and try to inform.

              All that crap said:

              And you are right, the problem gets moved. However, that’s the point; that’s how standardization works, and how it’s supposed to work. It’s a force multiplier, and it smooths out the implementation. Moving the problem to the OS level means that EVERYONE benefits from advances in Windows/macOS/Linux. Automatically.

              It’s not Signal’s responsibility, and it shouldn’t be, unless that’s a problem they specifically aim to solve. They already have the tools available to them: Electron has a standardized API for this, safeStorage, which handles the OS interop for them.

              I’m not arguing that Signal needs to roll their own here. The expectation is that they, at least, utilize the OS-provided features made available to their software.

  • thayer@lemmy.ca

    While it would certainly be nice to see this addressed, I don’t recall Signal ever claiming their desktop app provided encryption at rest. I would also think that anyone worried about that level of privacy would be using disappearing messages and/or regularly wiping their history.

    That said, this is just one of the many reasons why whole disk encryption should be the default for all mainstream operating systems today, and why per-app permissions and storage are increasingly important too.

    • ooterness@lemmy.world

      Full disk encryption doesn’t help with this threat model at all. A rogue program running on the same machine can still access all the files.

      • thayer@lemmy.ca

        It does help greatly in general though, because all of your data will be encrypted when the device is at rest. Theft and B&Es will no longer present a risk to your privacy.

        Per-app permissions address this specific threat model directly. Containerized apps, such as those provided by Flatpak can ensure that apps remain sandboxed and unable to access data without explicit authorization.

    • BearOfaTime@lemm.ee

      Exactly.

      I’ll admit to being lazy and not enabling encryption on my Windows laptops. But if I deployed something for someone, it would be encrypted.

    • Zak@lemmy.world

      I don’t recall Signal ever claiming their desktop app provided encryption at rest.

      I’m not sure whether they’ve claimed that, but it does do that, using SQLCipher.
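
      For context, a simplified sketch of what opening a SQLCipher-encrypted database looks like from Node (assuming a SQLCipher-enabled better-sqlite3 build; this is not Signal’s actual code): the database itself is encrypted, but the key still has to come from somewhere, which is the file at issue here.

        // Simplified sketch: SQLCipher encrypts the database, but the caller must
        // supply the raw key, so where that key lives is the real problem.
        import Database from 'better-sqlite3';

        function openEncryptedDb(path: string, hexKey: string) {
          const db = new Database(path);
          db.pragma(`key = "x'${hexKey}'"`); // SQLCipher raw-key PRAGMA
          return db;
        }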

    • Tywèle [she|her]@lemmy.dbzer0.com

      Does encrypting your disks change anything for the end user in day-to-day usage? Honestly, I’ve never used encrypted disks in my life.

      • communism@lemmy.ml

        Whole disk encryption wouldn’t change your daily usage, no. It just means that when you boot your PC you have to enter your passphrase. And if your device becomes unbootable for whatever reason, and you want to access your drive, you’ll just have to decrypt it first to be able to read it/write to it, e.g. if you want to rescue files from a bricked computer. But there’s no reason not to encrypt your drive. I can’t think of any downsides.

        • lemmyvore@feddit.nl

          If any part of the data gets corrupted you lose the whole thing. Recovery tools can’t work with partially corrupted encrypted data.

          • communism@lemmy.ml

            I don’t think that’s a big deal with Signal data. You can log back into your account, you’d just lose your messages. idk how most people use Signal but I have disappearing messages on for everything anyway, and if a message is that important to you then back it up.

      • devfuuu@lemmy.world
        5 months ago

        It’s basically transparent for the end user, but it protects the laptop at least when you’re out and about and someone steals the computer, as long as it was properly shut down.

          • refalo@programming.dev

            I think they’re just referring to an outdated concept of OSes with non-journaling filesystems that can cause data corruption if the disk is shut off abruptly, which in theory could corrupt the entire disk at once if it was encrypted at a device level. But FDE was never used in the time of such filesystems anyways.

          • devfuuu@lemmy.world

            If you suspend the laptop when moving locations instead of shutting down or hibernating to disk then disk encryption is useless.

            • thayer@lemmy.ca

              Most operating systems will require your desktop password upon resume, and most thieves are low-functioning drug users who are not about to go Hacker Man on your laptop. They will most likely just wipe the system and install something else; if they can even figure that out.

      • refalo@programming.dev

        It depends on how you set it up. I think the default in some cases (like Windows Bitlocker) is to store the key in TPM, so everything becomes transparent to the user at that point, although many disagree with this method for privacy/security reasons.

        The other method is to provide a password or keyfile during bootup, which does change something for the end user somewhat.

      • thayer@lemmy.ca

        No, the average user will never know the difference. I couldn’t tell you exactly what the current performance impact is for hardware encryption, but it’s likely around 1-4% depending on the platform (I use LUKS under Linux).

        For gamers, it’s likely a 1-5 FPS loss, depending on your hardware, which is negligible in my experience. I play mostly first and third person shooter-style games at 1440p/120hz, targeting 60-90 FPS, and there’s no noticeable impact (Ryzen 5600 / RX 6800XT).

        • refalo@programming.dev

          For gamers, it’s likely a 1-5 FPS loss

          I highly doubt it… would love to see some hard data on that. Most algorithms used for disk encryption these days are already faster than RAM, and most games are not reading gigabytes/sec from the disk every frame during gameplay for this to ever matter.

        • ruse8145@lemmy.sdf.org

          If it has to go to disk for immediate loading of assets while playing a video game you’re losing more than 1-5 fps

          • refalo@programming.dev

            Maybe, but not every frame while you’re playing. No game is loading gigs of data every frame. That would be the only way most encryption algorithms would slow you down.

            • ruse8145@lemmy.sdf.org

              Yeah, I was thinking about that (edited to add “immediate”). Games are certainly background-loading nowadays, but the stuff that’s needed is intended to be in RAM by the time it’s needed, AFAIK.

            • gaylord_fartmaster@lemmy.world

              You’re more likely going to get stuttering or asset streaming issues which are going to have more impact than losing a few fps.

          • thayer@lemmy.ca

            Yeah, I’m sure there are a lot of variables there. I can only say that in my experience, I noticed zero impact to gaming performance when I started encrypting everything about 10 years ago. No stuttering or noticeable frame loss. It was a seamless experience and brings real peace of mind knowing that our financial info, photos, and other sensitive files are safely locked away.

            • ruse8145@lemmy.sdf.org

              For sure, I’m just saying I’d guess that’s because at play time you’re loading everything into RAM. For bulk loading I would expect encryption perf to follow the general use case.

              (Tldr encryption shouldn’t matter for games)

  • Prethoryn Overmind@lemmy.world

    Ah yes, another prime example that demonstrates that Lemmy is no different than Reddit. Everyone thinks they are a professional online.

    Nothing sensitive should ever lack encryption, especially in the hands of a third-party company managing your data while claiming you are safe and your privacy is protected.

    No one is invincible, and it’s okay to criticize the apps we hold in high regard. If you are pissed that people are shitting on Signal, you should be pissed that Signal gave people a reason to shit on them.

  • Dem Bosain@midwest.social

    Why is Signal almost universally defended whenever another security flaw is discovered? They’re not secure, they don’t address security issues, and their business model is unsustainable in the long term.

    But, but, if you have malware “you have bigger problems”. But, but, an attacker would have to have “physical access” to exploit this. Wow, such bullshit. Do some of you people really understand what you’re posting?

    But, but, “windows is compromised right out of the box”. Yes…and?

    But, but, “Signal doesn’t claim to be secure”. Fuck off, yes they do.

    But, but, “just use disk encryption”. Just…no…WTF?

    Anybody using Signal for secure messaging is misguided. Any one of your recipients could be using the desktop app, and there’s no way to know unless they tell you. On top of that, all messages filter through Signal’s servers, adding a single point of failure to everything. Take away the servers, no more Signal.

    • Zak@lemmy.world

      If someone can read my Signal keys on my desktop, they can also:

      • Replace my Signal app with a maliciously modified version
      • Install a program that sends the contents of my desktop notifications (likely including Signal messages) somewhere
      • Install a keylogger
      • Run a program that captures screenshots when certain conditions are met
      • [a long list of other malware things]

      Signal should change this because it would add a little friction to a certain type of attack, but a messaging app designed for ease of use and mainstream acceptance cannot provide a lot of protection against an attacker who has already gained the ability to run arbitrary code on your user account.

      • douglasg14b@lemmy.world

        Not necessarily.

        https://en.m.wikipedia.org/wiki/Swiss_cheese_model

        If you read anything, at least read this link to self correct.


        This is a common area where non-security professionals out themselves as not actually being such: broken, fallacious reasoning about security risk management. Generally the same “dismissive security by way of ignorance” premises.

        It’s fundamentally the same as “safety” (think OSHA and the CSB): the same thought processes, the same risk models, the same risk factors, etc.

        And similarly the same negligence towards filling in holes in your “swiss cheese model”.

        “Oh that can’t happen because that would mean x,y,z would have to happen and those are even worse”

        “Oh that’s not possible because A happening means C would have to happen first, so we don’t need to consider this is a risk”

        …etc

        The logic you’re using is the same logic the industry has decades of evidence against.

        Decades of evidence indicating that you are wrong, you know infinitely less than you think you do, and you most definitely are not capable of exhaustively enumerating all influencing factors. No one is. It’s beyond arrogant for anyone to think that they could 🤦🤦 🤦

        Thus, most risks are considered valid risks (this doesn’t necessarily mean they are all mitigatable though). Each risk is a hole in your model. And each hole is in itself at a unique risk of lining up with other holes, and developing into an actual safety or security incident.

        In this case

        • Signal was alerted to this over 6 years ago
        • the framework they use for the desktop app already has built-in features for this problem.
          • this is a common problem with common solutions that are industry-wide.
        • someone has already made a pull request to enable the Electron safeStorage API. And Signal has ignored it.

        Thus this is just straight up negligence on their part.

        There’s not really much in the way of good excuses here. We’re talking about a run-of-the-mill problem that has baked-in solutions in most major frameworks, including the one Signal uses.

        https://www.electronjs.org/docs/latest/api/safe-storage
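
        Purely as an illustration of the kind of change being discussed (this is not the actual pull request; the file and field names are invented), a startup migration that re-wraps a previously plaintext key with safeStorage might look roughly like this:

          // Illustrative only: if the key is still stored as plaintext in the config,
          // wrap it with the OS credential store and rewrite the file.
          import { safeStorage } from 'electron';
          import { promises as fs } from 'fs';

          interface Config { key?: string; encryptedKey?: string }

          async function migrateKey(configPath: string): Promise<void> {
            const config: Config = JSON.parse(await fs.readFile(configPath, 'utf8'));
            if (config.key && safeStorage.isEncryptionAvailable()) {
              config.encryptedKey = safeStorage.encryptString(config.key).toString('hex');
              delete config.key;
              await fs.writeFile(configPath, JSON.stringify(config, null, 2));
            }
          }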

        • fuzzzerd@programming.dev

          I was just nodding along, reading your post and thinking, yup, agreed, until I saw there was a PR to fix it that Signal ignored. That seems odd, and there must be some mitigating circumstance explaining why they haven’t merged it.

          Otherwise that’s just inexcusable.

      • gomp@lemmy.ml

        Those are outside Signal’s scope and depend entirely on your OS and your (or your sysadmin’s) security practices (e.g. I’m almost sure that on Linux you need extra privileges for those things on top of just read access to the user’s home directory).

        The point is, why didn’t the Signal devs code it the proper way and obtain the credentials every time (interactively from the user or automatically via the OS password manager) instead of just storing them in plain text?

        • douglasg14b@lemmy.world

          They’re arguing a red herring. They don’t understand security risk modeling, and arguing about Signal’s scope lets their broken premise dig deeper. It’s fundamentally flawed.

          It’s a risk and should be mitigated using common tools already provided by every major operating system (ie. Keychain).

          • Liz@midwest.social

            “Highways shouldn’t have guard rails because if you hit one you’ve already gone off the road anyway.”

        • Zak@lemmy.world

          You’d need write access to the user’s home directory, but doing something with desktop notifications on modern Linux is as simple as

          dbus-monitor "interface='org.freedesktop.Notifications'" | grep --line-buffered "member=Notify\|string" | [insert command here]

          Replacing the Signal app for that user also doesn’t require elevated privileges unless the home directory is mounted noexec.

          • gomp@lemmy.ml

            I don’t see the reasoning in your answer (I do see its passive-aggressiveness, but chose to ignore it).

            I asked “why?”; does your reply mean “because lack of manpower”, “because lack of skill” or something else entirely?

            In case you are new to the FOSS world: being “open source” doesn’t mean that something cannot be criticized, or that people without the skill (or time!) to submit PRs must shut the fu*k up.

    • Todd Bonzalez@lemm.ee

      Anybody using Signal for secure messaging is misguided. Any one of your recipients could be using the desktop app and there’s no way to know unless they tell you.

      That’s why I only communicate face-to-face inside of a soundproofed faraday cage.

      If the app manages the keys, then you can’t trust the app.

      If the recipient manages their own keys, then you can’t trust the recipient.

      Encryption is fundamentally insecure. Once I encrypt something, nobody should be able to decrypt it ever again.

    • refalo@programming.dev

      98% of desktop apps (at least on Windows and Linux) are already broken by design anyway. Any one app can spy on and keylog all other apps, all your home folder data, everything. And anyone can write a desktop app, so only using solutions that (currently) don’t have a desktop app version seems silly to me.

    • lemmyvore@feddit.nl

      Now replace “signal” in your comment with “ssh” and think it over.

    • Dessalines@lemmy.ml

      Basically for the same reason people often defend Apple: the user interface is shiny, and they claim to be privacy-oriented.

      Signal is a centralized, US-hosted service; that alone should be enough to disqualify it, leaving aside our many other criticisms.

    • uis@lemm.ee

      But, but, “just use disk encryption”. Just…no…WTF?

      So not encrypting keys is bad, but actually encrypting them is bad too? Ok.

      Any one of your recipients could be using the desktop app, and there’s no way to know unless they tell you.

      Another Apple fan? How is THIS supposed to be in scope of E2EE? Moreover, how is having a way to know whether a recipient is using the desktop app not the opposite of privacy?

      On top of that, all messages filter through Signal’s servers, adding a single-point-of-failure to everything. Take away the servers, no more Signal.

      Indeed. This is why I use Matrix. Also, fuck showing phone numbers to everyone (I heard they did something about it) and requiring registration with phone numbers.

      • Dem Bosain@midwest.social

        Any “secure” solution that relies on someone else for security is not secure.

        Fuck the scope of E2EE. Signal makes a lot of claims on their website that are laughable. The desktop app is their main weakness. Attachments are stored unencrypted, keys in plaintext. If they were serious about security, they would deprecate the Windows app and block it from their servers.

        WTF does Apple have to do with anything?

        • uis@lemm.ee

          Any “secure” solution that relies on someone else for security is not secure.

          Fuck the scope of E2EE.

          When someone has an FSB/NSA agent behind them reading their messages, no amount of encryption will help. The biggest cybersecurity vulnerability is located between the monitor and the chair. When you are texting someone else, that someone’s monitor-chair space is also vulnerable.

          Signal makes a lot of claims on their website that are laughable.

          Well, maybe. I haven’t read their claims, nor do I use Signal.

          Attachments are stored unencrypted, keys in plaintext.

          Does OS-level encryption count, or is that still “plaintext”? If it counts, then they are encrypted, provided the user enables that feature in the OS. If it doesn’t, then nothing is fundamentally encrypted.

          If they were serious about security, they would deprecate the Windows app and block it from their servers.

          WTF does Apple have to do with anything?

          You just used the Apple fans’ argument. Yeah, I wonder what.

    • SeattleRain@lemmy.world

      What app stops a pre-installed keylogger? I’m all for hearing criticism of Signal, but it’s always about things they can’t control.

  • jsomae@lemmy.ml

    The real problem is that the security model for apps on mobile is much better than that for apps on desktop. Desktop apps should all have private storage that no other non-root app can access. And while we’re at it, they should have to ask permission before activating the mic or camera.

    • Pussista@sh.itjust.works

      macOS has nailed it*, even though it’s still not as good as iOS or Android, but it’s leaps and bounds better than Windows and especially Linux.

      ETC: *sandboxing/permission system

        • Pussista@sh.itjust.works

          It’s a joke. Apps have defined permissions already allowed on install and some of them have too many things set to allow like home or host access. Also, changing any permission requires restarting the app. It’s heading in the right direction, but it has a looooong way to go to catch up with macOS, let alone Android and iOS.

      • tmpod@lemmy.pt

        What does Windows do? Genuine question, I’ve not used it since the Windows 7 days. Regarding Linux, that’s true for stuff installed through regular package managers and whatnot, but Flatpak is pushing a more sandboxed and permission-oriented system, akin to Android.

        • ruse8145@lemmy.sdf.org

          You have granular control over Universal Windows apps (i.e. Windows 8+ apps), one global lock over all desktop apps (non-UWP), and one global lock over everything. It’s pretty solid considering how little control Microsoft has and its wonderful fetish for compatibility.

          TL;DR: basically the same as Linux, except app distribution on Linux was bad enough for so long that more stuff is in the new restricted format, while Windows still has tons of things that will never go away and aren’t in the sandbox. I think not finding a way to sandbox all desktop apps was a mistake.

    • refalo@programming.dev

      Firejail and bwrap. Flatpaks. There are already ways to do this, but I only know of one distro that separates apps by default like Android does (separate user per app), which is the brand new “EasyOS”.

  • ExtremeDullard@lemmy.sdf.org
    5 months ago

    Whatever it stores and however it stores it doesn’t matter to me: I moved its storage space to my ~/.Private encrypted directory. Same thing for my browser: I don’t use a master password or rely on its encryption, because I set it up so that it too saves my profile in the ~/.Private directory.

    See here for more information. You can essentially secure any data saved by any app with eCryptfs - at least when you’re logged out.

    Linux-only of course. In Windows… well, Windows.

    • uis@lemm.ee

      Or ext4 encryption, which is overpowered: you can have different keys for different files and directories.

  • mlg@lemmy.world

    Bruh, Windows and Linux have a secrets vault (Credential Manager and the keyring respectively, IIRC) for this exact purpose.

    Even Discord uses it on both OSes, no problem.

  • sntx@lemm.ee

    I have three things to say:

    1. Everyone, please make sure you’ve set up sound disk encryption
    2. That’s not a surprise (for me at least)
    3. It’s not much different on mobile (the db is unencrypted) - check out Molly (a Signal fork) if you want to encrypt it. However, an encrypted db means no messages until you decrypt it.

  • Majestic@lemmy.ml

    There is just no excuse for not even salting or SOMETHING to keep the secrets out of plaintext. The reason you don’t store things in plaintext is that it can lead to even incidental collection. Say you have some software, perhaps spyware (perhaps it’s made by a major corporation, so it doesn’t get called that), and it crawls around and happens to upload a copy of all or part of the file containing this info. Now it’s been uploaded and compromised, potentially not even by a malicious actor successfully gaining access to a machine, but by poor practices.

    No, it can’t stop sophisticated malware specifically targeting Signal to steal credentials and gain access, but it does mean casual malware that hasn’t taken the time to write a module for that is out of luck, and it increases the burden on attackers. No, it won’t stop the NSA, but it does stop someone’s 17-year-old niece, who knows a little bit about computers but is no malware author, from gaining access to your Signal messages and account by watching a YouTube video and following along with simple tools.

    The claims that Signal is an op, or that its operator is under a national security letter ordering it to be compromised, look more and more plausible in light of weird, bad basic practices like this and their general hostility. I’ll still use it, and it’s far from the worst-looking thing out there, but there’s something unshakably weird about the lead dev and their behavior and practices that can’t be written off as merely being a bit quirky.

      • Majestic@lemmy.ml
        link
        fedilink
        arrow-up
        1
        ·
        5 months ago

        I mean combined with any kind of function, even a trivial kind. A salt derived from some machine state data (a random install id generated on install, a hash of computer name, etc) plus a rot13 or something would still be better than leaving it plaintext.

        • uis@lemm.ee

          Malware has access to it.

          If the fs is not encrypted, then malicious hardware (an FSB agent’s laptop) also has access to it. If it is encrypted, then we are back to the point many people have made here about encrypting the fs.

          plus a rot13

          That’s not salting.

    • notannpc@lemmy.world

      Obviously the keys could be stored more securely, but if you’ve got malware on your machine that can exploit this you’ve already got bigger problems.

      • douglasg14b@lemmy.world

        That’s not how this works.

        This sort of “dismissive security through ignorance” is how we get so many damn security breaches these days.

        I see this every day with software engineers, a group that you would think would be above the bar on security. Unfortunately a little bit of knowledge results in a mountain of confidence (see Dunning Kruger effect). They are just confident in bad choices instead.

        We don’t need to use encryption at rest because if the database is compromised we have bigger problems” really did a lot to protect the last few thousand companies from preventable data exfiltration that was in fact the largest problem they had.

        Turns out that having read access to the underlying storage for the database doesn’t necessarily mean that the database and all of your internal systems are also compromised. It just means that the decision makers were making poor decisions based on a lack of risk-modeling knowledge.


        That said the real question I have for you here is:

        Are you confident enough in your omniscience to enumerate all risks and attack vectors that could result in data being exfiltrated from a device?

        If not, then why comment as if you are?