As Steve wrote last week, Small World News is developing a new mobile multimedia reporting app for Android. This app will also include an entire mobile multimedia reporting curriculum, including journalism and digital safety and security basics. We will be adapting our Guide to Safely Producing Media where appropriate and working with our colleagues John Smock and Mark Rendeiro to produce the photo and audio reporting lessons.
I have been tasked with producing the security curriculum for SWN's new mobile app. Because of that, I've spent quite a bit of time the last few weeks reading a variety of existing digital security and safety guides.
How much security or privacy is "enough" tends to raise consistent and sometimes rancorous debates within the relatively small community of digital security and safety researchers and trainers. Combine this with the large number of safety and security guides currently available, and the lack of any central review mechanism, and you start to have an idea why this has been a tough few weeks.
While reviewing the tools available, and what the guides say about them, I've come to realize there are potentially fatal flaws even in a few relatively well-regarded tools. The curriculum I'm working on is focused on mobile-based reporting, so most of the tools and techniques I've been reviewing are related to mobile, and all are related to digital/online communications.
My observations thus far are that the issues with training users to improve the security and safety of their communications generally fall into a few camps.
Security issues related to the "design" of a tool.
Recently I realized that TextSecure, the most well-regarded secure texting app for Android, has a serious security hole, due to the application being designed for "usability." TextSecure is a tool that saves your text messages in an encrypted form, and allows two users of the app to send encrypted messages to each other that are decrypted on the recipient's device.
To increase the usability of the app, it has a feature described by one of the developers as "in focus." Understandably, a user of TextSecure would not want the passphrase used to decrypt a message to stop functioning while the message is being read, and thus "in focus." Unfortunately, as the app currently functions, if I lock my screen while reading a message, or while the list of all available messages is open, the passphrase cache does not clear based on the preset timeout.
This is a huge issue because it also means that if an activist or journalist is detained in the middle of reading, reviewing, or deleting texts, the most natural action is to enable the screen lock. In the case of TextSecure, you have to quit the program before enabling the screen lock if you want to release your passphrase.
This is a case where a tool being designed for usability actually puts the user at risk, and the technique necessary to ensure safety is not intuitive and thus has low usability.
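As a rough illustration of the safer alternative, here is a minimal Java sketch of how an Android app could clear a cached passphrase the moment the screen turns off, instead of relying on the user to quit the app first. This is not TextSecure's code: PassphraseCache is a hypothetical stand-in for whatever class actually holds the decrypted key material, and only the Android broadcast APIs shown are real.

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public class ScreenLockReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // Hypothetical cache of decrypted key material; wipe it the
            // instant the screen goes dark, regardless of any "in focus" state.
            PassphraseCache.clear();
        }
    }

    // ACTION_SCREEN_OFF cannot be declared in the manifest, so the receiver
    // has to be registered at runtime, e.g. by the Activity that displays
    // the decrypted messages.
    public static void register(Context context) {
        context.registerReceiver(new ScreenLockReceiver(),
                new IntentFilter(Intent.ACTION_SCREEN_OFF));
    }
}

With something like this in place, locking the phone while a message is open would clear the passphrase rather than silently keeping it cached.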
Although TextSecure is a tool designed to increase a user's privacy and security, the tendency for non-security tools and technologies is to prioritize usability above all else.
Security issues related to the simplicity of a tool, or users' clarity as to how it works.
The largest general threat to users' safety is a lack of understanding of how a tool works. Sometimes a widely recommended tool has a very simple flaw that is easily overlooked by the user.
TrueCrypt is a tool for encrypting hard drives that is relatively well accepted by the security community across the board. However, TrueCrypt has a fairly simple flaw that technologists may generally be aware of, but that is often overlooked in the training literature. According to TrueCrypt's site:
When a file-hosted TrueCrypt container is stored in a journaling file system (such as NTFS), a copy of the TrueCrypt container (or of its fragment) may remain in the free space on the host volume. This may have various security implications. For example, if you change the volume password/keyfile(s) and an adversary finds the old copy or fragment (the old header) of the TrueCrypt volume, he might use it to mount the volume using an old compromised password (and/or using compromised keyfiles that were necessary to mount the volume before the volume header was re-encrypted). Some journaling file systems also internally record file access times and other potentially sensitive information. If you need plausible deniability, you must not store file-hosted TrueCrypt containers in journaling file systems. To prevent possible security issues related to journaling file systems, do one of the following:
• Use a partition/device-hosted TrueCrypt volume instead of file-hosted.
• Store the container in a non-journaling file system (for example, FAT32).
That's a long way of saying that if you want to use a file-hosted TrueCrypt container safely, you need to be sure the drive where it lives uses a non-journaling file system such as FAT32.
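For trainers who want to check this before a container is even created, here is a rough Java sketch, using the standard java.nio.file API, that reports whether the directory where a container would live sits on a journaling file system. The list of "safe" file system names is an assumption for illustration only, and the type strings returned vary by operating system.

import java.io.IOException;
import java.nio.file.FileStore;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class ContainerLocationCheck {
    // Non-journaling file systems generally considered safe for file-hosted
    // containers; an illustrative list, not an exhaustive one.
    private static final Set<String> NON_JOURNALING =
            new HashSet<>(Arrays.asList("fat", "fat32", "vfat", "msdos", "exfat"));

    public static void main(String[] args) throws IOException {
        // Check the directory that will hold the container, since the
        // container file itself may not exist yet.
        Path directory = Paths.get(args[0]).toAbsolutePath();
        FileStore store = Files.getFileStore(directory);
        String fsType = store.type().toLowerCase();
        if (NON_JOURNALING.contains(fsType)) {
            System.out.println("OK: " + fsType + " is non-journaling.");
        } else {
            System.out.println("Warning: " + fsType + " appears to be a journaling "
                    + "file system; old volume headers may survive in free space.");
        }
    }
}

Running it against the folder you plan to keep a container in gives a quick sanity check before any sensitive material is written to disk.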
Until recently, I was not aware of this serious issue. The Protektor Services and Front Line Defenders digital privacy manuals do not cover it, yet TrueCrypt is now considered a standard tool by most organizations doing training. I recently assisted with a training on TrueCrypt and was still unaware of the flaw. This tells me that even if the issue was covered (I am fairly certain it was not), it wasn't covered well enough that I, a relatively knowledgeable user, picked up on it.
When some of the most widely accepted tools have serious flaws, and the very simple solutions that exist are not made clear in the standard security training literature, we have a problem.
Security issues related to the usability/accessibility and actual functionality of tools for common users.
This is the elephant in the room. Users will not adopt tools that lack usability, no matter how high their need for security. This fact appears to have been key to the creation of Cryptocat, a relatively new security tool that has recently spent quite a bit of time in the media spotlight.
The problem with Cryptocat, as many have pointed out, is that its usability may lead directly to broad user adoption, while users' lack of understanding of how the technology works means they won't be aware of the threats posed by relying on Cryptocat.
Cryptocat aims to "Provide Universally Accessible Encrypted Instant Messaging." Unfortunately, at the moment Cryptocat does this primarily through the use of SSL, which is far from a guaranteed measure of security. It also does not "anonymize you," nor "protect against keyloggers" or "untrustworthy people."
The good news is that Cryptocat seems to be providing a good lesson to the community. There has been a really engaging discussion on Stanford's LiberationTech mailing list regarding the responsibilities of security researchers and trainers to the wider community of users. Nadim, the lead developer of Cryptocat, is taking on the difficult task of developing a cryptography project that is usable, clear to the user, and truly secure.
Lastly, in the world of mobile security, individuals seem to be increasingly aware of the manifold risks posed by their phones and smartphones. However, there is one major flaw that is seldom discussed: how difficult the storage technology in smartphones makes it to reliably erase data. As my colleague Nathan Freitas put it recently:
Training orgs must ensure that they teach people "How to smash a smartphone into a thousand pieces using a heavy lamp and flush it down the toilet," as standard curriculum.
Fortunately, there's another option for users of the latest Android OS phones: Full Disk Encryption. The implementation combines usability, clarity for the user, and high functionality. These factors all combine to ensure that Full Disk Encryption can and should be recommended as a standard practice for activists, journalists, and privacy advocates who have access to phones running Android 4.0.
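For app developers, the state of device encryption can even be checked programmatically. The sketch below, written against the standard DevicePolicyManager API introduced in Android 3.0, checks whether full disk encryption is active and, if not, opens the system screen that walks the user through enabling it. Treat it as an illustration of the API rather than production code.

import android.app.Activity;
import android.app.admin.DevicePolicyManager;
import android.content.Context;
import android.content.Intent;
import android.os.Build;

public class EncryptionCheck {

    // Returns true if full disk encryption is already active; otherwise
    // launches the system flow that lets the user turn it on.
    public static boolean ensureFullDiskEncryption(Activity activity) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.HONEYCOMB) {
            return false; // storage encryption APIs are unavailable before Android 3.0
        }
        DevicePolicyManager dpm =
                (DevicePolicyManager) activity.getSystemService(Context.DEVICE_POLICY_SERVICE);
        int status = dpm.getStorageEncryptionStatus();
        if (status == DevicePolicyManager.ENCRYPTION_STATUS_ACTIVE) {
            return true;
        }
        if (status != DevicePolicyManager.ENCRYPTION_STATUS_UNSUPPORTED) {
            // Hands off to the system settings screen; the user still has to
            // confirm and keep the device charged while it encrypts.
            activity.startActivity(new Intent(DevicePolicyManager.ACTION_START_ENCRYPTION));
        }
        return false;
    }
}

A trainer-facing app could call something like this on first launch and remind the user to enable encryption before storing any sensitive material on the device.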
I hope the curriculum we are producing will meet these requirements as well. Moving forward, I'll focus on a simple test to determine to what degree a manual or tool provides for the safety of the user. Measuring the usability, clarity, and actual secure/private/anonymous functionality of a tool should determine the degree to which one depends on it for safety.
Usability + Clarity + Functionality = Safety.
Source: http://smallworldnews.tv/featured/how-much-security-is-enough/