Security Saturday!
Personally, I hate it when people get all muddle-minded and think that their political opinion matters just because they have a platform in some other field. I have the wonderful privilege of your audience, and I thank you for it. What I will talk about today applies to both security and politics, but will not touch any sane political framework. Your beliefs are safe; this will not be politically charged. It will just hint at a mental tool that is also usable for politics.
If what I write here today seems at odds with your politics or beliefs, you are welcome to comment. It does not mean that your politics or beliefs are wrong, it probably means they are not articulated well. Because what we talk about today is: Darkness versus Light.
Classic, I know.
In software development there are terms like white-box, grey-box and black-box testing (which I touched upon in earlier posts). White-box means you have access to the code and the program in action. Grey-box means you have a blueprint or general description of the program, and access to the program in action. Black-box means you have no real idea about what should be going on; you only have the program in action itself.
In security we can do several similar things when securing applications. We can have a veil of darkness (call it encryption of some form), a glorious daylight (no encryption whatsoever) or a dull grey (encryption where it matters). Similarly with our politics and beliefs we can have a veil of darkness (we don't know what we believe other than intuitively), a glorious daylight (we know exactly how things are, and no-one can say otherwise) or a dull grey (we know what we believe, but some things are clear and others are not).
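To make the "dull grey" of encrypting where it matters a bit more concrete, here is a minimal sketch of field-level encryption: only the sensitive fields of a record are encrypted, the rest stays readable. The field names are invented for illustration, and the cipher is a toy repeating-key XOR so the example stays dependency-free; a real system would use a vetted library (for example the `cryptography` package's Fernet).

```python
import json
from base64 import b64encode, b64decode

def toy_encrypt(plaintext: str, key: bytes) -> str:
    """Toy stand-in for a real cipher: repeating-key XOR, base64-encoded.
    Do NOT use this for real data; it only illustrates the selective idea."""
    data = plaintext.encode()
    out = bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    return b64encode(out).decode()

def toy_decrypt(ciphertext: str, key: bytes) -> str:
    data = b64decode(ciphertext)
    out = bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    return out.decode()

# Hypothetical field names: encrypt only what actually matters.
SENSITIVE_FIELDS = {"ssn", "diagnosis"}

def protect(record: dict, key: bytes) -> dict:
    """Encrypt the sensitive fields; leave the rest in daylight."""
    return {
        k: toy_encrypt(v, key) if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

record = {"name": "Alice", "city": "Utrecht", "ssn": "123-45-6789"}
protected = protect(record, key=b"demo-key")
print(json.dumps(protected))  # name and city stay readable, ssn does not
```

The point of the grey approach is operational: the readable fields keep the record searchable and debuggable, while the veil only covers what would actually cause harm if disclosed.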
In all of these situations I am going to argue that the grey is the best place to be. The best security comes from knowing what you know, and being aware of what you do not know. Knowing everything is often a maze of rules and white rabbits, while knowing nothing leaves you without direction.
A practical example is perusing logs for something that you know happened, while trying to estimate its impact. At work we had this when we discovered a bug: when someone pressed "agree" on a warning message, it triggered the exact opposite behaviour of what the warning message said it would. Since this was in our access control subsystem, it meant that information was disclosed to people who should not see it, and so we needed to know which users were affected, so they could be informed and the damage remedied. That was when we discovered that we did not know what we did not know; it was a black box. We had logs, but they were extensive, and the exact confirmation action itself was not logged properly because of a rewrite of some legacy code (an issue unto itself). In the end we derived the information we needed from some other logs, but it took a while and was far from painless. Needless to say, we staged a fix that will give us more log precision in the next version of our application.
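The kind of precision we were missing can be sketched in a few lines: log both what the user clicked AND what the system actually did with that click, as one structured record. The event and field names below are invented for illustration; they are not our real schema.

```python
import json
import logging

# A plain audit logger; in practice this would ship to a log aggregator.
audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
audit.addHandler(logging.StreamHandler())

def confirmation_event(user_id: str, dialog_id: str,
                       choice: str, effect: str) -> dict:
    """Build one structured record tying the click to its actual effect."""
    return {
        "event": "warning_confirmation",
        "user": user_id,
        "dialog": dialog_id,
        "choice": choice,          # what the user pressed
        "applied_effect": effect,  # what the system then actually did
    }

record = confirmation_event("u123", "access-warning-7",
                            "agree", "access_granted")
audit.info(json.dumps(record))
```

Recording the applied effect next to the choice is the part that would have saved us: a mismatch between the two is exactly the bug, and a grep over these records would have produced the affected-users list directly.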
A practical example in the other direction comes from my personal experience at work (with this one I was directly involved). I am not a developer at my current job, but an analyst. This means that I am responsible for keeping an eye on logs and other data sources to make sure that things are not derailing. I am also responsible for following up on antivirus hits and phishing emails, and for documenting past incidents. In this line I was asked to find out why a specific email was not being delivered to a customer.
After looking for a while I found an answer, but it was for the wrong customer, who also did not get their emails. In this case we were too white-boxed: I had access to too much information and was snow-blind. Since I did not know what exact type of failure I was looking for, I did not filter well enough and ended up giving my boss's boss some very weird advice (my direct boss was out; I was basically the only security guy at the office...). After figuring out what I did wrong, I worked together with the support desk employee to find the correct email.
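The lesson, in code form: know what you are looking for before you search. A minimal sketch of narrowing a pile of delivery logs down to one failure class and one recipient, instead of eyeballing everything. The log format here is invented for the example.

```python
import re

# Invented mail-delivery log lines, loosely SMTP-flavoured.
logs = [
    "2024-01-10 smtp deliver rcpt=alice@example.com status=sent",
    "2024-01-10 smtp deliver rcpt=bob@example.com status=bounced reason=mailbox_full",
    "2024-01-11 smtp deliver rcpt=bob@example.com status=deferred reason=greylisted",
    "2024-01-11 smtp deliver rcpt=carol@example.com status=sent",
]

def failures_for(recipient: str, lines: list[str]) -> list[str]:
    """Keep only the non-'sent' lines for one specific recipient."""
    pattern = re.compile(rf"rcpt={re.escape(recipient)} status=(?!sent)")
    return [line for line in lines if pattern.search(line)]

for line in failures_for("bob@example.com", logs):
    print(line)
```

Two lines of filter would have spared me the weird advice: by fixing the recipient and excluding successes up front, the white box shrinks back to a workable grey.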
These examples give us an idea about sanity and light and darkness in thoughts, beliefs and politics as well. If everything is clearly defined, you are likely to get an information overload and miss the important information that you actually need. On the other hand, if your beliefs, thoughts and politics are like a black box running only on intuition, you are likely to completely miss the information you need to actually make an informed decision.
Therefore, in all these situations my view is that you need to be informed about the right things. Know a little bit about agriculture before you legislate on it. Know a little bit about how science works before you put your politics on top of it. No need to get into the details and get yourself a whiteout, but do add some light to your darkness so that you can see under the pleasant light of an overcast day.
I am sure this works in even more areas of life. Stay safe out there!
There is so much more to this one chapter, but it is so good already!
I had to cut it short because guests arrived, but this should get you started on your own study :)
@calvinrempel Thank you once again for the Theology Tuesday you did, I refer back to it in this one :)
@JamesDerian Congratulations on your marriage :)
Next time there might (almost certainly) not be a Theology Tuesday, so the official next one will be February 22nd! I have a wedding to attend. As the groom. Our home is still half a project.
Fun times!
This is the third corner to have persistent discussions and talks in. I love tech, especially once it transcends hardware a little. I have two degrees: a bachelor's in Software Engineering and a master's in Information Security Technology. My graduation thesis focused on assembly-level optimizations (that is, one level above the hardware level) and my free subjects were in formal verification. This is why I love programming in the security corner, or maybe it is the other way around.
I started going down the security path because I saw early on that the world around us would become a dangerous cesspool of badly-implemented and hostile tech. Now I am one of the people who understands the field around that mess :)
So in here you can discuss secure phones, weird programming languages, and sad truths about internet-connected fridges. Also malware, adblockers, and so on and so forth!
A lot of tech talk I do over at the @Lunduke community, where a lot of nerds hang out and it is ...
Much like the reading corner, let's have a music corner! A few rules for this one, since some music can be provocative. I don't mind much, but let's keep YouTube links with risqué thumbnails out of here.
Other music I might also mind. "Do you find that offensive?" someone might ask. Yes, there is some music I choose not to listen to on principle, and I walk a thin line there sometimes. But do not worry, I have a wide taste otherwise, so feel free to share almost anything :)
Either way, here is the music corner!
Many times when we talk about security, we mean to say "digital security". In essence we mean that the hardware and software we use stay safe no matter what we do. And even though the ISO27001 standard (and by extension, for example, the NEN7510 standard) makes it abundantly clear that security is a people-domain problem, we usually take that as a process-like truth. Meaning, we think that being secure is a matter of regulating people.
The truth is very different. For example, while writing this I am pretty shot. I slept five hours and I am under the influence of a bunch of painkillers and some alcohol. Before you ask what I was thinking, let me mention that I have a genetic defect in my spine that I am dealing with right now by taking measured doses of all three (and yes, to get the Bible into this conversation, there is even a biblical ground for the inebriation with alcohol - see Proverbs and the letters to Timothy - although I did not use red wine). But hey, I am still on top of ...