So many security controls. So much noise. It is a wonder we find anything relevant at all. The amount of time spent combing through log after log is staggering. Even with SIEMs, dashboards, machine learning, and “AI,” there is still a mountain of logs to go through. Yes, we can whittle things down once we know what we are looking at or looking for. The problem is that we have no good way of determining whether we are missing something, even with machines helping us.
Example: you have product X, which compares the behavior of files it knows against files it does not know, using the Tactics, Techniques, and Procedures it is aware of, to determine what to do about a given file. This, combined with known hashes of known malware, assigns the action a score. The problem: even using known files this way produces a lot of false positives, although their scores may be lower. Example: Outlook uses a known third-party plugin to encrypt and decrypt e-mail. These events constantly show up as monitored low-level alerts because of the memory calls involved in invoking the plugin. Problem: the same technique can be used by malicious software or scripts that use Outlook as the vector in, through a vulnerability triggered by a malicious e-mail. Now that attack is lost in a sea of false positives. Depending on the product, you may be able to tune out only the false positives, but not always. The tuning can become too broad: suppress the “Outlook invoking another program” alert as expected behavior, and you lose the alerting on the real attack along with the noise, or you stop getting these alerts at all.
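To make that tuning trade-off concrete, here is a minimal sketch in plain Python (not any vendor’s actual rule syntax; the process names, plugin name, fields, and scores are all hypothetical) showing how a suppression rule keyed only on the parent process hides the attack, while one keyed to the known plugin keeps it visible:

```python
# Hypothetical alert stream: Outlook loading its known encryption plugin
# (benign noise) plus Outlook spawning a shell (the real attack).
ALERTS = [
    {"parent": "outlook.exe", "child": "encplugin.dll",  "score": 20},  # benign plugin load
    {"parent": "outlook.exe", "child": "encplugin.dll",  "score": 20},  # benign plugin load
    {"parent": "outlook.exe", "child": "powershell.exe", "score": 35},  # the real attack
]

# Too-broad tuning: suppress anything Outlook invokes.
def broad_filter(alert):
    return alert["parent"] != "outlook.exe"

# Narrower tuning: suppress only the known, expected plugin.
KNOWN_GOOD_CHILDREN = {"encplugin.dll"}

def narrow_filter(alert):
    return not (alert["parent"] == "outlook.exe"
                and alert["child"] in KNOWN_GOOD_CHILDREN)

print([a for a in ALERTS if broad_filter(a)])   # [] -- the attack is suppressed too
print([a for a in ALERTS if narrow_filter(a)])  # keeps only the powershell.exe alert
```

The catch, as noted above, is that some products only expose the broad knob, so the narrow rule is not always an option.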
So what is my point? Products, especially security products, at times either generate too much data without enough control, or generate too little data behind a huge amount of complexity needed to open the data up. You either get flooded out or starve for information. We then spend more time tuning these products to our environment than we probably should have to. Part of the problem is the fast pace of updates and upgrades, which causes hash changes and sometimes behavior changes. Part of it seems to be the rush to get the product to market. Speed kills, in so many different ways, to the point of burning us out.

Computer intelligence is getting better, but we still need plenty of eyes on the issues. We need more “entry level” analysts and SOC workers to go through the data and tune the flow. Instead, companies focus on hiring higher-level engineers and architects. Everyone wants to jump up fast, and that leaves the higher-level positions doing jobs they don’t want to do or should not have to do. This leads to job hopping, then burnout, and then more openings. It is a vicious circle that needs to be broken. Maybe we need to start with less noise, and not increase the number of products until we have the resources, or until the products we already use are tuned properly.
What are your thoughts?