"Organizations are now in a race against time to figure out if they have computers running the vulnerable software that were exposed to the internet. Cybersecurity executives across government and industry are working around the clock on the issue."
""For most of the information technology world, there was no weekend," Rick Holland, chief information security officer at cybersecurity firm Digital Shadows, told CNN. "It was just another long set of days.""
The sysadmin subreddit is also full of professionals talking about the problem.
With so many large scale hacks, 0-days, and breaches happening these days, are cybersecurity professionals ok? Have studies about the mental health and anxiety levels of this group of professionals been conducted?
Personally, it helps that I serve clients rather than having a stake in any one of them. It constantly amazes me the kind of risk a business is willing to accept for barely anything in return, and if I actually had a personal stake in what I am seeing at clients, I think I would be much more stressed.
Also, the issue is not just 'are security professionals ok'. Good security starts with good operations, and good operations are a rarity. We need devs to ship products that can run with least privilege and have secure defaults. We need operators who have a good understanding of their own environment and who design things on purpose rather than just improvising. We need security people who can offer more guidance than just printing out a Nessus scan. We need business analysts who are pragmatic with concessions and who are willing to spend the resources needed to do things right the first time.
The landscape is daunting, with hundreds of apps, servers, networks, employees, etc. in most companies. The tooling is difficult or expensive or both - how many people allow outgoing firewall traffic by default because it is too complicated to whitelist everything that needs to go out? Even with the best will in the world, basic things like Windows updates, SSL cert renewals (no, we can't all use Let's Encrypt), and Linux updates can be a full-time role.
But to be fair, our industry is still quite immature. We do not have the regulatory backup across the globe to assure all products are developed securely etc. We use open-source libraries with barely any checks (and what would we check anyway?) and IT is mostly intangible so how do we even know what we are doing anyway?
This sounds doom and gloom but I think every industry has a Wild West stage that teaches people what is and isn't important and allows industries and products to mature eventually to something that is sustainable.
My organization takes security seriously, but at the end of the day we serve customers who don’t. That’s been the bulk of our issues this year.
log4shell has just been the icing on the cake.
Cybersecurity is SUPER broad though, and there’s a range of many different roles. I’m a bit surprised at all the folks here saying they’re not okay. Best of luck to them, and maybe it’s time for a role swap.
I used to work in a SOC so I get it and probably would be struggling if I was still there.
1. Not budgeted for. 2. Too much downtime (an excuse even on systems that are fully load balanced). 3. "WHHHHHHNNN, but we need that legacy system" (which is real, because if it goes down the whole network does). 4. This doesn't sound critical, let's bring it up next year. 5. Because I (non-techy boss man) said we're not going to do it. As a matter of fact, I am going to sue HIPAA, or PCI, or some regulatory agency for governmental overreach of a proper business.
Am I okay?
On the other hand, my colleagues who hunt for hackers, do forensics, and help customers who are or have been attacked are having the time of their lives. Rarely do you see people as excited about their work as these guys were this weekend.
I feel bad for anyone working in an org that doesn't have the ability to proactively find bugs in their stuff. Lots of places are cheap and don't account for the debt they accrue by not updating their systems. 'If it ain't broke, don't fix it' doesn't apply to software: what you are shipping is always broken. 'Safe' languages are written in unsafe languages. The interpreter has bugs, the VM has bugs, the virtualization stack has bugs, the OS has bugs, the libraries to do everything have bugs. What is there and what is known are both moving targets. If you are not hiring the offensive-minded individuals who will find the bugs with or without your support, then you will not know about the bugs until they are out in the wild. If you aren't willing to pay those people, you are accruing debt that will come due later.
There has been serious underspending by companies for cybersecurity for at least a decade now. Companies are slowly waking up to the fact that the security team can't be less than 1% the size of the development team.
Companies have let developers do whatever they want for so long that when infosec comes in and says we need to change this so we have better visibility into what is being used, or how, it's "Oh, this will hurt productivity, so no".
The shit I have heard because companies don't want to spend money on cybersecurity, since putting out new features is more important than something that "might" happen. They just keep spending more on endpoint security and letting everything inside do whatever it wants.
And why would they? Bad hacks blow over after a year or two. Equifax is still ticking along, and so are Citibank and Capital One. Nobody cares if you get hacked; just pay a fine, give it some time, and things will go back to normal.
Saying "bottleneck" implies an expectation of a better future ahead, but so far there are very few repercussions for neglecting this stuff, so it's unclear whether it'll improve or just get worse.
Large orgs should already have sufficient documentation as to which packages and versions are in use and what systems pulled them from their proxy repo.
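For orgs without that documentation, a rough fallback is to scan disk for jars that bundle the class that made log4j exploitable (removing JndiLookup.class was the widely published mitigation, so its presence is a crude but real indicator). A minimal sketch, not any particular organization's tooling:

```python
import os
import tempfile
import zipfile

# Presence of this class inside a jar is a rough indicator of a
# vulnerable log4j 2.x bundle (the common mitigation was to delete it).
JNDI_CLASS = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def vulnerable_jars(root):
    """Yield paths of jars under `root` that bundle JndiLookup."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith(".jar"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with zipfile.ZipFile(path) as jar:
                    if JNDI_CLASS in jar.namelist():
                        yield path
            except zipfile.BadZipFile:
                pass  # skip corrupt or non-zip files named .jar

# Demo against a synthetic jar; a real scan would point `root` at an app dir.
with tempfile.TemporaryDirectory() as tmp:
    fake = os.path.join(tmp, "log4j-core-2.14.1.jar")
    with zipfile.ZipFile(fake, "w") as jar:
        jar.writestr(JNDI_CLASS, b"")
    print(list(vulnerable_jars(tmp)) == [fake])  # True
```

This misses shaded or nested jars, but it is roughly what a lot of teams were doing by hand that weekend.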
I know I'll see half of y'all online with me :-/
Developers make products: they are indirect profit centers, and, while everyone sees room for improvement, they get treated relatively well.
Conversely, outside of areas like finance, big tech, and government, security teams get starved and ignored as cost centers. Their event-log DBs (SIEMs) are often from 15+ years ago and might even be SQL-based (think MySQL, not BigQuery), if they even have one.
Not fun even before all this - automated attacks with bad support have been going on for years.
https://twitter.com/SwiftOnSecurity/status/14694518560910090...
https://twitter.com/SwiftOnSecurity/status/14703213781550202...
https://twitter.com/SwiftOnSecurity/status/14703215750145802...
https://twitter.com/SwiftOnSecurity/status/14704312736342179...
https://twitter.com/SwiftOnSecurity/status/14707958604349030...
FUD, bullshit, lack of skilled people, lack of budget, lack of understanding from adjacent departments, chaos, mayhem, overtime, incidents, and a creeping "I'm not sure what's going on" have always been parts of the profession. Learning to accept frustration, constant change, ill-formed perception, and rejection is part of your career choice and a selection factor in the long term. Learning to look at the world from a certain angle that is hard to unlearn (especially if you're good at it) is the mental equivalent of a firefighter's calluses.
If you can bear with it all, being on the defensive side and being a kind of digital first responder (regardless of where exactly you are in the industry) is a fun job, and a calling for some.
(Edits: Typos)
Now, I write software for a niche Information Security team at a Fortune 250. I am still impacted by these major events, but it is usually in the form of "can we add X detection for Y vulnerability". The work is challenging/enjoyable but I am still able to maintain a good work/life balance.
Yet here we are, talking about burnout with more than a few people ITT _still_ talking about teams being understaffed.
Meanwhile, there is no shortage of people obtaining credentials and education in cybersecurity who have nowhere to go, because many parties feel these are effectively worthless. So what's the answer when the onus of training has shifted to workers, but none of the major players want to develop these people?
This has me thinking no one cares anymore about the stack of software being used. It’s almost sad to think about how it is going to get worse with the insane dependencies software comes with nowadays.
I’m thinking there’s a business opportunity for someone to curate a list of software and its dependencies, and provide a subscription service for vulnerability alerts on the stack of said software.
PS: Looks like Vuldb [0] is halfway there, but I wonder if they alert based on dependencies, or even on the version of said application/product.
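The core of such an alerting service can be sketched in a few lines: match an inventory of name/version pairs against a feed of affected version ranges. The feed contents and range scheme below are made up for illustration:

```python
# Hypothetical vulnerability feed:
# package name -> list of (first affected, first fixed) version tuples.
VULN_FEED = {
    "log4j-core": [((2, 0), (2, 17))],
    "openssl": [((1, 0), (1, 1))],
}

def parse(version):
    """Turn '2.14.1' into (2, 14, 1) so tuple comparison orders versions."""
    return tuple(int(part) for part in version.split("."))

def alerts(inventory):
    """Yield (name, version) pairs whose version falls in an affected range."""
    for name, version in inventory.items():
        v = parse(version)
        for first_bad, first_fixed in VULN_FEED.get(name, []):
            if first_bad <= v < first_fixed:
                yield name, version

inventory = {"log4j-core": "2.14.1", "openssl": "1.1.1", "nginx": "1.21.4"}
print(sorted(alerts(inventory)))  # [('log4j-core', '2.14.1')]
```

The hard part of the business is not this loop; it is keeping the feed accurate and knowing what each customer actually runs, including transitive dependencies.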
At my previous job, everything fell on deaf ears and I wasn't given access to the things I needed to make sure things were secure. I couldn't access the Jira boards for the projects I was supposed to secure to create tickets for security remediations, and my issue reports were ignored.
Where I'm at now is better. I can create the tickets, but they tend to get ignored until I tag the PM for them to get assigned, after which they typically get fixed pretty quickly.
Reminder that this exists: https://paulbellamy.com/vulnerability-name-generator
At some point we need to stop the coddling. Sink or swim people.
¯\_(ツ)_/¯
As an industry? We are getting better. Better recognition of neurodiversity, and of each other's needs and capabilities. Better at calling out the bad actors who can be poison pills in our teams, companies, and communities. Better at mentoring folks who are coming up in the industry.
None of that changes the disastrous toll that the security industry can take on individuals. Incident responders and investigators often have to deal with many of the same psychological issues that police officers do (exposure to CSAM, graphic imagery, deep knowledge of cybercrime and its relationship to real-world crime, including human trafficking, drug trafficking, etc.). Add to that the stress of working in sensitive roles where you can't share or talk about things, and the fact that many colleagues (at least in Canada and the United States) have access only to work-approved "counselling services" rather than actual mental health professionals with a direct clinician-patient relationship.
Many IT security folks work in teams that are under-resourced, function as "teams" of one, and are often expected to fulfill the roles of SRE, IT support, architecture, engineering, and compliance for any technologies or tools they use to do their jobs, in addition to actually doing their jobs.
It gets to be a bit much at times :)
> Have studies about the mental health and anxiety levels of this group of professionals been conducted?
I don't know if there are any formal, rigorous, academic studies, but mental health in the cybersecurity space has been a long discussion, with the earliest significant discussions I remember being back in 2008-2009 at some conferences.
Depending on the role and the organization, it can be a high-stress job. I often tell a story at talks and conferences about my 72-hour work day, which was my first incident-response activity early in my career. It involved life-critical and personal-safety systems affecting thousands of people, and I ended up working with the team for literally 3 days straight with no sleep, because no one else on our team had the perspective on how to address the issues. There was serious physical fallout, as well as mental issues (I developed aphasia for a short period of time, amongst other things). Many of the folks I know in the industry have similar stories (though not as grim as a full 72 hours), and that's setting aside the toll of building a career on telling people that there are big issues, and having to be helpful and constructive instead of saying "I told you so..." when things go poorly.
Yes. Risk is the primary part of the job, and the job is lucrative.
In other words, it's business as usual.
The major contributing factors for me:
- Reactive panic instead of proactive strategy
- - Detail: The suits in most economic sectors have zero interest in investing in security and best practice ahead of time, preferring to sell buzzwords to customers, and engaging in actual security only reactively. This leads to lots of scrambling and working weekends for cybersecurity professionals.
- - Example: I can't tell you how many times I heard companies say "AWS handles our security." This is virtually never true, and the realization of the fallacy of this idea led to many panicked board meetings and embarrassing disclosures.
- Budget by buzzword
- - Detail: The latest cyber machine learning web 2.0 agile next gen thing will always get the funding, while tried and true practices like a good NIDS/HIPS pairing and a robust incident response process are relegated to the budget back seat
- - Example: Much of my time was for a government agency, and it was consistently easier to get funding for security hardware than for people. As a result, we often ended up with amazing, sexy, buzzword-covered hardware, and absolutely nobody (or nobody trained) to actually monitor it or analyze the data it produced.
- Ambulance-chasing parasites
- - Detail: Name a major cybersecurity incident that hit the public consciousness between 2015 and 2020, and I can name at least a dozen startups that cold called me over the next three months offering to address the problem for our organization, and prevent it the next time. These ambulance-chasing companies offer very little real value, and often drain resources from real improvements in an organization's cybersecurity posture.
- - Example: WannaCry/EternalBlue. I received months and months of aggressive marketing promising to fix this across our org, to the point that dodging the spam calls actually interfered with the on-the-ground work of patching endpoints and monitoring for IoCs.
This was my own experience, and may well not be representative of what others have gone through... but after years of 2am NOC calls, all-weekenders, being treated with disdain by other disciplines, and watching org funds consistently get flushed away on snake oil, I threw in the towel and switched industries. I took two decades of IT experience with me when I left. Now I'm paid better, and I'm in a much better place health-wise.
As a client platform engineer for the last couple years, patch management / versioning is one of my primary job duties. Are there cybersecurity “professionals” out there enjoying hefty pay raises for doing a job someone at their organization is doing for them?
Perhaps I should change my job title to “Cybersecurity Analyst” and get an extra $30k per year for the same thing I’m already doing.
My stock answer was "yeah, well, it depends on the day: On a good day, you're saying the same thing to new people. On a bad day, you're saying the same thing to the same people."
Those repeat conversations were often after an incident. IF you can report to an interested and motivated executive with pull and credibility, and IF the organization has risk governance and understands its risk tolerance, and IF people understand the differences in various degrees of injury, and IF people understand the differences between mission-critical and mandate-vital, OR people are at least willing to learn what these things are, AND are willing to put in place the structures that support analysis and remediation, continually, forever, THEN, yes, it can be very satisfying.

(I taught for CSE for a few years, from beginner-level intros to in-depth multi-day courses for SMEs. I almost always made sure to include in every deck the one slide that showed the relationship between law, enabling legislation, central policy, departmental policy, governance, and risk tolerance and management. I'd say something along the lines of "if your department/agency/organization/unit does NOT have this, then the first document you should prepare is your CV...")

As for me, I'm doing OK, because I joined a cyber startup 1.5 years ago. Our hardware product is a unidirectional gateway (think diode with extra management capabilities on either side), and our software product is a wicked cool (if I do say so myself) IRM/GRC/compliance-management engine with integrated visualization and workflow, so I get to do really cool stuff all the time. And fortunately, we aren't subject to this one, so no one lost their weekend. Well, not because of this.

But only OK, because a) plague, b) some of the lovely bonus situations that come with age (related to both one's health and the health of one's loved ones), and c) startup, with all that implies, especially given A and B. Fortunately, I have great colleagues and a fantastic boss, so that makes it easier, but, still, it's a lot.
Mid-level managers aren't all that helpful, since they have little understanding of our job. We are understaffed and underfunded.
High-level managers and executives keep pushing more unreasonable objectives, targeting nonsensical numbers such as 40% operational margin and double-digit growth.
HR is more interested in us adhering to inclusive language at work and whatnot than in our mental well-being and productivity.
Our company is threatening all employees over compliance with the vaccine mandate in the US (for sure some employees will leave, causing further understaffing).
The number of attacks is on the rise, probably because more people stuck at home are turning to cybercrime.
There's more politics at work, with more junior engineers playing the politics game to shine rather than slowly sharpening their technical skills.
More of the engineers who do most of the work have started to give up: simply logging in, attending meetings, pretending to do all they can, but in fact doing the minimum they can get away with. Hence more pressure on those who haven't given up yet.
Cyber attacks have always been on the rise, and they will continue to rise as we continue to digitalise our lives. But over the last couple of years, I've seen a drastic descent into mismanagement and increased pressure on employees in order to increase profit, which has led to the opposite result in absolute productivity.
I only expect things to get worse before they get better. Someone daring to make these remarks at the workplace could easily lose their job; at the very least, they would be ostracised by management for spreading negativity and exaggerations.
I don't think this is only happening in the cybersecurity industry, some people I know even outside of tech are telling me they witness similar trends.
Heterogeneous tech stacks have benefits and drawbacks, and we're seeing the drawbacks play out in real time as increased risk.
First, we can work remote. Compare that to a fast food worker who has to risk exposure to feed their family.
Second, if you spend your money right you can say f it and leave any job that's too rough.
I was actually thinking of taking 6 months off, until I realized travel is still restricted
We have a ton of software written in unsafe languages (C and C++): our operating systems, web browsers, email readers, file editors, etc. Our governments and cyber-criminals have stockpiled 0-day exploits against these unsafe systems. They have hundreds or maybe thousands of these exploits.
On top of that, businesses want features added to user-facing apps as fast as possible (beat the competition). These apps sit atop the unsafe C operating systems and oftentimes run as root or Administrator. These apps have many logic flaws. Governments and criminals have stockpiled 0-days against these flaws as well.
So until our systems and apps are rewritten in safe languages (Go, Rust, C#, Java), most processes run in an unprivileged context, and mandatory access control (MAC) is applied to all running processes (all the time), we will continue to be hacked in ever-growing catastrophic ways.
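The "safe language" point in miniature (Python standing in here for the Go/Rust/C#/Java style of safety): an out-of-bounds access becomes a catchable, contained error instead of silently reading or corrupting adjacent memory. The function name and data are made up for illustration:

```python
def read_field(record, index):
    """Bounds-checked access: the runtime raises instead of reading stray memory."""
    try:
        return record[index]
    except IndexError:
        return None  # the fault is contained; nothing is silently corrupted

packet = [0x17, 0x03, 0x01]
print(read_field(packet, 1))    # 3
print(read_field(packet, 999))  # None; in C, this could read adjacent memory
```

Memory safety doesn't fix the logic flaws the comment also mentions, but it removes the whole class of bugs behind most of the stockpiled exploits.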
IMO, this may be the explanation to the Fermi paradox. The aliens did themselves in with insecure software.