Creating a New Cybersecurity ‘Norm’
In my previous post, I wrote about COVID-19-related cybersecurity challenges and adjustments. Most of the immediate changes were tactical in nature, made to handle the situation at hand. However, COVID-19 also brought behavioral changes in the way we work and consume technology. Companies now understand that working from home is possible and better understand its limitations and benefits. Consumers now access services remotely – shopping, banking, healthcare, to name a few – and they will come to expect their services to be remote.
Now that the “new normal” has arrived, companies are adapting their offerings and network capacity and making sure that the services they provide are secure. Some industries will move to be completely digital, while others will merely increase the remote portion of their overall hybrid service approach.
It is essential to stop and think about how these adjustments affect an organization’s cybersecurity processes and strategy. The behavioral changes added many blank pages to the business playbooks. It is now critical to adjust security processes accordingly – a good process allows people to optimize the technology and use it the right way while keeping the organization secure.
One of the major changes in recent months was cloud transformation – many organizations experienced the cloud like never before and grew accustomed to it. They discovered more of the cloud’s advantages and accelerated their cloud transformation. Cloud migration and hybrid cloud adoption, which in the past were perceived as long processes taking many months or years, were compressed into mere weeks and months to meet the demand for resources.
The main challenge in the cloud is verifying configuration and monitoring activity. Each cloud vendor offers its own options for configuration and monitoring, so it is preferable to have a single, convenient place to check the security configuration, monitor activity, and take action when needed. A good cloud security system should cover the following aspects:
- Identity verification – the system should monitor for credential exfiltration attempts, credential stuffing, and similar attacks.
- Least amount of access – the system should check permissions per user, user group, and user role, and alert on excess privileges and privileges that go unused.
- Micro-segmentation and access – the system should monitor access between segments. Beyond the internal segments, it should monitor access to assets outside of the cloud and to cloud-native assets that are not part of the defined user segments.
- API monitoring – APIs are a big part of any cloud deployment, and the system should track their access and usage. The system should learn API permissions and usage patterns and alert on anomalies.
- Behavior analysis – with many configurations, users, and workloads spread across the cloud, the right security system should learn the typical behavior of each part of the system and alert on changes in that behavior.
- What to track – the system should track all of the above for users, machines, storage, and FaaS (Lambda, etc.).
- How to alert – with all of the above monitoring, the volume of logs and alerts is vast. The system should report a clear attack story that users can investigate and act upon, and provide an easy automatic response mechanism.
- Multi-cloud – organizations use multiple cloud vendors these days, whether by choice or by evolution. With each vendor offering different interfaces, the system should work across multiple clouds and provide a single pane of glass for monitoring and one familiar interface for all of them.
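The least-privilege check described above can be sketched as a simple audit that compares granted permissions against the permissions actually exercised in recent activity logs. The permission names and log format below are illustrative assumptions, not tied to any specific cloud vendor’s API:

```python
# Minimal least-privilege audit sketch: flag granted permissions that
# never appear in recent activity logs. Permission names and the log
# format are hypothetical examples.

def unused_privileges(granted, activity_log):
    """Return, per user, the granted permissions never seen in the log."""
    used = {}
    for user, permission in activity_log:
        used.setdefault(user, set()).add(permission)
    report = {}
    for user, perms in granted.items():
        extra = set(perms) - used.get(user, set())
        if extra:
            report[user] = sorted(extra)
    return report

granted = {
    "alice": {"s3:read", "s3:write", "iam:admin"},
    "bob": {"s3:read"},
}
log = [("alice", "s3:read"), ("alice", "s3:write"), ("bob", "s3:read")]

print(unused_privileges(granted, log))  # {'alice': ['iam:admin']}
```

In a real deployment, the “used” set would be built from the vendor’s audit trail over a trailing window, and flagged privileges would feed an alert or a permission-revocation workflow rather than a printout.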
Working from home raises the importance of a zero-trust model in the organization. When allowing remote access to the organization’s network, zero-trust becomes critical and must be part of the organization’s strategy. The model includes:
- Identity verification and MFA – every person and device connecting to the network should prove their identity and get permission to connect. Commercial (or open-source) identity services should be used, with multi-factor authentication to further verify the identity.
- Micro-segmentation – when the organization allows remote access, it is important to control who can access what. Data should be analyzed for value and importance, and high-value data should be kept in its own segment with well-defined access rules. Networks should be analyzed and segmented, and access to sensitive machines should be controlled.
- Least amount of access – each user should get the minimum set of permissions they need.
- Monitoring – access to sensitive networks and data should be monitored, and alerts should be raised and handled once unauthorized access is detected.
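The components above compose into a deny-by-default access decision: a request passes only if the identity is verified (e.g., via MFA), the requester’s segment is allowed to reach the resource’s segment, and the user holds an explicit permission for the action. The segment names, rules, and permission strings in this sketch are hypothetical:

```python
# Deny-by-default access check sketch for a zero-trust model.
# Segments, rules, and permission names are illustrative assumptions.

SEGMENT_RULES = {                  # allowed (from_segment, to_segment) pairs
    ("corp-vpn", "app"),
    ("app", "sensitive-data"),
}
PERMISSIONS = {"alice": {"read:patient-records"}}

def allow(user, verified_mfa, user_segment, resource_segment, action):
    if not verified_mfa:                                   # identity first
        return False
    if (user_segment, resource_segment) not in SEGMENT_RULES:
        return False                                       # micro-segmentation
    return action in PERMISSIONS.get(user, set())          # least privilege

print(allow("alice", True, "app", "sensitive-data", "read:patient-records"))   # True
print(allow("alice", False, "app", "sensitive-data", "read:patient-records"))  # False
print(allow("bob", True, "corp-vpn", "sensitive-data", "read:patient-records"))  # False
```

The key design choice is that every check must pass and anything not explicitly allowed is denied – the opposite of the perimeter model, where anything inside the network is implicitly trusted.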
Since cloud deployments are usually newer, the cloud is an easy place to begin rolling out the zero-trust model. Zero-trust works in the cloud just as it does on-prem, with the aspects mentioned above.
Attack Surface for Remote Services
With interactions happening remotely, it is critical to give more attention to the attack surface of the remote applications.
When organizations initially developed remote services, those services were a small portion of the overall offering and, in many cases, were considered a backup plan. As a result, developers didn’t always have security in mind. Now that remote applications are the foundation of the customer experience and their usage is increasing, organizations must revisit their security measures.
Take telemedicine, for example. In the past months, as telemedicine usage boomed, so did its value to attackers. Telemedicine applications contain personally identifiable information (PII), medical history, private images, and more – a treasure trove for cybercriminals. Telemedicine providers, like operators of any other remote application, are now investing in their security.
What should a security assessment include?
- DDoS protection
- Application protection
- Data protection – encryption and exfiltration monitoring, for data both at rest and in motion
- Bot protection
- API protection
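A basic building block behind both the DDoS and bot protections in the checklist is per-client rate limiting. The sketch below is a minimal token-bucket limiter: each client earns `rate` tokens per second up to a burst ceiling, and requests without a token are dropped or challenged. The thresholds and client keys are illustrative:

```python
import time

# Minimal per-client token-bucket rate limiter sketch, a building block
# for DDoS and bot mitigation. Rates and burst sizes are illustrative.

class RateLimiter:
    def __init__(self, rate=5.0, burst=10):
        self.rate, self.burst = rate, burst
        self.buckets = {}                    # client -> (tokens, last_ts)

    def allow(self, client, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(client, (self.burst, now))
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens < 1:
            self.buckets[client] = (tokens, now)
            return False                     # drop or challenge the request
        self.buckets[client] = (tokens - 1, now)
        return True

limiter = RateLimiter(rate=1.0, burst=2)
print([limiter.allow("scraper", now=0.0) for _ in range(3)])  # [True, True, False]
```

Production systems layer far more on top of this (client fingerprinting, CAPTCHA challenges, upstream scrubbing), but the token bucket illustrates the core idea of bounding request rates per source.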
Any organization must make sure its primary sources of revenue are well covered to avoid downtime and breaches.
Security Technology Consumption Changes
Companies adjusting to remote work must also adjust their security portfolios. Most companies scale with their existing vendors, as the technology is already proven and easier to use – known tools with established processes.
However, security products are known for their complexity, and while remote training on the technology is possible, it is harder; users must work without an expert standing next to them, explaining every button and option. The changing economy might also cause a budget shortage – with less budget to hire, fewer people will be available to operate and manage security solutions.
As a result, companies need security vendors who provide managed services to ensure that their deployed security technology is optimized to their needs. Most security systems need occasional adjustments as best practices change, and as organizations undergo changes. A managed service is a win-win solution, as it allows experts to manage technology while allowing companies to focus on their core business.
As we learn to live with the pandemic, companies are adjusting to the new norm. Now is the time to increase the attention paid to security technology. Securing remote access to work and securing infrastructure – both existing and new, on-prem and in the cloud – is the new priority and is critical to ensuring service availability and business continuity.