Tag Archives: recession

[ISN] Banks’ Concerns About Cyberthreats Grow

http://www.bankinfosecurity.com/banks-concerns-about-cyberthreats-grow-a-7486 By Tracy Kitten Bank Info Security October 28, 2014 Banking leaders say they’re substantially more concerned today than they were just six months ago about cyber-attacks and geopolitical threats aimed at the global financial system. That’s according to a report covering results of a survey conducted during the third quarter and published last week by the Depository Trust & Clearing Corp. The DTCC provides clearing and settlement services for banking institutions. Participants in the survey included financial stakeholders from throughout the world. Since March, when the DTCC last conducted its Systemic Risk Barometer survey, more global banking leaders say they see ongoing cyber-risks as posing increasing concern. They rate cyberthreats as the No. 1 systemic risk facing the global economy today. Banking institutions and other financial services firms surveyed by the DTCC say that in the past 12 months, they have increased their investments in systems and technologies designed to monitor and mitigate systemic risks, such as cyber-attacks and economic recessions that could collapse the global financial system. […]


Vulnerability Management in the cloud

Vulnerability Management - Source (ISACA.org)

While there are different stories about what cloud computing “is”, there is one specific direction in which virtualization is headed that could bring some additional problems for the security industry. One issue I want to focus on is vulnerability management and how it is implemented in a cloud environment. Many customers face the need to scan their cloud, but are unable to do so.

Virtualization providers have been pushing their customers and hosting providers to adopt new infrastructure that automates the distribution of CPU processing time for their applications across multiple condensed hardware devices. This concept was originally conceived as “grid computing”, which was created to address the limits of processing power in single-CPU systems. The new wave of virtualization technology is meant to automatically distribute processing time to maximize hardware utilization for reduced CapEx (capital expenditures) and ongoing support costs. VMware’s Cloud Director is a good example of the direction virtualization is going and of how the definition of “cloud computing” is changing. Virtualized systems are quickly being condensed into combined multi-CPU appliances that integrate the network, application and storage systems together for more harmonious and efficient IT operations.

The vulnerability management problem:

While cloud management is definitely becoming much more robust, one issue that is apparent for cloud providers is managing the vulnerabilities inside a particular customer’s cloud. In a distributed environment, if the allocation of systems changes by adding or removing virtual systems/instances from your cloud, you quickly face the fact that you may not be scanning the correct system for its vulnerabilities. This is especially important in environments that are shared across different customers. Since most vulnerability management products use CIDR blocks or CMDB databases to define the scanning profile, you could easily end up scanning an adjacent customer’s system and hitting their environment with scans, due either to a lag between CMDB updates or to static definitions of the scan network address space.
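To make the problem concrete, here is a minimal sketch of how a statically defined CIDR scan scope drifts away from the cloud’s actual allocation. All addresses, names and counts are illustrative assumptions, not taken from any real scanner or provider:

```python
import ipaddress

# Hypothetical data: a static scan scope defined as a CIDR block inside the
# vulnerability scanner, and the cloud's *current* allocation of addresses
# to this customer after instances have been added and removed.
static_scan_scope = [ipaddress.ip_network("10.0.8.0/24")]

current_customer_assets = {
    "10.0.8.10", "10.0.8.11", "10.0.8.12",   # still assigned to us
}

# Every address inside the static scope that the cloud no longer assigns
# to this customer is a potential scan of an adjacent tenant's system.
stale_targets = [
    str(host)
    for net in static_scan_scope
    for host in net.hosts()
    if str(host) not in current_customer_assets
]

print(len(stale_targets))  # 251 addresses we would scan but should not
```

The gap between the static definition and the live inventory is exactly where cross-tenant scanning happens.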

The vulnerability management cloud solution:

My belief is that this vulnerability management problem will be addressed by integrating and sharing asset information between the cloud and vulnerability scanning services. Cloud providers will more than likely need to provide application programming interfaces (APIs) that allow the scan engines/management consoles to read in current asset or deployment information from the cloud and then dynamically update the IP address information before scans commence.
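A minimal sketch of that integration, assuming a hypothetical inventory API on the provider side (the function names, fields and data below are stand-ins, not any real provider’s API):

```python
# Before each scan run, pull the current asset list from the cloud
# provider's (hypothetical) inventory API and rebuild the scanner's target
# list, rather than relying on static CIDR blocks or a stale CMDB export.

def fetch_cloud_inventory(customer_id):
    """Stand-in for a cloud provider API call; returns the instances the
    cloud currently attributes to this customer. In practice this would be
    an authenticated HTTPS request to the provider's inventory endpoint."""
    return [
        {"instance_id": "vm-101", "ip": "10.0.8.10", "state": "running"},
        {"instance_id": "vm-102", "ip": "10.0.8.11", "state": "running"},
        {"instance_id": "vm-103", "ip": "10.0.8.12", "state": "stopped"},
    ]

def build_scan_targets(customer_id):
    """Scan only instances that currently belong to this customer and are
    actually running, so the scope tracks the live deployment."""
    inventory = fetch_cloud_inventory(customer_id)
    return [asset["ip"] for asset in inventory if asset["state"] == "running"]

targets = build_scan_targets("customer-a")
print(targets)  # ['10.0.8.10', '10.0.8.11']
```

The key design point is that the scan scope is derived from the cloud’s view of the world at scan time, never stored statically in the scanner.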

Furthermore, I feel that applications such as web, FTP and databases will be increasingly distributed across these same virtualized environments and will automatically integrate with load distribution systems (load balancers) to ensure delivery of the application no matter where it moves inside the cloud. The first signs of this trend are already apparent in the VN-Link functionality released as part of the Unified Computing System from Cisco; however, adoption has been slow due to legacy and capital deployments on account of the world’s market recession. This may even lead to multiple customer applications being processed or running on the same virtual host with different TCP/UDP port numbers.
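If tenants do end up sharing a virtual host on different ports, scan scope would have to be defined per (address, port) pair rather than per address. A small illustration with made-up port assignments:

```python
# Hypothetical ownership map: two customers' applications on the same
# virtual host, separated only by TCP port. All values are illustrative.
port_ownership = {
    ("10.0.8.20", 8080): "customer-a",
    ("10.0.8.20", 8443): "customer-b",  # same host, adjacent tenant
}

def ports_to_scan(host, customer):
    """Return only the ports on this host that belong to the given
    customer, so a scan never probes an adjacent tenant's service."""
    return [port for (ip, port), owner in port_ownership.items()
            if ip == host and owner == customer]

print(ports_to_scan("10.0.8.20", "customer-a"))  # [8080]
```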

This information would also need to flow down to the reporting and ticketing functionality of the vulnerability management suite, so that reports and tickets are dynamically generated using the most up-to-date information and no adjacent customer’s data leaks into your reports or your ticketing system for managing remediation efforts. Please let me know your thoughts….
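That last filtering step can be sketched in a few lines: check every raw finding against the customer’s current asset list before it reaches a report or ticket. The findings and addresses below are invented for illustration:

```python
# Filter raw scan findings against the customer's current asset list so
# that findings for hosts belonging to an adjacent tenant never reach this
# customer's reports or remediation tickets.
current_assets = {"10.0.8.10", "10.0.8.11"}

raw_findings = [
    {"ip": "10.0.8.10", "plugin": "CVE-2014-0160", "severity": "high"},
    {"ip": "10.0.8.50", "plugin": "CVE-2014-6271", "severity": "high"},  # adjacent tenant
]

report = [f for f in raw_findings if f["ip"] in current_assets]
print(len(report))  # 1 finding survives the ownership filter
```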