I recently helped CSO magazine with an article on file security. The author, Maria, was impressive - she thought carefully about how enterprises can protect data without hassling employees, and derived a lot of interesting points.
It is a good piece and is worth checking out.
Wednesday, August 26, 2015
Wednesday, August 12, 2015
Security and IoT: work to be done
A few weeks ago I was lucky enough to moderate a lively
discussion with Chris Eng of Veracode and Josh Corman of Sonatype. I have done
a bunch of these things, and this one, organized by Dark Reading as "The Internet of Things, the Software Supply Chain and Cybersecurity," was one of the most timely, informative, and memorable. Chris ("security faults are product defects!") and Josh ("I
am the cavalry") are super passionate about IoT security - you should
definitely spend an hour to check it out.
IoT security has gone mainstream in the press, with the story of a hijacked Corvette being just one of many recent examples. We know that smart devices are seldom built
with security in mind, and we trust that the controllers in the cloud are secure,
but the truth is pretty scary.
The percentage of open source code in IoT devices is staggering.
You'll have to watch the video or read the whitepapers on Veracode or Sonatype's
web sites for the numbers.
- The first problem is that device makers often don't know what vulnerabilities they are inheriting from the open source code they build into their products. There are some interesting legislative ideas being chatted up, but really it is up to us security experts to help educate and offer solutions.
- The second problem is that these devices seldom have a patch mechanism to fix security defects. So once an exploit is discovered, like being able to remote control a car, there is no pragmatic or efficient process for correcting the fault. A secure patch mechanism has to be mandatory!
- The third problem is that the cloud applications that control IoT devices are usually put into production without a rigorous review by experienced security researchers. This is the cyber-jackpot for IoT - hacking a device is one thing, but hacking the controller application in the cloud gives unauthorized access to all the devices. Yikes!
IoT security is a problem that extends all down the supply
chain, and has the potential to affect everyone's daily life. It is a big deal
and time for the security industry to treat it as a strategic initiative.
Friday, July 24, 2015
Security has to hustle to catch the SDN train before it leaves the station
Security has a problem with Software Defined Networking.
Organizations are embracing SDN for its adaptability to business needs, lower
acquisition costs, and potentially lower operating costs. However, there is
insufficient practical experience to guide the security industry in adequately
supporting SDN infrastructures. This results in either organizations moving
forward with SDN without waiting for security to catch up or organizations
moving forward at greater expense by shoe-horning traditional security
capabilities into SDN architectures. We feel that the time has never been
better for new network security upstarts to challenge the status quo.
The Ogren Group View
It seemed like every other keynote presentation at RSA
Conference 2015 pointed out that security has failed miserably; however, it was
terribly difficult to find compelling ideas from these industry leaders as
to how to fix the security problems. Much
of the discussion focused on the existence of security product silos that do
not interact effectively with each other, and on the need for organizations to try
harder.
It is our take that many of the traditional network security
silos are obsolete and over-valued, and that the network security industry
desperately needs to get ahead of the curve by adopting the principles of software
defined networking architectures. Traditional inspection and rigid perimeter
concepts will be even more ineffective in cloud-driven SDN architectures than
they are today.
The security vendors that break the mold of traditional
security, with a particular emphasis on detection and incident response/automated
remediation, will have significant security impact in the SDN world.
Opportunities
With challenges come opportunities for security vendors.
Security is typically a reactive industry that often gets called out for battling
attackers with defenses designed for the last cyber-war. We believe it is very
clear that traditional security products will struggle to be effective in SDN
environments.
SDN, with its ability to efficiently reconfigure the
network, is a disruptive approach that requires security vendors to step up
with innovative solutions to remain relevant. We find significant potential in
companies such as Cyphort, Exabeam, Fortscale, LightCyber, TaaSera, vArmour,
Vectra, and Zscaler that offer many of the characteristics of successful
SDN-oriented security companies:
Adapt with changes in the network infrastructure. Static,
monolithic products will have to become agile and flexible. For instance, an IPS
cannot expect to be on every data path, and placing an IPS inside every virtual
server makes little sense. Many of the content inspection security
capabilities, such as IPS and DLP, will have to become software defined
themselves to deliver benefits to the business.
Empower analytical detection capabilities. It will become
increasingly difficult for security teams to detect the presence of attacks as
resources are automatically provisioned, upgraded, put in motion, and expired. While
organizations understand that security cannot block every attack, they do need
more help detecting attacks living within their networks. We feel that there is
room to grow for analytical and behavioral approaches that can be customized to
detect faults within complex networks.
Accelerate incident response and incident remediation. Just
as SDN offers the capability to adapt to business performance demands, so too
should SDN adapt to security incident demands. Security seems to be one of the
few industries that is excused when its products fail - organizations will seek
out vendors that make headway automating incident response and remediation for
attacks that evade security.
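To make the analytical detection idea concrete, here is a minimal, hypothetical sketch of behavioral scoring - not how any of the named vendors actually work. The function name, metric, and threshold are all invented for illustration: learn a host's normal level for some metric (say, outbound bytes per hour), then flag statistical outliers.

```python
from statistics import mean, stdev

def zscore_alerts(baseline, observed, threshold=3.0):
    """Flag observations that deviate sharply from a host's learned baseline.

    baseline: historical samples of a metric (e.g. outbound MB/hour)
    observed: new samples to score
    Returns the samples whose z-score exceeds the threshold.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in observed if sigma and abs(x - mu) / sigma > threshold]

# A host that normally sends ~100 MB/hour suddenly ships out 900 MB.
normal = [95, 102, 98, 110, 105, 99, 101, 97]
print(zscore_alerts(normal, [103, 900, 99]))  # -> [900]
```

Real products layer far richer models (peer-group comparison, time-of-day seasonality) on top of this basic idea, which is also where the false positives the behavioral approach inevitably generates come from.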
Challenges
In many ways SDN is the antithesis of traditional security
concepts. The SDN approach of virtualizing control planes and data planes for a
flexible network that can adapt at the speed of business presents problems for
security vendors schooled in rigid controls and content inspections.
One large challenge is aligning security with the adaptive
nature of SDN. Software defined networks promise to dynamically shift resources
to meet business demands, even if those resources lie off premise in the cloud.
Security needs to adapt with shifting applications and network resources to
ensure acceptable coverage, prevention, detection, and remediation
capabilities.
While security prefers to inspect all content and log
everything for subsequent investigations, this comes at the price of
performance degradation if done inline. That is why most IPSes and data
collectors hang off switch SPAN ports - but forcing traffic to route through security
devices becomes much more challenging with SDN. Placing security devices
everywhere is just not practical for organizations committed to an SDN
infrastructure.
Finally there is the challenge of access controls and
blocking risky applications, connections, and users. A Software Defined Network
needs to react to a dynamic business environment, effectively responding to
spikes in service demands without destabilizing the network. There simply is
not going to be much opportunity in most verticals for security teams to insert
themselves into these processes.
The Security Driven Network concept, defined as security
policies inhibiting evolution to business-enhancing Software Defined Networking,
is a dog that will not hunt for most organizations.
Get in the minds of IT
Some of the security challenges in an SDN world revolve
around the hesitancy to deploy new security technologies. IT can be risk averse
when it comes to evaluating new architectures - especially security architectures,
where the concerns include effectiveness, loss of control, costs, and job
continuity if new technologies fail. It is incumbent on SDN-oriented
security vendors to educate corporate decision makers so they can act without
resorting to old ineffective bromides or a lack of compliance history as
excuses not to change, and to help justify budget line items for SDN security
proof-of-concept projects.
The industry does not understand what it means to operate a
compliant software defined network. We feel it is the vendors that must
interpret compliance standards for SDN, and in some cases form best practice
standards to help guide early adopters.
Thursday, July 9, 2015
Spikes Security's innovative approach to securing browser activity
Browsers present a special problem for security-conscious
organizations. While essential as a ubiquitous interface to cloud-based
applications, browsers also provide handy interfaces for attacks to penetrate
endpoints and the network. Spikes Security is responding to this problem with a
hardware appliance that hosts browser execution in a secure environment deployed
outside the firewalls and away from the corporate network. The Ogren Group
feels this is a significant architectural approach as it affords security teams
a safe harbor for browsers, keeps attacks from spreading through the network, and
provides security teams an opportunity to secure mobile browsing activity.
When an employee launches a browsing session, a secure
connection is transparently made to the Spikes Security appliance. The
appliance fires up a virtual image of the browser which executes in hardware-enforced
isolation. The vendor promises that attacks cannot leap out of isolation to
infect the network or other browsing sessions hosted on the appliance. It is a
clever idea which also offers these benefits:
- Secure user browsing sessions, particularly those on smartphones and tablets, through a corporately supported security device without the hassles of managing endpoint software. This is huge, as IT can offer users heightened endpoint security that is transparent to browsing activity and offers a point of on-premise focus for securing cloud activity.
- Scan all downloads for known threats and audit mobile use of corporate resources. The IT-supported appliance makes it easier to block infected downloads before the file reaches the endpoint.
- Accelerate the timeline for receiving the security advantages of hardware isolation to retard the spread of an attack without having to refresh PCs, wait for Windows upgrades, or offer software solutions for mobile devices.
Spikes Security is a new vendor so the Ogren Group
recommends some practical prudence in evaluating the solution with real users. In
addition to the usual growing pains of new products, there are specific issues that
enterprise buyers must address during the proof of concept. These include:
- Ensure that users do not disable browser settings directing traffic to the security appliance. There will always be users that do not want security teams having visibility into their browsing activity - these users will be noticeable by their absence from the activity logs.
- Assure users that their browsing privacy is not being invaded. Use auditing responsibly - only look at browser access to corporate applications, ignore personal browsing activity and keep users on your side.
- Evaluate the number of concurrent browsing sessions in your organization to plan for the proper number of Spikes Security appliances, and be sure to understand the user impact if browsing demand exceeds appliance capacity.
The Ogren Group believes this is a neat architectural
approach for organizations relying on cloud-based applications - and every
organization has a cloud-based application strategy. Spikes Security is a
promising vendor that, with proper execution, can help organizations protect against
browser-borne infections and confidential data loss.
Friday, June 26, 2015
CIO/CISO Summit Boston
We all have our pet peeves when it comes to security
technologies and practices. One of mine is the hesitancy of security
practitioners to openly share their experiences lest they expose a serious
weakness to the world. Trust me, if you have large vulnerability issues the bad
guys already know about it! You are much better off talking with peers in your
industry to find out what works for them so you can learn from them and make
progress.
The healthy exchange of security ideas from enterprise
leaders was one of the reasons I was excited to be invited to the CIO/CISO
Summit held last week in Boston. This conference provides an opportunity for
CIO/CISOs to participate in roundtable discussions, absorb highlights from
presentations and otherwise network with peers. It is an inspirational idea
that seems like time well spent.
A few points resonated with me from one of the sessions:
Strong
security processes can lead to user fatigue, and user support is critical for
security. We sometimes overlook the impact security decisions can have
on our people. A CPA friend bemoans that accounting guidelines call for a
separate un-memorizable password for each and every client and that passwords
must be regularly changed. So of course he writes them all down in multiple
obvious places so he can work from home or office ... and now has a jaundiced
view of IT security recommendations. Ridiculous. If you are on a security team,
be sure to consider the impact on users and avoid being invasive "for
security's sake" whenever possible.
Technology
is important, but be careful of a false sense of security. Every
vendor promises the next great thing, and many products are indeed great. But
no product does everything great. SIEMs are cool, but your data is already lost
by the time any event is recorded; user analytics are a fascinating approach to
detecting the presence of malware, but there will be false positives in
anything based on behavioral scoring. One takeaway is that fundamentals are
fundamental for a reason - be sure to have and enforce standard operating
environments for important servers, efficient processes to patch critical
vulnerabilities, documented processes to rebuild after attacks, and only run
the latest releases of software. Killer technology is best when supplementing a
security program focused on strong fundamentals.
Reserve
more of your security budget to embrace new user activity. In the
last 5 years where has your organization changed the most and how is your
security program adjusting to those changes? This is a difficult question because
security likes to control stable processes and technology, but things cannot
always stay the way they are. This may mean that capabilities you fought hard
for just a few years ago are suddenly worth a lot less to you now (can you say
MDM?). It is easy to see the growth of the cloud and mobile devices so instead
of trying to force them to behave like your physical infrastructure isn't it
more pragmatic to have security get ahead of user activity? Be willing to
change security processes with the times - even if it means leaving good legacy
stuff behind.
I was impressed with the program put together by the CIO/CISO
Summit. I am hoping that your region has something similar. While it is always
nice to socialize with peers, it is even better to have challenging security
conversations.
Wednesday, May 20, 2015
Research calendar for 2015
My post-RSA research is moving along at a nice accelerating
pace! After being laid up for far too long, I have set an ambitious 2015 plan
intending to cover:
- User controls and behavior analytics,
- Next generation endpoint security,
- Securing virtualized infrastructures,
- Re-imagining file security, and
- Advances in practical network integrity monitoring
My user controls and behavior
analytics report is well underway
with several vendor briefings and a few background-only enterprise briefings
already completed with a June publish date targeted. As usual, my security
segment reports always come up with interesting trends - what started out as a
"protect the business against unauthorized privileged insider activity"
has become more of a "protect the business against malicious threats via inappropriate
user behavior detection". Makes sense in that security must detect malware
grabbing a user's credentials and enterprises always have more budget for
anti-malware provisions than for controlling users. Stay tuned as I dig deeper
into some clever innovations that every security team should be evaluating.
Along the same line, vendors are looking at re-imagining file
security in light of malware protection and the evolution towards cloud
architectures. It is stunning to think in these days of disclosing sensitive
data loss that none of the primary security technologies - firewalls, antivirus,
IPS, IAM, SIEM - have any concept of file security! The best you can do is to
control access to servers, but honestly you cannot control where your files go
once they reach a remote PC. Fortunately, there are vendors worrying about what
happens to your files once they travel beyond the firewalls. There are some
excellent concepts discussed by SC Magazine and FinalCode in a May 21st webcast that you may find interesting.
A lot of vendors are scrambling for a category to detect
attacks that evade classical signature-oriented defenses. I am quite enthused
about the next generation endpoint players and about those looking at the
problem from a network integrity viewpoint. Lots seem to be scrambling towards EDR
even though nobody, including Gartner, really knows what EDR is. So I'll take a
crack at defining next generation endpoint security and network integrity with
an eye to solving specific enterprise problems that cannot be solved via
classical methods.
Finally, let's hope that the government actually does the
right thing by restricting NSA cyber activities, and that the NSA stops
treating laws like Massachusetts drivers treat yellow lights. Just because you
can eavesdrop, collect data on private conversations, and develop malware
attacks doesn't mean that you should. Mother's Day just passed - maybe the NSA
got an earful from their moms on how to behave?
Friday, May 1, 2015
Hard to believe that it has been almost a week since I got
home from RSA 2015! It was a whirlwind week reconnecting with friends and
having fascinating security discussions at every turn. Security is riding high
and this had to be the largest RSA Conference yet whether measured by numbers
of attendees or exhibitors!
Of course, there is always a mix of experiences so here is
my brief recap of the highs and lows of the week.
Things that made me smile:
- The new breed of network security vendors, including Cyphort, Elastica, Lastline, and TaaSera taking dead aim on detecting malware within enterprise networks. They take different approaches, but all of them combine intelligent analysis of network and endpoint behavior to fill in the blanks between AV and IDS systems. Neat stuff!
- FIDO and smartphone-based authentication systems that elevate the prospect of widespread consumer adoption by distributing proof of identity to remote devices. There are so many phones and devices that a person carries that separately purchased and managed security tokens are becoming less and less appealing. I talked with Identiv, Keypasco, Nok Nok, and RSA VIA at the show and came away from each excited about the future direction of authentication.
- A shout out to the RSA Conference itself for their edict banning booth babes. It seems like more than a few sharp female security professionals were being treated as if they were at the conference only for their looks, and of course others were there only for their ability to flaunt their curves and swipe badges. The conference committee put an end to that practice and the RSA experience was far far better as a result. Two thumbs up there!
- Best parties? I bumped into a lot of folks at the Qualys event, and scored an autographed copy of Brian Krebs's Spam Nation for reading on a rainy New England day. I also had a great time clubbing with Royal Blood, thanks to vArmour, where it was dark enough that only a few close to me could laugh at my excuse for moves :).
Things that made me pause:
- "Security is broken". I must have heard this a hundred times throughout the week, from CEOs to demo engineers. Unfortunately, most of the people saying "security is broken" followed it up with an upbeat description of their product's 5.2 dot release - which hasn't moved the security bar in years. I would have liked hearing more innovative attempts to fix security's problems.
- I can't help but feel that the attention paid to Threat Intelligence is the best example of how broken security is. Think about it - it is basically tossing threat information to enterprise security and telling them to go protect themselves. Seems it is our job as security professionals to analyze security threats, protect the enterprise, and help them recover when an attack inevitably breaks through.
- My federal government tax dollars at work. I honestly do not see how the government believes it should be in the consumer security business, and nothing has shown me that they can do the job even adequately. Yet, there were booths in the exhibit halls by the DHS, DHS Science & Technology, FBI, Federal Reserve Bank, NSA and Treasury. I get that DHS is the leading vertical for many security vendors, money talks and lively discourse is good. But wouldn't we be better served if the government figured out how to prosecute cyber-thieves, established national disclosure policies, and educated enterprises on investigative requirements for incident response plans?
Saturday, April 18, 2015
Endpoint Monitoring is a Strategic Imperative for Business Operations
Invincea is starting off what promises to be an exciting RSA
2015 with its Advanced Endpoint Protection announcement, and I am looking forward to catching up with the latest RSA ECAT, Bromium vSentry, Cybereason, and more in the endpoint security space. (Also keen on a few others, but that is for next week.) Here is something I wrote a while ago that still reflects my thinking today:
Continuous endpoint monitoring has become a strategic
imperative for many security organizations. Modern attacks designed to extract
confidential information modify endpoint software, reconnoiter your network
looking for exploitable weaknesses, and connect to externally-sourced servers
to deliver your secrets. The inevitable result is a labor intensive
investigation to detect infected systems and a costly recovery process that
impacts the business. If you are not continuously monitoring your endpoints,
servers and connected user devices, then you will not have the intelligence to
rapidly detect attacks within your perimeter and expeditiously restore normal
business operations.
You have all invested in the latest pattern-matching cyber-security
defenses to prevent attacks from penetrating the network. Traditional
anti-malware is a required fundamental, but has proven incapable of
preventing threats and cleaning up after an infection. In fact, it is difficult
to determine what constitutes best-of-breed anti-malware, and many of you base
purchase decisions on price and business relationships, knowing that you need to
check the compliance box and that this leaves large gaps in your cyber-security
practice that you must account for.
CISOs are now expected to improve operational performance in
detecting security incidents and to reduce the time and energy required to
return infected devices to a secure state after the detection of an attack.
This strategic imperative to integrate detection of and recovery from security
events with business operations drives demands for effective monitoring of
servers and user endpoints. You will also find organizational benefits of
security utilizing endpoint intelligence to better integrate cyber-security
with IT teams.
The main features of a continuous endpoint monitoring
program include:
- Automate behavioral approaches to monitor changes in configurations, network usage, and memory utilization. All attacks leave traces that can be detected: insertion of attack logic into executable code in memory or in persistent storage; probing of your network in search of vulnerable endpoints that can join the attack or that host confidential information the intruder can monetize; and communication with external application services and IP addresses to pilfer your electronic business information assets. Deploy endpoint monitoring to detect unauthorized changes to your infrastructure that may indicate the presence of an attack.
- Use endpoint monitoring to help you confirm changes to your security policy, including deployment of software upgrades and patches, and retirement of obsolete or insecure software. While endpoint monitoring solutions analyze endpoints for the presence of infections, the process also arms you with independent intelligence on actual software configurations. Information on where executable programs are installed in your network can prove invaluable when it comes time to plan and launch attack investigations and cleanup operations. You get what you inspect, not what you expect.
- Endpoint monitoring, as a single source of information on software and network activity, becomes a focal point for the integration of security with business operations. The business integration values of a continuous endpoint monitoring program go well beyond enhancing operational security performance for detecting cyber-threats and returning to a compliant business. IT organizations such as end-user service desks, application services, network management, and quality assurance increasingly use security monitoring technologies as a go-to source of real-time information on what is actually happening on endpoints. They do this because endpoint monitoring reduces errors and makes their jobs easier. You will find IT colleagues using your endpoint monitoring solution to quickly gather the information they need to maintain the infrastructure.
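The "detect unauthorized changes" idea in the first bullet can be sketched in a few lines. This is a toy baseline-and-diff illustration, not any vendor's product; the function names and the demo files are invented. Hash everything once, then compare later scans against that baseline to spot files that were added, removed, or modified.

```python
import hashlib
import os
import tempfile

def snapshot(root):
    """Hash every file under root so later scans can spot unauthorized changes."""
    state = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                state[path] = hashlib.sha256(f.read()).hexdigest()
    return state

def diff(baseline, current):
    """Report files added, removed, or modified since the baseline snapshot."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    modified = sorted(p for p in baseline.keys() & current.keys()
                      if baseline[p] != current[p])
    return added, removed, modified

# Demo: baseline a config directory, then tamper with it.
root = tempfile.mkdtemp()
cfg = os.path.join(root, "app.cfg")
with open(cfg, "w") as f:
    f.write("debug=false")
baseline = snapshot(root)
with open(cfg, "w") as f:       # unauthorized configuration change
    f.write("debug=true")
dropper = os.path.join(root, "dropper.bin")
with open(dropper, "wb") as f:  # new, unexpected executable
    f.write(b"payload")
added, removed, modified = diff(baseline, snapshot(root))
```

Commercial endpoint monitoring adds the continuous, real-time, and behavioral pieces (memory, network, process activity) that a periodic file hash cannot see, but the baseline-then-diff model is the common core.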
You know you need an automated system that can efficiently
and cost-effectively allow you to detect infections before your customers
report them, and accelerate recovery procedures to restore a compliant
business. Your peers in other organizations are utilizing endpoint monitoring
tools as a strategic imperative for operating a secure business. If you are not
leveraging endpoint monitoring in your security practice, this should rise to
the top of your priority list for 2015.
Wednesday, April 1, 2015
RSA is approaching - check out firewall analysis vendors
Anyone managing their corporate firewalls without the use of
modern analysis tools is committing security malpractice.
Every good security program starts with firewalls and the
ability to control network access to critical resources. However, firewalls are only as effective as
the set of rules defining communication access policies. While it is easy to
know when firewalls block legit access to applications - users call up the
service desk and complain - the bigger problem is it is nigh impossible to
detect when firewall rules inadvertently create broad access to your network .
The risk of enticing security incidents via gaping holes in
your network security is just too great to ignore. Ferreting out holes in your
firewall security requires a thoroughness and attention to detail that only an
automated product can provide. It is just asking too much of your best security
expert to find errors of omission and to prove negatives.
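To give a flavor of the kind of check these products automate - and this is a deliberately simplistic sketch, not how AlgoSec, FireMon, or any other vendor works; the rule format and thresholds are invented - here is a pass over a rule set that flags allow-rules with overly broad sources or wide-open ports:

```python
import ipaddress

def overly_permissive(rules, max_prefix=16):
    """Flag allow-rules whose source network is broader than max_prefix
    (e.g. 0.0.0.0/0 or a /8), or that open a service on every port."""
    findings = []
    for rule in rules:
        if rule["action"] != "allow":
            continue
        src = ipaddress.ip_network(rule["src"])
        if src.prefixlen < max_prefix:
            findings.append((rule["name"], f"source {src} too broad"))
        if rule["port"] == "any":
            findings.append((rule["name"], "all ports open"))
    return findings

rules = [
    {"name": "web-in",   "action": "allow", "src": "0.0.0.0/0",   "port": "443"},
    {"name": "legacy",   "action": "allow", "src": "10.0.0.0/8",  "port": "any"},
    {"name": "db-in",    "action": "allow", "src": "10.1.2.0/24", "port": "5432"},
    {"name": "drop-all", "action": "deny",  "src": "0.0.0.0/0",   "port": "any"},
]
findings = overly_permissive(rules)
```

Note that this naive check flags "web-in" even though a public HTTPS listener may be perfectly legitimate - which is exactly why real firewall analysis tools reason about application context, rule interactions, and network paths rather than single rules in isolation.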
The good news is that firewall analysis tools are mature and
are effective. While they are first and foremost security products, you will
find many time saving benefits in helping you manage complex applications,
network reconfigurations, and evolution to virtualized data centers. Any of the
primary vendors will have references that you should talk with to better
understand the benefits.
There are some fine firewall analysis products out there,
including (alphabetically) AlgoSec, FireMon, SolarWinds, and Tufin. RedSeal and Skybox
provide more network path analysis, but are also worth knowing about. If you
have any degree of network complexity, then go get one of these tools now. Consider
it an always-on rule.
Tuesday, November 26, 2013
Last week's security vibes
It has been quite a while! Let me recap selected security
news from vendors I’ve talked with in the past couple of weeks to get up to
date with current events. In most cases I had to wait for their embargoes to
lift – my apologies if I have announced anything early!
Palo Alto Networks and VMware announce that next generation firewalls from PAN will embrace VMW’s NSX to secure traffic between virtual machines as well as between virtual data centers. I thought this was great as it allows vCenter to orchestrate application security
policy both within and between perimeters. In the long run, with software
defined networks, security policy will have to travel with the application to
be enforced locally. This agreement nicely positions Palo Alto and VMware to add much needed flexibility in securing applications as they evolve from
physical to virtual to cloud environments. Love this one!
NetCitadel announced ThreatOptics to enhance an organization’s ability to respond to incidents. What I like about
the vision is that instead of layering analytics on mountains of SIEM data,
NetCitadel kicks off when a network sandbox such as FireEye or Palo Alto Networks
WildFire reports an anomaly. ThreatOptics then reaches out to affected
endpoints with a dissolvable agent to grab detailed host information that it
can then combine with what the network sees to give security organizations better
intelligence to prioritize and respond to incidents. I believe that launching
investigations based on observed suspicious behavior is a concept with impact –
it will be fun to watch NetCitadel run with it!
Mojave Networks, née Clutch Mobile, is stepping beyond mobile
device management to offer a cloud-based security service for mobile devices.
This makes perfectly intuitive sense to me - as most of the action for mobile and
tablet devices takes place in the cloud that’s where security should be!
Dumping a lot of security apps onto your device can’t be the right approach
with issues of battery life, compatibility with popular applications, and
constant upgrades. I like where Mojave is going and the team they’ve assembled.
I wish they would extend their focus beyond small and medium enterprises to
address larger security concerns of larger enterprises, but the market will
soon speak to that.
Adallom is a freshly
launched company with a clever idea to protect SaaS applications. It is a tough
problem as IT needs to protect the business, but does not need to get involved
in personal use issues. The Adallom solution piggybacks on the identity process
to audit cloud activity and implements heuristic profiles similar to those that
have proven successful in detecting credit card fraud. The company still needs
to execute, but they have a great idea and experienced leadership so I look for
more from this exciting company as they move forward!
Prelert announced Anomaly Detective 3.0, a special Splunk
application
that, based on learning a machine’s and network’s normal behavior, promises to
reduce a high volume of security alerts to an actionable level of incidents. It
is an interesting approach to combat the flood of data and alerts that security
teams now have to deal with. I like Splunk a lot (as do lots of others) partly
because of the balance it strikes in delivering value to both IT and security
operations. It looks like Prelert is going to stick to its security roots, but
the Splunk bandwagon is a good one to hitch onto.
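The "learn normal behavior, then surface only deviations" approach can be sketched in a few lines. This is my own minimal illustration, not Prelert's algorithm: baseline a metric from history, then flag only statistically unusual observations so a flood of raw events becomes a short list of incidents:

```python
import statistics

def unusual_values(history, recent, threshold=3.0):
    """Flag recent observations more than `threshold` standard
    deviations from the historical mean -- the crude core of
    reducing a high volume of alerts to an actionable level."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [x for x in recent if abs(x - mean) > threshold * stdev]

# A stretch of hourly failed-login counts, then today's observations.
history = [4, 5, 6, 5, 4, 6, 5, 5, 4, 6]
today = [5, 6, 48, 4]  # 48 stands far outside normal behavior
print(unusual_values(history, today))  # -> [48]
```

Real products model seasonality, multiple metrics, and cross-correlations, but the payoff is the same: analysts review the one anomaly, not the thousand routine events.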
Thursday, October 31, 2013
Happy to be blogging with Computerworld (again)!
Every now and then you get lucky enough to be able to correct an unfortunate decision. For some time I had enjoyed posting my thoughts, opinions and recommendations on Computerworld’s security blog, building up a nice following in the process.
Thanks to the kindness of Computerworld’s editorial team,
who honestly already gets an amazing amount of work done, I have returned to
the beat with weekly contributions. I am looking forward to the coming year and
I hope to see you there!
The first posting talks about threat reports and I hope you enjoy it:
Friday, June 7, 2013
Decision time! Choosing the right firewall analysis approach for you
I am hearing feedback that organizations are checking out Firewall Analysis vendors to save time satisfying firewall change requests and to increase the security quality of each change (e.g. reducing errors that create gaping holes or disrupt application services). These organizations get the operational efficiency benefits, but are unsure whether to prioritize application-oriented Firewall Analysis or threat path-oriented Firewall Analysis, to use terms from my recent Firewall Analysis Saves Time Keeping Application Paths Clear report.
The answer is very clear: firewalls are in place to secure access to applications. That is job 1. Period. You should be prioritizing your evaluations on application-oriented Firewall Analysis solutions because that is what your business most needs right now and in the coming years. Firewall changes are driven by the demands of users and applications – it is simply practical to align Firewall Analysis criteria to meet these demands.
You would think the application-oriented issue would be the thousands of applications organizations need to make accessible to a large mobile user community. And to a certain extent you would be right, as users will feel the brunt of application service disruptions. Ironically enough, it is the back-end complexity of applications that causes the security headaches. Applications now consist of connected web servers, databases, application engines, load balancers, and gateways – many of which are transient virtual images, deployed in corporate data centers, or distributed throughout the cloud. Leveraging your understanding of the relationships across the entire application environment when maintaining firewall rule sets is critical to creating effective rules and to avoiding over-permissive access that could leave a security hole to critical resources undetected for far too long. Application-oriented Firewall Analysis will help you secure the entire application environment and save you considerable administrative time.
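As a toy illustration of what application-oriented analysis checks (my own sketch; commercial products model the application environment far more richly, with zones, address ranges, and transient instances), each allow rule can be compared against the set of flows the applications actually require:

```python
# Flows the application environment actually needs:
# (source tier, destination tier, service). Tier names are invented.
REQUIRED_FLOWS = {
    ("web", "app", "tcp/8080"),
    ("app", "db", "tcp/5432"),
}

def over_permissive(rules):
    """Return allow rules that no known application flow justifies --
    candidates for tightening before they hide a security hole."""
    return [r for r in rules
            if r["action"] == "allow"
            and (r["src"], r["dst"], r["svc"]) not in REQUIRED_FLOWS]

rules = [
    {"action": "allow", "src": "web", "dst": "app", "svc": "tcp/8080"},
    {"action": "allow", "src": "web", "dst": "db",  "svc": "tcp/5432"},  # no app needs this
    {"action": "deny",  "src": "any", "dst": "any", "svc": "any"},
]
print(over_permissive(rules))  # flags the web -> db rule
```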
On the other hand, I am not a big fan of threat path analysis as part of a vulnerability management strategy. And I want to be. The premise is that you do not have to patch key vulnerabilities in servers if threats cannot reach those vulnerabilities, or must pass through an IPS in transit. It is a security form of alchemy, sounding obvious and beautiful until you think it through a little bit. No matter how many hops you analyze, attacks are going to defeat pattern-matching security filters and somehow reach a vulnerable server tucked away in the darkest corner of your data center. They always do. And then your threat path analysis is worthless. Can you imagine your CISO telling the board that vulnerability management of sensitive servers was given a low priority because a threat path analysis vendor said an attack could not reach those servers? Me neither, and I have yet to talk with any enterprise security executive who thinks this is a valid approach. Threat path analysis can help you audit your network for AV and IPS security filters, or trace where an attack may have leapt from an infected server, but it will only be able to help you once or twice a year. For vulnerability management, however, I don’t buy it and neither should you.
Just think about what your most common firewall related help desk tickets say. Probably “I cannot access my application” or “My application performance is terrible” top the list. You are probably not getting a lot of requests to leave serious vulnerabilities in critical servers un-patched. If you must go with threat path-oriented vendors, know that you will only use them once a year or so to make sure all network segments pass traffic through security filters. My advice to you is to start with a strategy of application-oriented Firewall Analysis that will protect your business, keep users happy, and save you time every single day.
Tuesday, June 4, 2013
Webinar: Achieving Continuous Diagnostics & Monitoring
Clear your calendar for tomorrow afternoon's ForeScout webinar based on the government's CDM initiative!
The federal government has created budget for agencies to step up to the challenge of continuous security. I refer to it as continuous compliance, but someone smarter than me saw the potential for a TLA (aka three letter acronym). It is an interesting topic and a good chance to talk about how the network is driving real-time vigilance of the infrastructure. I hope you can listen in at 2:00ET/11:00PT.
Tomorrow will also be the anniversary of Tiananmen Square's unknown rebel. The cyber-security metaphors are just too good to pass up. I start by thinking of the unknown rebel as a CSO :).
Sunday, May 12, 2013
Firewall Analysis Saves Time Keeping Application Paths Clear report is out!
The Firewall Analysis Saves Time Keeping Application Paths Clear report is complete and available!
Please contact me for more info. The teaser is:
Firewalls rely on IT-defined rules in allowing authorized application traffic to flow unencumbered between data centers and users while preventing undesirable traffic from entering the corporate network. These rules, which can number in the thousands per firewall, prescribe allow/deny decisions based on sources, destinations, and the services provided. The more complex the network, the more complex the firewall rule sets, and the more likely IT will encounter disruptive side-effects when changing firewall rules to secure application access.
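The allow/deny matching described above can be sketched in a few lines. This is a deliberately simplified model of my own (zone names invented; real firewalls also handle address ranges, interfaces, NAT, and stateful sessions) showing the first-match semantics that make rule ordering so consequential:

```python
def evaluate(rules, src, dst, svc):
    """First-match rule evaluation: the first rule whose source,
    destination, and service match decides the traffic."""
    for rule in rules:
        if (rule["src"] in (src, "any")
                and rule["dst"] in (dst, "any")
                and rule["svc"] in (svc, "any")):
            return rule["action"]
    return "deny"  # default-deny when nothing matches

rules = [
    {"src": "inside", "dst": "dmz-web", "svc": "tcp/443", "action": "allow"},
    {"src": "any", "dst": "any", "svc": "any", "action": "deny"},
]
print(evaluate(rules, "inside", "dmz-web", "tcp/443"))  # allow
print(evaluate(rules, "inside", "dmz-db", "tcp/5432"))  # deny
```

With thousands of such rules per firewall, it is easy to see how one misplaced entry ripples through every decision below it.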
The primary reason to analyze firewall rule sets is to identify logic errors that open security gaps, violate compliance policies for segmenting regulated data, prevent subsequent rules from firing, or leave rules obsolete due to changes in business services. This leads to business benefits in managing network complexity such as:
• Drive operational costs out of making changes to firewall rule sets by reducing errors, automating compliance reporting, and recommending effective rules based on application requirements.
• Accelerate application deployment cycle times by streamlining firewall change processes to a matter of hours.
• Enable an orderly evolution to application-centric security management for next generation firewalls as well as traditional deployed firewalls.
• Model the impact of new rules before a change is approved to protect against errors that could block application paths.
• Maintain a secure audit log of firewall rules changes to document all changes for compliance reporting.
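One of the logic errors mentioned above, a rule preventing subsequent rules from firing, is the classic shadowed rule. A minimal detector of my own (assuming exact-match fields plus an "any" wildcard; real analyzers reason over address and port ranges) looks like:

```python
def covers(pattern, field):
    """True if `pattern` matches everything `field` matches."""
    return pattern == "any" or pattern == field

def shadowed_rules(rules):
    """Return rules that can never fire because an earlier rule
    already matches all of their traffic."""
    shadowed = []
    for i, later in enumerate(rules):
        for earlier in rules[:i]:
            if all(covers(earlier[f], later[f]) for f in ("src", "dst", "svc")):
                shadowed.append(later)
                break
    return shadowed

rules = [
    {"src": "any", "dst": "dmz-web", "svc": "any", "action": "allow"},
    {"src": "branch", "dst": "dmz-web", "svc": "tcp/443", "action": "deny"},  # never fires
]
print(shadowed_rules(rules))  # the deny rule is dead weight
```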
Firewalls connect businesses to the Internet. The firewall is the one security technology that truly enables a stronger business by securing application paths to users. The Ogren Group believes it is critically important for organizations to apply technology that helps manage accuracy and instills a change process to control operating costs as networks and firewall rule sets grow more complex.
It is far from certain that firewall analysis will be more than a niche market with room for multiple vendors. Firewall analysis vendors are branching into application security motivated by next generation firewall concepts, enterprise security management to reduce operational costs, and threat assessment based on path analysis. The Ogren Group applauds AlgoSec, SolarWinds and Tufin for their vision and execution in Firewall Analysis.
In this report, the Ogren Group presents the features, life cycle, and market strategy of Firewall Analysis. The report concludes with recommendations for vendors and the enterprise buyers they covet.