Continuing on from my last post, titled Security and Disruptive Innovation Part II: Examples of Disruptive Innovation/Technology in the Security Space, we’re going to finish the tour of security-specific examples, reflecting on security practices, movements and methodologies and how disruptors, market pressures and technology are impacting what we do and how we do it.
16. Software as a Service (SaaS)
SaaS is a really interesting disruptive element to the traditional approach of deploying applications and services; so much so that, in many cases, the business can sidestep IT and Security altogether, spinning up a new offering without involving either group. There’s no complex infrastructure to buy and install and no obstruction to the business process. Point, click, deploy. Reduced time to market, competitive advantage and low cost are very, very sexy rationalizations.
On the one hand, we have the agility, flexibility and innovation that SaaS brings; on the other, we need to recognize how SaaS intersects with the lifecycle management of applications. The natural commoditization of software functionality that comes as a by-product of the "webification" of many older applications makes SaaS even more attractive, since it offers a more cost-effective alternative. Take WebEx, Microsoft Live, Salesforce.com and Google Apps as examples.
There are a number of other interesting collision spaces that impact information security. Beyond the application of general security controls in a hosted model, because the application and data are hosted offsite, it is very important to understand how and where data is stored, backed up, consumed, re-used and secured throughout its lifecycle. As security practitioners, we lose quite a bit of operational visibility in the SaaS model.
Furthermore, one of the most important issues surrounding data security and SaaS is portability: can you take the data and transpose its use from one service to another? Who owns it? What format is it in? If your wager on a SaaS provider does not pay off, what happens to the information?
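To make that portability question concrete, here’s a minimal sketch in Python (standard library only) of what an exit strategy might look like in practice: pulling records from a hypothetical SaaS export endpoint and normalizing them into a vendor-neutral format you control. The endpoint, token and field names are all invented for illustration; the point is simply that if you can’t write something like this against your provider today, you already have part of your answer about who really owns the data.

```python
import csv
import json
import urllib.request

# Hypothetical export endpoint and token -- every SaaS vendor exposes (or
# doesn't expose) something different, which is exactly the portability risk.
EXPORT_URL = "https://api.example-saas.com/v1/export/contacts"
API_TOKEN = "REPLACE_ME"


def fetch_export(url, token):
    """Pull the vendor's JSON export; assumes a simple bearer-token API."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def normalize(records, out_path):
    """Write records to plain CSV -- a neutral format you own and can re-import elsewhere."""
    fields = ["id", "name", "email", "owner", "classification"]
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        for rec in records:
            writer.writerow(rec)


if __name__ == "__main__":
    normalize(fetch_export(EXPORT_URL, API_TOKEN), "contacts_backup.csv")
```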
SaaS, in combination with virtualization and utility/grid computing, is one of the elements that will have a profound impact on the way in which we secure our assets. See the sections on information centricity and next generation centers of data below.
17. Virtualization
Virtualization is a game-changing technology enabler that provides economic, operational and resilience benefits to the business. The innovations delivered by this disruptor are plainly visible.
Virtualization in servers today allows us to realize the first of many foundational building blocks of future operating system architectures and next generation computing platforms, such as the promises offered by grid and utility computing models.
While many of the technology advancements related to these "sidelined futures" have been in the works for many years, most failed to gain mainstream adoption because, despite being technically feasible, they were not economically viable. This is changing. Grid and utility computing are starting to really take hold thanks to low-cost compute stacks, high-speed I/O, and distributed processing/virtualization capabilities.
Virtualization is not constrained simply to the physical consolidation of server iron; it extends to all elements of the computing experience: desktops, data, networks, applications, storage, provisioning, deployment and security.
It’s very clear that, like most emerging technologies, we are in the position of playing catch-up with securing the utility that virtualization delivers. We’re seeing wholesale shifts in the operationalization of IT resources, and it will continue to radically impact the way in which we think about securing the assets most important to us.
In many cases, those who were primarily responsible for the visibility and security of information across well-defined boundaries of trust, classification and distribution will find themselves in need of new methods, tools and skillsets when virtualization is adopted in their enterprise.
To generally argue whether virtualization provides "more" or "less" security as compared to non-virtualized environments is an interesting debate, but one that offers little in the way of relevant assistance to those faced with securing virtualized environments today.
Any emerging technology yields new attack surfaces, exposes vulnerabilities and provides new opportunities for managing risk when threats arise. However, how "more" or "less" secure one is when implementing virtualization is a subjective measurement that depends on business impact, on how one provisions, administers and deploys solutions, and on how one ultimately applies security controls to the environment.
Realistically, if your security is not up to par in non-virtualized, physically-isolated infrastructure, you will be comforted by the lack of change when deploying virtualization; it will be equally as good…
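As a trivial illustration of how much the outcome depends on provisioning and deployment choices, here’s a small sketch in Python (my own example, not tied to any particular hypervisor API) that flags guests from different trust zones landing on the same physical host. A policy check like this says more about your real exposure than the abstract "more or less secure" debate ever will.

```python
from collections import defaultdict

# Hypothetical inventory: which VM runs on which host, and the trust zone it
# was provisioned into. In practice this would come from your hypervisor
# management API or CMDB.
INVENTORY = [
    {"vm": "web01", "host": "esx-a", "zone": "dmz"},
    {"vm": "db01",  "host": "esx-a", "zone": "internal"},
    {"vm": "web02", "host": "esx-b", "zone": "dmz"},
]


def find_mixed_hosts(inventory):
    """Return hosts that are running guests from more than one trust zone."""
    zones_per_host = defaultdict(set)
    for item in inventory:
        zones_per_host[item["host"]].add(item["zone"])
    return {host: zones for host, zones in zones_per_host.items() if len(zones) > 1}


if __name__ == "__main__":
    for host, zones in find_mixed_hosts(INVENTORY).items():
        print(f"WARNING: {host} mixes trust zones: {', '.join(sorted(zones))}")
```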
There are numerous resources available now discussing the "security" things we should think about when deploying virtualization. You can find many on my blog here.
18. De-/Re-Perimeterization
This topic is near and dear to my heart and inspires some very passionate discussion when raised amongst our community.
Some of the reasons for heated commentary come from the poor marketing of the underlying message as well as the name of the concept.
Whether you call it de-perimeterization, re-perimeterization or radical externalization, this concept argues that the way in which security is practiced today is outdated and outmoded and requires a new model, one that banishes the notion that the inside and outside of our companies are in any way distinguishable today and recognizes that our existing solutions are therefore ineffective at defending them.
De-/Re-perimeterization does not mean that you should scrap your security program or controls in favor of a new-fangled dogma and set of technology. It doesn’t mean that one should throw away the firewalls so abundantly prevalent at the "perimeter" borders of the network.
It does, however, suggest that you should redefine the notion of the perimeter. The perimeter, despite its many holes, is like a colander: it filters out the big chunks at the edge. But the problem doesn’t lie with an arbitrary line in the sand; it permeates the computing paradigm and the access modalities we’ve adopted to provide access to our most important assets.
Trying to draw a "perimeter" box around an amorphous and dynamic abstraction of our intellectual property in any form is a losing proposition.
However, the perimeter isn’t disappearing. In fact, I maintain that it’s multiplying, but the diameter is collapsing.
Every element in the network is becoming its own "micro-perimeter," and we have to think about how to manage and secure hundreds or thousands of these micro-perimeters. That means re-thinking which problems we actually face today and how we focus on solving them, without being held hostage by vendors who, in the name of "defense in depth," constantly push the equivalent of vinyl siding while the foundations of our houses silently rot away.
"Defense in depth" has really become "defense in width." As we deploy more and more security "solutions" that all want to sit in-line with one another yet do not interoperate, intercommunicate or integrate, we’re not actually solving the problem; we’re treating the symptoms.
We really need endpoints that can survive on their own in assuredly hostile environments, using mutual authentication and encryption, handling data that can self-describe the security controls needed to protect it. This is the notion of information survivability versus information security.
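To give a rough sense of what mutual authentication at the endpoint could look like with tools we already have, here’s a minimal sketch using Python’s standard ssl module for mutual TLS: the client verifies the server’s certificate and presents its own, so neither side has to trust the network it happens to sit on. The host and certificate paths are placeholders; this is an illustration, not a survivability architecture.

```python
import socket
import ssl

# Placeholder paths and host -- substitute your own PKI material.
CA_CERT = "ca.pem"          # trust anchor used to verify the peer
CLIENT_CERT = "client.pem"  # this endpoint's certificate
CLIENT_KEY = "client.key"   # and its private key
SERVER = ("service.internal.example", 8443)


def connect_mutually_authenticated():
    """Open a TLS connection in which both sides must prove their identity."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.load_verify_locations(CA_CERT)            # verify the server's cert chain
    ctx.load_cert_chain(CLIENT_CERT, CLIENT_KEY)  # present our own identity
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.check_hostname = True

    raw = socket.create_connection(SERVER)
    conn = ctx.wrap_socket(raw, server_hostname=SERVER[0])
    print("Negotiated", conn.version(), "with verified peer", conn.getpeercert()["subject"])
    return conn


if __name__ == "__main__":
    connect_mutually_authenticated()
```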
This is very much about driving progress through pressure on developers and vendors to produce more secure operating systems, applications and protocols. In the long term, it will require wholesale architectural changes to our infrastructure.
The reality is that these changes are arriving in the form of things like virtualization, SaaS, and even the adoption of consumer technologies as they force us to examine what, how and why we do what we do.
Progress is being made, but it will take continued effort to realize the benefits to come.
19. Information Centricity
Building off the themes of SaaS and the de-/re-perimeterization concepts, the notion of what and how we protect our information really comes to light in the topic of information centricity.
You may have heard the term "data-centric" security, but I despise it because, quite frankly, most individuals and companies are overloaded; we’re data rich and information poor.
What we need to do is allow ourselves not to be overwhelmed by the sheer mountains of "data" but rather determine what "information" matters to us most and organize our efforts around protecting it in context.
Today we have networks that cannot provide context and hosts that cannot be trusted to report their status, so it’s no wonder we’re in a heap of trouble.
We need to look at the tenets described in the de-/re-perimeterization topics above and recognize the wisdom of the notion that "…access to data should be controlled by the security attributes of the data itself."
If we think of controlling the flow or "routing" of information by putting in place classification systems that work (content in context…), we have a fighting chance of ensuring that the right data gets only to the right people at the right time.
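Here’s a toy sketch of "content in context" in Python: the access decision is driven by the classification attribute attached to the data itself plus the context of the request (the subject’s clearance and where the request comes from), rather than by which network segment a host happens to sit on. The labels and rules are invented for illustration.

```python
from dataclasses import dataclass

# Invented classification ladder -- higher number means more sensitive.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}


@dataclass
class Document:
    name: str
    classification: str  # attribute carried with the data itself


@dataclass
class Request:
    user: str
    clearance: str       # subject attribute
    network: str         # context: "corporate", "vpn", "unknown"


def may_access(doc: Document, req: Request) -> bool:
    """Decide access from the data's own attributes plus request context."""
    if LEVELS[req.clearance] < LEVELS[doc.classification]:
        return False
    # Context rule: restricted material never leaves managed networks.
    if doc.classification == "restricted" and req.network not in ("corporate", "vpn"):
        return False
    return True


if __name__ == "__main__":
    doc = Document("q3_forecast.xlsx", "confidential")
    print(may_access(doc, Request("alice", "confidential", "vpn")))   # True
    print(may_access(doc, Request("bob", "internal", "corporate")))   # False
```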
Without blurring the discussion with the taglines of ERM/DRM, controlling information flow and becoming information centric rather than host or network centric is critically important, especially when you consider the fact that your data is not where you think it is…
20. Next Generation Centers of Data
This concept is clear and concise.
Today, a "data center" is simply a place where servers go to die.
A "center of data" on the other hand, is an abstraction that points to anywhere where data is created, processed, stored, secured and consumed. That doesn’t mean a monolithic building with a keypad out front and a chunk of A/C and battery backup.
In short, given innovations such as virtualization, grid/utility services, SaaS, de-/re-perimeterization and the consumerization of IT, can you honestly tell me that you know where your data is and why? No.
The next generation centers of data really become the steam that feeds the "data pumps" powering information flow. Even as the compute stacks become physically consolidated, the processing and information flow become more distributed.
Processing architectures and operational realities are starting to provide radically different approaches to the traditional data center. Take Sun’s Project Blackbox or Google’s distributed processing clusters, for example. Combined with grid/utility computing models, instead of fixed resource affinity one looks at pooled sets of resources and distributed computing capacity that are not constrained by the physical brick-and-mortar walls of today.
If applications, information, processes, storage, backup, and presentation are all distributed across these pools of resources, how can the security of today provide what we need to ensure even the very basic constructs of confidentiality, integrity and availability?
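As a small illustration of why fixed resource affinity disappears in this model, here’s a sketch in Python using the standard concurrent.futures pool: tasks land on whichever worker process is free, and nothing in the calling code knows, or cares, where the work (and the data that goes with it) actually executes. The workload itself is invented.

```python
import os
from concurrent.futures import ProcessPoolExecutor


def process_record(record: str) -> str:
    """Pretend workload: note that it reports *where* it ran."""
    return f"{record} processed by worker pid={os.getpid()}"


if __name__ == "__main__":
    records = [f"record-{i}" for i in range(8)]
    # The pool decides placement; the caller has no say in (or visibility of)
    # which worker touches which piece of data -- the scheduling analogue of
    # "your data is not where you think it is."
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(process_record, records):
            print(result)
```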
Next we will explore how to take these and future examples of emerging disruptive innovation and map them to a framework that will allow you to begin embracing them rather than reacting to them after the fact.