Tag Archives: virtualization

Top 3 attributes for businesses to benefit from Data Analytics – an Information Security & Business process perspective

Big Data introduces an opportunity that organizations see when merging siloed product operations together to form a service layer or an enhanced hybrid product. Big Data also requires exceptional enterprise intelligence to establish the scaffolding for enterprise growth. That scaffolding requires advanced visibility into information technology systems and the business process matrix.  My thesis … let me elaborate below on a single thread, given this is a subject I have been developing recently…

For Big Data to work, it requires abundant access to systems and data repositories, and the merging and tweaking of data beyond the original data owner's expectations or comprehension. The enterprise that balances the advantage of Big Data analytics with superior scaffolding will enjoy higher run rates and profitability without unfunded cost centers and above-trend OpEx. Without this business intelligence, the opportunity of Big Data will be squandered and the benefits not realized.

The CIO has this ownership, and it is the purview of the Audit Committee to ensure that these risks are understood and tackled. Boards of Directors have proven to value the aggressiveness of Data Analytics equally with the ongoing re-evaluation of the risk tolerance and acceptance points of the business. As one can imagine, this is a familiar yet distinct activity within the executive structure, but three key attributes / activities that indicate a successful approach are as follows:

  1. Vertical awareness – product awareness, strategy, and full line of sight for each major revenue center
  2. Scrum topical teams – risk assessments and activities linked to the product market research initiatives
  3. Senior strategy alignment – what does the Board seek in this DA movement; what do the CEO/CIO envision in these product expansions; what are the audit committee's observations (meaning that they must have visibility and mindfulness of the impact)

Think Big Data is not huge business? … consider these figures:

  • Gartner: Big Data Market is Worth $3.7 Trillion, Generating Over 4 Million Jobs by 2015 – article
  • A good short presentation on the value of pattern-based strategies, by Gartner
  • $29B will be spent on Big Data by IT departments throughout 2012 (Forbes)

Or a classic business case example:

“The cornerstone of his [Sam Walton’s] company’s success ultimately lay in selling goods at the lowest possible price, something he was able to do by pushing aside the middlemen and directly haggling with manufacturers to bring costs down. The idea to “buy it low, stack it high, and sell it cheap” became a sustainable business model largely because Walton, at the behest of David Glass, his eventual successor, heavily invested in software that could track consumer behavior in real time from the bar codes read at Wal-Mart’s checkout counters.

“He shared the real-time data with suppliers to create partnerships that allowed Wal-Mart to exert significant pressure on manufacturers to improve their productivity and become ever more efficient. As Wal-Mart’s influence grew, so did its power to nearly dictate the price, volume, delivery, packaging, and quality of many of its suppliers’ products. The upshot: Walton flipped the supplier-retailer relationship upside down.” – Changing The Industry Balance of Power

A good (no paywall) article on Forbes here breaks down the IT spend related directly to Big Data and compares it against prior years up to 2012 & by industry.

Also check out this MIT Sloan article co-developed with IBM entitled Big Data, Analytics and the path from Insight to Value  – most interesting for me was page 23 relating to Analytics trumping intuition.  This relates to EVERY business process, product, sales opportunity, accounting, fraud detection, compliance initiative, security analytics, defense and response capabilities, power management, etc …  A worthwhile read for each executive.

Think strategically, act vertically, and influence horizontally – scale!

James DeLuccia IV

*See me speak at RSA 2013 on the topic – Passwords are Dead

Connected Systems of PCI – Identifying; Securing; Attesting

The payment card industry standard articulates very prescriptively what should be done for all system components that are within the payment card process.  A common area of confusion is the depth of abstraction that should be applied to the “connected system” element of the standard.  Specifically, the standard states the following:

The PCI DSS security requirements apply to all system components. In the context of PCI DSS, “system components” are defined as any network component, server, or application that is included in or connected to the cardholder data environment. ―”System components” also include any virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors. The cardholder data environment is comprised of people, processes and technology that store, process or transmit cardholder data or sensitive authentication data. Network components include but are not limited to firewalls, switches, routers, wireless access points, network appliances, and other security appliances. Server types include, but are not limited to the following: web, application, database, authentication, mail, proxy, network time protocol (NTP), and domain name server (DNS). Applications include all purchased and custom applications, including internal and external (for example, Internet) applications.

– PCI DSS 2.0 page 10

To simplify – there are the system components that are involved with the payment card process, and then there are the supporting systems (connected systems) that are also in scope of PCI DSS.  An example would be a patch server from which an in-scope PCI system receives patches (and there are dozens of similar examples).

So the rule of thumb on scope most often offered in the industry is:

If you can digitally communicate with a system (over UDP, TCP, etc.), it is a connected system and it is in scope.
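As a quick illustration of that rule of thumb, here is a minimal sketch in Python, assuming a TCP probe is an acceptable proxy for "can digitally communicate" (the hostname and port are illustrative; UDP and other channels would need their own probes):

```python
# Minimal sketch of the rule of thumb above: treat a successful TCP
# connection as evidence of a "connected system". The hostname and
# port are illustrative; UDP and other channels need their own probes.

import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if can_connect("patch-server.internal", 443):  # hypothetical host
    print("patch-server.internal is a connected system - in scope")
```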

A nice 2010 write-up by Jeff Lowder, referring specifically to the security system components, can be found here.

A Korzybski abstraction problem:

How many levels of abstraction should one undertake?  Meaning – should that same patch server then be examined to see what systems are connecting to it and thus also be included in the ‘connected system’ web?

The answer here is generally no – the abstraction is only one level deep.  That doesn’t mean best-practice risk and security practices evaporate, so no leaving that server unpatched on the internet or anything.
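To make the one-level rule concrete, here is a minimal sketch, assuming you already have a connectivity inventory mapping each host to the hosts it can reach; all host names are illustrative:

```python
# Minimal sketch of one-level scope enumeration. Assumes a
# connectivity inventory already exists; all names are illustrative.

connections = {
    "pos-app":      ["card-db", "patch-server", "ntp-server"],
    "card-db":      ["pos-app", "backup-server"],
    "patch-server": ["vendor-mirror"],  # second level: NOT pulled in
}

cde = {"pos-app", "card-db"}  # stores/processes/transmits CHD

def pci_scope(cde, connections):
    """CDE systems plus everything one connection away from them."""
    scope = set(cde)
    for host in cde:
        scope.update(connections.get(host, []))
    return scope

print(sorted(pci_scope(cde, connections)))
# ['backup-server', 'card-db', 'ntp-server', 'patch-server', 'pos-app']
# 'vendor-mirror' stays out: the abstraction is only one level deep.
```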

What Requirements of PCI DSS should be applied to these ‘connected systems’?

The standard makes it clear at the beginning: “The PCI DSS security requirements apply to all…”  So every PCI control applies to the connected system under discussion, identified through the abstraction of the services supporting the CHD environment itself.  Limitations can be applied to the core system components that make up this “connected system” – for example, in the virtualization space, hypervisor risks and controls are applied differently from the standard as a whole.  These exceptions to fully applying the PCI standard directly to the connected system must be limited and made with clear awareness.  [Updated: All requirements should be considered … each QSA is different, but addressing the risk with an eye towards compliance is the best and safest bet.  Shopping for someone to accept a state of controls is a painful road]

An “Open PCI DSS Scoping Toolkit” (pdf), published on August 24, 2012, provides an excellent structure for methodically determining scope and the controls that would be applicable.  While not a product of the PCI SSC or a technical group, there is good content here that should be carefully considered as part of every security and compliance strategy.  Thanks to Eric Brothers for the note! [Updated 12/5/2012]

Another good write-up is offered here, where Andrew Plato nicely elaborates on the two-factor authentication exception, the file integrity monitoring exception, and a few other practices (plus check out the comment threads – very informative, though proceed with caution, as this is one QSA’s interpretation).  Definitely worth a review, though sadly there appear to be no other elaborations from the PCI SSC on this topic.

This write-up is an exploratory effort to seek clarity by consolidating thoughts of leading individuals in the payment card space and client environment realities.  Any additional insights or perspectives you have are welcomed and requested in the comments below!  I’ll update as anything is shared.

Best,

James DeLuccia

What does the SCADA water pump attack mean to your business…

The ability to attack, compromise, and cause damage has existed since the utility industry began connecting these systems to the Internet.  Examples, including the European nation that was attacked 24+ months ago, are easy to locate.  Yesterday an attack (more proof of concept than anything else it could really have been) occurred.  The current public attention to cyber attacks, the nation-state theater risks, and the transparency of this action have raised awareness beyond the closed professional circles within Information Security.  There are a number of interesting write-ups, and I would suggest carefully reading a few for a balanced perspective.

What this means for your Utility company is that the abstract threat modeling exercise that considers these attack vectors should be conducted more thoroughly, with real risk and mitigation decisions progressing up to the Board of Directors.

As for everyone else who is a customer of such utility companies, BCP/DR plans should be updated to reflect the possibility of such a loss of services.  Business enterprise information security / risk management programs (+vendor management) should elevate the priority given to utility service providers (including cellular operators).  These actions should directly impact the annual/ongoing risk assessments and establish an expectation of security assessment and assurance on a regular basis from these service providers.

It is an interesting quandary that Cloud service providers are vetted and assessed more rigorously than Utility service providers, the original cloud.

Thoughts .. challenges?

James DeLuccia IV


Convergence Risk: Google Chrome and Extensions, at BlackHat 2011

Interesting quotes from the researchers who demonstrated attack vectors in Google’s Chrome during BlackHat 2011:

“The software security model we’ve been dealing with for decades now has been reframed,” Johansen said.  “It’s moved into the cloud and if you’re logged into bank, social network and email accounts, why do I care what’s stored in your hard drive?”

  • An important illumination regarding the shifting risk landscape.  How the user interfaces with data and the system has changed, and this challenges the current technology controls relied upon to safeguard intellectual property.
  • What is the effective rate of end-point security (malware / phishing agents, anti-virus) in this new use case?
  • What is being deployed and effective – policy, procedure, technology, a hybrid?

“While the Chrome browser has a sandboxing security feature to prevent an attack from accessing critical system processes, Chrome extensions are an exception to the rule. They can communicate among each other, making it fairly easy for an attacker to jump from a flawed extension to steal data from a secure extension.”

  • Speaks to the issue of convergence of apps that are emerging on iPhones, Androids, respective tablets, TVs, browsers, operating systems, etc…  Similar to the fragmentation attacks of the past – where packets would be innocuous separately, but once all were received they would re-form into something capable of malicious activity.

An interesting extension of the risk here is that the platform and / or devices may be trusted and accepted by enterprises, but it is these Apps / Widgets / Extensions that are creating the security exposures.  This requires a policy and process for understanding the state of these platforms (platforms here including all mobile devices, browsers, and similar App-Loadable environments) beyond the gold configuration build.
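As a rough illustration of such a process, here is a minimal sketch that audits the extensions installed in a Chrome profile against a corporate allowlist. The profile path is Chrome's default on Linux, and the approved extension ID is a made-up placeholder:

```python
# Minimal sketch: compare the extensions actually installed in a Chrome
# profile against a corporate allowlist. The profile path is Chrome's
# default on Linux; the approved ID is a made-up placeholder.

from pathlib import Path

APPROVED = {"aapocclcgogkmnckokdopfmhonfmgoek"}  # hypothetical allowlist

ext_dir = Path.home() / ".config/google-chrome/Default/Extensions"
installed = (
    {p.name for p in ext_dir.iterdir() if p.is_dir()}
    if ext_dir.exists() else set()
)

for ext_id in sorted(installed - APPROVED):
    print(f"Extension {ext_id} is not on the approved list")
```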

Another article on the Google Chrome extension risk described above.

Thoughts?

James DeLuccia

#RSAC Panel on Cloud – security and privacy

Security, privacy, and liability are top issues for #Cloud security and a popular topic at #RSAC (RSA SFO 2011 Conference).  The first session today was moderated by Drue Reeves (Gartner), with Michelle Dennedy (iDennedy), Tanya Forsheit (InfoLawGroup LLP), Archie Reed (HP), and Eran Feigenbaum (Google) on the panel.  A great discussion with lots of interesting points; I would highlight the ideas for managing security pragmatically for organizations.  Below are my notes.  Apologies for broken flows in logic – in trying to capture the ideas put forward by the panel, I sometimes got lost in the discussion.

Customers cannot rely only on the provider to ensure data confidentiality and compliance, and are seeking assurances.

Cloud-Risks

  • Eran Feigenbaum (Google) – Customers want more transparency in the Cloud generally.  Google is seeing smaller companies move into the cloud, and the service and type of cloud sought varies.  Clouds also vary in their ability to serve (Gmail in 2010 had uptime of 99.984%).
  • Panel – Due diligence is necessary for both sides of the customer-cloud provider model.  Both sides must get a fair assessment of what is happening today.  Understanding what the customer is doing individually creates an ‘honest conversation’.  Create a performance and services assessment of internal (corporate data center and software services) delivery, and then determine which Cloud providers meet the current and future state target.  Understanding what is essential to your business is critical to having reasonable expectations and a proper cost/benefit return.

Legal, procurement, internal audit, business, and technology team members must get together to determine what is important and rate these items.  This then allows for better data set identification and procurement of service providers.

  • The end result is that the business needs to determine its risk tolerance – what it is willing to accept.  The universe of Cloud providers allows businesses to identify those that can meet and demonstrate adherence to the criteria that matter to the business.

Focusing on the dataset is what matters, along with consideration of the period of time involved.  The dataset released to the cloud must meet your internal safeguard and risk tolerance criteria.

  1. Set Principles first – save money, keep agility, achieve availability
  2. Check application – is it generating revenue; does it create a loss of life scenario
  3. Keeping it in-house does not eliminate the risk vs. having it in the cloud.

Must focus at the strategic level …

Shadow IT, an example:

  • Shadow IT is a problem and is still ongoing.  One example came from a bank in Canada, where the marketing department ran a survey in Salesforce.com.  The problem was that, in using the system, the data of private Canadian citizens was crossing the U.S. border – which is against the law.  This required a re-architecture effort to correct these activities.

There is a need for awareness and education on the implications of engaging cloud providers and how the flow of datasets impacts the business’ legal obligations.

Consumer Technology in Business:

  • Eran – 50% of people surveyed installed applications that are not allowed by their corporations and IT.  The consumerization of technology is creating complex and intertwined technology ecosystems that must be considered by the business, risk management, legal, and security.
  • It is your responsibility to do the due diligence on what the cloud providers are doing to provide assurance, and work with those that provide such information.  The necessity is a balance between providing sufficient information security confidence and mapping out attack vectors for criminals.

Google Growth rate on Cloud:

  • 3,000 new businesses are signing up for the Google cloud every day – making it impossible to respond to each one individually.

Data Location

  • It is up to the customer to know the legal aspects and appropriate uses of the business data.  Understanding the transportation of sensitive data across borders is the business’ responsibility.
  • It is up to the business to understand and act to protect the data of the business – pushing the information onto a Cloud provider is not a transfer of risk / ownership / responsibility.

If you had the chance today to rebuild your systems, would you do it the same way?

  • Cloud does provide unique technologies beyond what you already have today.  Cloud providers have been able to rebuild their data centers around today’s data architectures and to leverage new technology.

Points of reality and impossibility

  • If an organization does not have deep Identity and Access Management (IAM), it is a poor idea to try to bolt this on while transitioning to the cloud.  Reasonable expectations must be had both by the consumer and of the cloud provider.

Liability and Allocation between Customers and Clouds

  • Customers with data in their own data centers are basically self-insuring their operations.  When moving to the Cloud, these customers are now transferring this to a third party.  There is a financial aspect here.  How can liability be balanced between customer and service provider?
  • When the Customer absorbs all liability, they are hesitant to put X data in the Cloud.  If the Cloud absorbs liability, the cost will be too high.

Data in Space

  • People are putting data in the cloud based on rash decisions, without unique risk assessments of the data sets and providers.

Agreeing on Liability in the Cloud

  • Organizations have been able to negotiate liability clauses with cloud providers.  Ponemon Institute figures are used in determining the limit of liability and are a good way of arriving at a proper number that is in line with industry figures.  For example, if the Ponemon Institute says the cost of a breach per record is $224 and the business has 20,000 employee records, the limit of liability should equal the product of these two numbers (see the worked example after this list), and this has proven to be a reasonable discussion with cloud providers.  Indemnification is generally a non-discussion point.
  • The world will move into specialized applications and services.  These point organizations allow for specific legal and technology considerations that are appropriate for that niche.  This is seen at the contract level, notification levels, prioritization on RTR, and across many areas.
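For concreteness, here is the arithmetic of that heuristic worked through in a few lines of Python (the $224-per-record cost and 20,000 records are the panel's illustrative figures, not current Ponemon data):

```python
# The panel's liability heuristic worked through. The $224 per-record
# cost and 20,000 records are the illustrative figures quoted above,
# not current Ponemon data.

cost_per_record = 224   # estimated breach cost per record (USD)
records_held = 20_000   # records entrusted to the provider

limit_of_liability = cost_per_record * records_held
print(f"Suggested limit of liability: ${limit_of_liability:,}")
# Suggested limit of liability: $4,480,000
```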

Everything is negotiable for the right amount of money or love – Eran

  • Cloud providers do not like to do one-offs, but providers including Google will negotiate.

APPROACH to cleanse data with confidence

  • The best tip is to encrypt data online… When de-provisioning systems and cleansing, consider fully rewriting databases / applications / instances with clean values (a sketch follows this list).  Is this a practical method of ensuring the data is sanitized?  How long should the data be in this state to ensure the clean values are pushed to other parallel instances?
  • Are PCI, SIGS, and similar standards for financial services appropriate for the Cloud provider?  The responsibility always remains with the data owner.  Internal controls must be migrated out to the cloud as evenly as they are applied internally.  It is the business’ risk and responsibility.
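Here is a minimal sketch of the overwrite idea from the list above, assuming a relational store; the table, columns, and in-memory sqlite3 database are illustrative stand-ins for a real production system:

```python
# Minimal sketch of the overwrite-before-deprovisioning idea: rewrite
# sensitive columns with clean values so replicas converge on sanitized
# data. The table, columns, and in-memory sqlite3 database are
# illustrative stand-ins for a real production system.

import sqlite3

def scrub_customers(conn: sqlite3.Connection) -> None:
    """Overwrite sensitive fields with neutral values."""
    conn.execute(
        "UPDATE customers SET name = 'REDACTED', card_number = '0000000000000000'"
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, card_number TEXT)")
conn.execute("INSERT INTO customers VALUES ('Alice', '4111111111111111')")
scrub_customers(conn)
print(conn.execute("SELECT * FROM customers").fetchall())
# [('REDACTED', '0000000000000000')]
```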

Recommendations of the Panel

Archie Reed:  Everyone becomes a broker, and I recommend that IT teams embrace this role.  They need to understand how to source, and the chemistry and structure of the IT organization need to shift.  It will and must include working with the business to involve such parties as legal, internal audit, and risk management.

Tanya Forsheit:  I would love to see standards developed with the customers participating in a meaningful way.  The provider side has thought these through seriously over the last few years.  The business-to-business dialogue within the Cloud – Customer relationship is weak.  Be reasonable.

Eran: There is a paradigm shift from a server you can touch, managed by an Admin that you hired, to one that is acquired by contract through a Cloud provider.  Google has over 200 security professionals.  Bank robbers go where the data is – the Cloud has the data.

How do you respond to a vulnerability?  How do you respond to a hack?  … ARE THESE the new / right questions to ask of Cloud providers?

Michelle Dennedy: Leverage and plan for a loss with cloud providers.

Drue:  There are risks you can identify and mitigate on the technology side, and there are financial tools (insurance, etc…) that must be deployed.

Question and Answer:

  • Cloud providers have the opportunity to provide a dashboard to track and demonstrate controls.  These are hard, we know.
  • FedRAMP and continuous auditing is a future component that (some) Cloud providers will adhere to and demonstrate.

An engaging panel and some interesting and useful points raised.  Welcome any feedback and expansions on the ideas above,

James DeLuccia