Tag Archives: james deluccia

Big Data is in its early maturity stages, and could learn greatly from infosec, re: the Google Flu Trends failure

The concept of analysing large data sets, crossing data sets, and seeking the emergence of new insights and better clarity is the constant pursuit of Big Data. Given the volume of data being produced by people and computing systems, stored, and ultimately now available for analysis, there are many possible applications that have not yet been designed.

The challenge with any new 'science' is that the path from concept to application is not always a straight line, or a line that ends where you were hoping. The implications for businesses using this technology, as with information security, require an understanding of its possibilities and weaknesses. False positives and exaggerations were a problem of information security's past, and now the problem seems almost understated.

An article from Harvard Business Review details how the Google Flu Trends project failed in 100 out of 108 comparable periods. The article is worth a read, but I wanted to highlight two sections below as they relate to business leadership.

The quote picks up where the author is speaking about the problem of the model:

“The first sign of trouble emerged in 2009, shortly after GFT launched, when it completely missed the swine flu pandemic… it’s been wrong since August 2011. The Science article further points out that a simplistic forecasting model—a model as basic as one that predicts the temperature by looking at recent-past temperatures—would have forecasted flu better than GFT.

So in this analysis the model and the Big Data source were inaccurate. There are many cases where such events occur; if you have ever followed the financial markets and their predictions, you see they are more often wrong than right. In fact, it is a psychological flaw that we as humans zero in not on the predictions that were wrong, but on those that were right. This is a risky proposition in anything, and it is important for us in business to focus on the causes of such weakness and not be distracted by false positives or convenient answers.
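The "simplistic forecasting model" the Science article describes can be sketched in a few lines: predict the next value from recent-past values alone. The data below is hypothetical, purely to illustrate the baseline GFT was measured against.

```python
# A naive "persistence" baseline: forecast the next value as the
# mean of the last few observations. All figures are hypothetical.

def persistence_forecast(history, window=3):
    """Predict the next value as the average of the last `window` values."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical weekly flu-activity index values
weekly_flu_index = [2.1, 2.4, 2.6, 3.0, 3.3]

prediction = persistence_forecast(weekly_flu_index)
print(round(prediction, 2))  # → 2.97, the mean of 2.6, 3.0, 3.3
```

The point is not that this model is good, but that a model this trivial outperformed a heavily engineered Big Data pipeline, which says something about validating against simple baselines.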

The article follows up the above conclusion with this statement relating to the result:

“In fact, GFT’s poor track record is hardly a secret to big data and GFT followers like me, and it points to a little bit of a big problem in the big data business that many of us have been discussing: Data validity is being consistently overstated. As the Harvard researchers warn: “The core challenge is that most big data that have received popular attention are not the output of instruments designed to produce valid and reliable data amenable for scientific analysis.”

The quality of the data is challenged here as being at fault, and I would challenge that ..

The analogy is from information security, where false positives and similar trends were awful in the beginning and have become much better over time. The key inputs of data, and the analysis within information security, come from sources that are commonly uncontrolled and certainly not the most reliable for scientific analysis. We live in a dirty-data world, where systems behave uniquely to the person interfacing with them.

We must continue to develop tolerances in our big data analysis and in the systems we use to seek benefit from them. This must be balanced with criticism, to ensure that the source and results are true and not an anomaly.

Of course, the counter argument .. could be: if the recommendation is to learn from information security as it has had to live in a dirty data world, should information security instead be focusing on creating “instruments designed to produce valid and reliable data amenable for scientific analysis”? Has this already occurred? At every system component?

A grand adventure,

James

 

What do major developments in big data, cloud, mobile, and social media mean? A CISO perspective..


Tuesday afternoon, the CISO-T18 session – Mega-Trends in Information Risk Management for 2013 and Beyond: CISO Views – focused on the results of a survey sponsored by RSA (link below).  It provided a backdrop for some good conversation, but more so it gave me a nice environment to elaborate on some personal observations and ideas.  The first tweet I sent hammered the main slide:

“Major developments with Big Data, Cloud, Mobile, and Social media” – the context and reality here is cavernous..

My analysis and near-random breakdown of this tweet follows, with quotes pulled from the panel.

First off – be aware that these key phrases / buzzwords mean different things to different departments and at each level (strategic executives through tactical teams). Big Data analytics may not be a backend operational pursuit, but a revenue-generating front-end activity (such as executed by Walmart). These different instantiations are likely happening at different levels, with varied visibility across the organization.

“Owning” the IT infrastructure is not a control that prevents the different groups from launching into these other ‘major developments’.

The cost effectiveness of the platforms designed to serve businesses (i.e., Heroku, Puppet Labs, AWS, etc…) is what is defining the new cost structure. CIO and CISO must

>The cloud is not cheaper if it does not have any controls. This creates a risk of the data being lost due to “no controls” – highlighted by Melanie from the panel.  <– I don’t believe this statement is generally true; it is generally FUD.

Specifically – there is a service level expectation that cloud service providers compensate for the lack of auditability of those “controls”. There are motions to provide a level of assurance for these cloud providers beyond the ancient method established through ‘right to audit‘.

A method of approaching these challenging trends, specifically Big Data, is below, as highlighted by one of the CISOs (apologies, I missed his name) with my additions:

  • Data flow mapping is key to providing efficient and positive ‘build it’ product development. It helps you understand what matters (to support and keep operational), but also lets you see if anything is breaking as a result.
  • Breaking = violating a contract, breaking a compliance requirement, or negatively affecting other systems and user requirements.
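A data-flow map can be as simple as a list of edges recording what data moves between systems, which makes "breaking" changes mechanically detectable. A minimal sketch, with all system names and the regulated-data category being hypothetical:

```python
# Minimal sketch of a data-flow map: each edge records what data moves
# between which systems, so violations (regulated data reaching an
# unapproved system) can be flagged. All names are hypothetical.

flows = [
    {"src": "web_app",   "dst": "orders_db",   "data": "payment_card"},
    {"src": "orders_db", "dst": "analytics",   "data": "order_totals"},
    {"src": "analytics", "dst": "third_party", "data": "payment_card"},
]

# Systems cleared to receive regulated (card) data
approved_for_regulated = {"orders_db"}

def find_violations(flows, approved):
    """Return flows that move regulated data into an unapproved system."""
    return [f for f in flows
            if f["data"] == "payment_card" and f["dst"] not in approved]

for f in find_violations(flows, approved_for_regulated):
    print(f["src"], "->", f["dst"], "moves", f["data"])  # flags third_party
```

Even a toy map like this shows the value: the contract-violating flow to the third party is visible the moment it is added, rather than discovered in an audit.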

Getting things Done – the CISO 

Two observations impacting the CISO and information technology organization include:

  1. The Board is starting to become aware and seeking to see how information security is woven within ERM
  2. Budgets are not getting bigger, and likely shrinking due to expectations of productivity gains / efficiency / cloud / etc…

Rationalization on direction, controls, and security responses must be fast for making decisions and executing…

Your ability to get things done has little to do with YOU doing things, but with getting others to do things. Enabling, partnering, and teaming is what makes the business move. CIO and CISO must create positive build-it inertia.

Support and partner with the “middle management” – the API of the business, if you will.

  • We too often focus on “getting to the board” and deploying / securing the “end points” .. Those end points are the USERS, and between them and the Board are your API to achieving your personal objectives.

Vendor Management vs procurement of yester-year

Acquiring technology and services must be done through a renewed and redeveloped vendor management program. The current procurement team’s competencies are inadequate, and they lack the toolsets to ensure these providers are meeting the existing threats. To be a risk-adaptive organization you must tackle these vendors with renewed rigor. Buying the cheapest parts and services today does not mean what it meant 10 years ago. Today the copied, reverse-engineered Cisco router alternative comes with an impressive number of problems immediately after acquisition. Buying is easy – it is the operational continuance that is difficult. This is highlighted by the 10,000+ vulnerabilities in networked devices that will never be updated within corporations, whose risks must be mitigated at a very high and constant cost.

Panel referenced the following report:

http://www.emc.com/microsites/rsa/security-for-business-innovation-council.htm

Thank you to the panel for helping create a space to think and seek answers, or at least more questions!

James DeLuccia IV

Information Security executives … is responsibility being abdicated?

Is the “it is your decision, not ours” statement and philosophy a cop-out within the information security sphere?

This is a common refrain and frustration I hear across the world of information security and information technology.  Is this true?  Is it the result of personality types that are attracted to these roles?  Is it operational and reporting structure?

In audit, independence and visibility are required. Do not the business (the CIO) and the subject experts (the CISO), who have that visibility, possess a requirement of due care to MAKE it work?

The perfect analogy is the legal department – they NEVER give in and walk away with a mumble; they present their case until all the facts are known and a mutual understanding is reached. Balance happens, but it happens with understanding.

This point is so important to me that it warranted sharing the thought specifically.  I hope we can reframe our approach and, following a TED presentation, focus on the WHY.  (need to find link…sorry)   The individuals in these roles provide the backbone and customer-facing layer of EVERY business.

These are thoughts and realizations made while stumbling around our community, and today during RSA, resulting from presentations with underlying tones.

Always seek,

James DeLuccia

My RSA Conference Notes and perspective – Tuesday AM 2013

Today kicked off, for me, the RSA conference. The best part of these types of events is the onslaught of ideas shared between peers – generally through networking and random encounters in hallways (such as bumping into Bill Brenner). Thanks first off to RSA for creating the forum for these discussions to occur.

I have the privilege of speaking tomorrow, and look forward to the debate and flow of ideas that will ensue.
While reviewing some of the research provided to attendees, I had the following observations, and wanted to share them in entirety for debate and expansion:

Vendor management by procurement SHOULD include data plus asset chain of custody, and #infosec assurance to YOUR standards #RSAC

So basically – costs per breach are up; # of attacks higher; 6 more days to resolve, & the same forms of attack #rsac http://lockerz.com/s/285234702

Aren’t costs per breach – up in 2012 to $8.9 million – the result of our greater leverage of information technology and the resulting value?

Most botnet, malware, & C&C operators manage MORE devices; across WIDER geographies, & generate a positive ROI. How is your information security?

#rsac Art’s presentation was good. Agree with the Taleb perspective, but it must be applied at the Org to match robustness #infosec

Art Coviello gave an impassioned presentation that I thought was very good for a keynote at that level. Typically there is a risk of sales material (which did appear at the end, of course), but there were a couple of good analogies and some smart mental positioning. I thought his analogy to Nassim Taleb’s Antifragile was on point (and funny, since I am a third of the way through it, so it is very fresh in my mind) for security operations against cyber threats. I would expand it, though, to include the business process and the information security compliance program. I have found that the blocking and tackling of information security itself needs to be robust and antifragile. Lacking these elements forfeits the benefits of the threat intelligence he describes.

This is especially poignant to me given the relative lack of volatility in the type of attacks that succeed against organizations, and their ongoing effectiveness in breaching our company defenses.

If you are looking to enjoy the keynotes (I would recommend at least Art and Scott Charney), they are available live or on-demand here.

RSA thoughts and sessions .. to be continued ..

Best,

James DeLuccia

My RSA 2013 Conference Session details


I am looking forward to seeing the world in San Francisco for the RSA Conference this year!  It is always such a rich experience speaking with everyone throughout the week.  I have the privilege of speaking during one of the sessions, and invite all to stop by before and after for greater dialogue.

I am open to all suggestions on new research and new ideas in the ongoing adventure of developing information technology organizations that balance security and compliance.  There is a good deal of interest in managing the complexities of the abstraction of services and in challenging the assumptions of our time.

You can reach me @jdeluccia during the event.

Here is the link to my RSA Conference details.

Always seeking,

James DeLuccia IV

Passwords are Dead, Part II 2nd False Premise – a collaborative research effort, being presented at RSA 2013

The advent of user-created, managed, and handled passwords as the sole means of authenticating is coming to an end. The utility of these was defined in an era based on assumptions about brute-force capability, system computing power, and pro-active security teams.   – After much debate and analysis … there is the thesis


This is Part II of the topic being explored and discussed at my Wednesday session at the RSA Conference in San Francisco (2013).  To see the first thesis and False Premise 1, please see the original post.  Jumping right in – looking forward to more feedback (thanks for the generous emails, but don’t be shy at the comment field below)!

————————————————————————

FALSE PREMISE TWO: Password strength should transcend devices – mobile, tablets (iPad, surface) [Updated 2/12/2013]

MOBILE devices:
What is the intent of the password? To stop high-CPU encryption-cracking systems .. or to prevent inadvertent strangers from accessing the data?  Today we wrap mobile (BYOD-type, if that suits you) systems into the corporate password requirement sphere, and in some cases are being more creative than on other platforms.

For instance, a popular Apple iOS device site recommends using “accent characters for creating a super strong password“. Agreed, these are more difficult to guess, but is that the threat we are seeking to mitigate?  In the space of X characters, how creative must we get?

What are the risks to these mobile devices:

  • Theft
  • Data leakage violating regulatory, contractual, or privacy expectations of customers

If we consider the two threats – theft is not mitigated by the password, as the device will simply be wiped.

[Updated 2/09/13] Data leakage is only possible if the device is ON and the password is guessed before it locks itself permanently – a feature readily available and easily implemented by the end-user, and even more robust with corporate implementation technologies.

  • So in this case, the password only needs to not be one of the top 10 most common phone passwords.  At that point the device locks and can self-wipe.
  • Another scenario is that the password was gleaned through recording / shoulder surfing, or the device was simply left unlocked.  In each case the password strength was not the issue.  Other situations?
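The lockout-and-wipe argument above can be made concrete. A minimal sketch, assuming an iOS-style "erase after 10 failed attempts" policy (the threshold, passcode, and guess list are all illustrative):

```python
# Sketch of failed-attempt lockout/wipe: the password only has to
# survive a handful of guesses before the device destroys the data.
# Threshold and passcodes are illustrative, not any vendor's defaults.

MAX_ATTEMPTS = 10  # e.g., "erase data after 10 failed attempts"

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # data is already gone
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # self-destruct
        return False

device = Device("4821")
common_guesses = ["1234", "0000", "2580", "1111", "5555",
                  "5683", "0852", "2222", "1212", "1998"]
for guess in common_guesses:
    device.try_unlock(guess)
print(device.wiped)  # True: ten wrong guesses triggered the wipe
```

This is why, against an online attacker, the passcode only needs to fall outside the first handful of common guesses; entropy beyond that buys little on the device itself.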

As we move into an ever-mobile, data-everywhere, always-connected scenario, an interesting ecosystem of access & authentication appears – one that requires continued, serious challenge to the assumptions of our security and assurance programs.

Diving in …

Data is mobile – what role does a single password play in accessing sensitive data? Data stored on the device (cloud storage we can address at the integration point below) is at risk from a number of threats:

  • The device can be attacked directly and wirelessly (similar to any other computing device with IP addresses and ports), but this typically requires physical proximity (the simplest path), which is reserved for either random or very targeted attackers.
  • The device can be stolen, and if there is no OS password, the data itself is attacked/accessed directly. An unlocked device makes risk mitigation harder, so a password is the EASIEST safeguard. A password on the data within an application is worthless without some form of self-destruct functionality similar to the OS-level safeguards.

>> Why are passwords WORTHLESS at the application level in this situation?

>>> If the attacker is ON the device (physically or remotely) and our use case is an encrypted database – the attacker can copy that encrypted database to their own system for local attack (easy, and with zero user awareness), or they can attack the database locally via brute force until they get in.

The data is at risk regardless without some form of self-destruct and tremendous levels of assurance related to the encryption of the data(base) itself.
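The offline-copy attack described above is worth seeing in miniature: once the attacker has the ciphertext (or a key hash) on their own hardware, there is no lockout, no wipe, and no user awareness. A toy sketch, using a 4-digit PIN and SHA-256 as stand-ins for a real key-derivation scheme (a slower KDF changes the timescale, not the principle):

```python
# Why an application-level password is weak once the attacker holds a
# copy of the encrypted store: offline guessing is unthrottled.
# The PIN scheme and hash are illustrative stand-ins, not a real KDF.

import hashlib

def derive_key(pin):
    """Toy key derivation: hash the PIN. Real systems use slow KDFs."""
    return hashlib.sha256(pin.encode()).hexdigest()

stolen_key_hash = derive_key("7294")  # copied off the device by the attacker

def offline_brute_force(target_hash):
    """Try every 4-digit PIN; nothing on the attacker's box stops us."""
    for n in range(10000):
        pin = f"{n:04d}"
        if derive_key(pin) == target_hash:
            return pin
    return None

print(offline_brute_force(stolen_key_hash))  # prints "7294"
```

Every protection the device offered (lockout counters, wipe-on-failure) lived on the device; the copied data carries none of it, which is the author's point about needing self-destruct plus strong encryption assurance.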

  • Other thoughts here?
  • What is missing?

Passwords play a significant role at certain tollgates upon the data (when stored on the device), and less of one the more “access” the attacker gets to the underlying system. A common refrain of attackers is: with “physical” access I can break into anything. Today we must treat ALL ACCESS as PHYSICAL when the data is mobile.

Plethora of devices – Today data is accessed from many devices: some owned by corporations, some by end-users, and some by nobody (kiosks). A single password entered into systems allowing single-thread authentication – where NO assurance is understood of the underlying system, and there is no situational awareness of the user presence seeking authentication – results in failed security.

  • The reuse of passwords across devices threatens the confidentiality of the password itself (as much as that matters).
  • The multitude of devices increases the need to redefine what “access” is, and the functions of authorization (I used “functions” instead of “rules” intentionally, to draw attention to the necessity of a broader approach to solving this constraint)

Integration with third party service providers – [to be expanded...]

—————————-

Conclusion – a preview:

  1. “Stationarity is defined as a quality of a process in which the statistical parameters (mean and standard deviation) of the process do not change with time.” – Challis and Kitney, November 1991
  2. Offline Data level authentication – Offline in an ‘always connected’ world

[Disclaimer: First off, this is my research and not anyone else's. Second, the examples above are meant to illustrate technical realities in a reasonably understandable presentation. Let's focus on the problem .. identify weaknesses in the argument, and introduce the mitigation so greatly required in our online world.

I share and seek these answers for the preservation and enhancement for our way of life… as simple as that and I appreciate you being a part of my journey]

Always seek, everything…

James DeLuccia

Twitter: @jdeluccia

Passwords are Dead – a collaborative research effort, being presented at RSA 2013 P1

The advent of user-created, managed, and handled passwords as the sole means of authenticating is coming to an end. The utility of these was defined in an era based on assumptions about brute-force capability, system computing power, and pro-active security teams.   – After much debate and analysis … there is the thesis

This topic came up for me last year as I was working through some large, amorphous business processes. The question of credentials was raised, and we challenged it. This is interesting, as we had some pretty serious brains in the room from the houses of auditing, security, risk, and business leadership. I am sharing my thoughts here to seek input and additional alternate perspectives – seeking more ‘serious brains’.

I will update as feedback comes in … this and other posts will serve as workspaces to share the analysis and perspectives to consider.  I am breaking this topic across different posts to allow for edits and pointed (critical perhaps) feedback on a topic basis.  This is LIVE research, so understand impressions today may change tomorrow based on information and insight. Looking forward to collaborating, and with that … lets jump right in!

————————————————————————

Passwords are designed to restrict access by establishing confirmation that the entity accessing the system is in fact authorized. This is achieved by authenticating that user. Passwords / pass phrases have been the ready, steady tool. The challenges to this once-golden child cross the entire sphere, and I’ll be seeking your collaboration through the journey up to my RSA presentation in SFO at the end of February 2013!

  • False premise one – Passwords are good because they cannot be cracked
  • False premise two – Password strength should transcend devices – mobile, tablets (iPad, surface)
  • False premise three – Password control objectives are disassociated from the origination and intent

FALSE PREMISE ONE: (Updated Jan.31.2013)

  • Passwords are great because they are difficult to break?

The idea here is that users are trained (continuously) to use complex, difficult, long, and unique passwords. The concept was that these attributes made a password difficult to break.

Let’s explore what that meant… When a password was X characters long, using Y varieties of symbols, it would take a computer Z time to break it. Pretty straightforward. (This example is drawn for a password hash being brute-force attacked offline.) The same logic also holds for encryption, but it is based on a poor premise:

  1. Password-cracking CPU cycles for a single machine are far more powerful than yesteryear’s, AND if we focus ONLY on computing power, the use of cloud armies to attack represents the new advantage for the cracking team
  2. Password cracking by comparison has pretty much made the CPU argument (and the length-of-time-to-crack argument) moot. There exist databases FULL of every single password hash (for each type of encryption / hash approach) that can be compared against recovered passwords – think two Excel tables: search for the hash in column A and find the real-world password in column B.
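The "two Excel tables" lookup can be sketched directly: a precomputed table mapping hashes back to plaintexts turns cracking into a dictionary lookup. MD5 and the tiny wordlist below are illustrative stand-ins for the real, enormous cracking databases:

```python
# Sketch of the two-column lookup described above: column A is the
# hash, column B is the plaintext. Built once, reused against any
# recovered hash. MD5 and the wordlist are illustrative stand-ins.

import hashlib

wordlist = ["password", "letmein", "monkey", "dragon", "qwerty"]

# Precomputed lookup table: {hash: plaintext}
lookup = {hashlib.md5(w.encode()).hexdigest(): w for w in wordlist}

# A hash recovered from a breached password file
recovered_hash = hashlib.md5(b"letmein").hexdigest()

print(lookup.get(recovered_hash))  # prints "letmein" – no brute force needed
```

No CPU-time argument applies here at all: however long the password would take to brute force, if its hash is already in the table, recovery is instant. (Salting defeats exactly this precomputation, which is why unsalted hashes are the worst case.)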

Interesting selective supporting facts:

  • A $3000 computer running appropriate algorithms can make 33 billion password guesses every second with a tool such as whitepixel
  • A researcher from Carnegie Mellon developed an algorithm designed for cracking long passwords made up of a combined set of words in a phrase (common best-practice advice) – “Rao’s algorithm makes guesses by combining words and phrases from password-cracking databases into grammatically correct phrases.” This research is being presented in San Antonio at the “Conference on Data and Application Security & Privacy” – New Scientist
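The 33-billion-guesses-per-second figure makes the keyspace arithmetic easy to run. A back-of-the-envelope sketch (the rate is the one quoted above; the password profiles are illustrative):

```python
# Time to exhaust a full keyspace at 33 billion guesses/second,
# the figure quoted for a $3000 machine running whitepixel.

GUESSES_PER_SECOND = 33_000_000_000

def seconds_to_exhaust(alphabet_size, length):
    """Worst-case time to try every password of the given shape."""
    return alphabet_size ** length / GUESSES_PER_SECOND

# 8 lowercase letters: the whole keyspace falls in seconds
print(f"{seconds_to_exhaust(26, 8):.1f} s")    # ≈ 6.3 s

# 8 characters over all 95 printable ASCII symbols: days, not years
print(f"{seconds_to_exhaust(95, 8) / 3600:.0f} h")  # ≈ 56 h
```

This is the sense in which the old "Z time to break it" tables are obsolete: the constants moved by orders of magnitude, so yesterday's "strong" length now falls inside a weekend.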

Humans also pick awful passwords …

  • Based on habit
  • We trend towards the same passwords
  • Based on grammar
  • Our punctuation and writing habits also lend themselves to the identification of passwords

To be continued ….. Part 2 and 3 will be shared soon, looking forward to more collaboration!

Keep seeking, everything.

- James DeLuccia IV

@JDELUCCIA