Tag Archives: information security

Review – Fmr. CIA Dir. Jim Woolsey warns of existential EMP threat to America

I have been studying First World worst-case scenarios where cyber and life intertwine, and this session was recommended to me. It is a panel discussion, including the former CIA Director, on the threat of EMP to U.S. national infrastructure.

Mr. Woolsey takes roughly the first 10 minutes to set the stage, and it is worth listening to in order to anchor why the NERC/FERC CIP, the Executive Order, and the betterment initiatives led by private industry are so important.

This is a bit of an extreme scenario and not something many 'concern themselves' with, but it is important to start translating what information security and cyber mean in a tangible fashion. Too often we deal only in probabilities and numbers and forget all else.

Fmr. CIA Dir. Jim Woolsey warns of existential EMP threat to America – YouTube.

Change all your passwords, now.. it is that simple

There are many reasons to change passwords, and in most business settings passwords are required to be changed every 90 days. This usually applies to end users and rarely to system-to-system accounts. A recent vulnerability creates the possibility that any account that accesses a system on the internet (specifically one using HTTPS with OpenSSL, but let's not complicate the clarion call here) is exposed and known by someone other than the owner.

By that very condition, the password should be changed, and now.

So if you are a person reading this …

  1. Pull up your accounts and begin methodically changing each password to a fresh new one (this assumes the site you are updating has already fixed the vulnerability and has internally followed good practices, but let's presume the best scenario here)
  2. Add a note to your calendar 3-4 months from now to change the passwords again

If you run a technology environment that had OpenSSL installed and was vulnerable, grab a cup of coffee and a sandwich, then…

  1. Begin the methodical (perimeter first, working your way in through the layers) and careful task of updating all of the certificates, credentials, and system-to-system accounts. Consider end-user accounts too.
  2. Write amazing and clear explanations of the need, value, and importance of this process for your users
  3. Force a password reset on all user accounts that access your services.
  4. Log out all apps and online sessions (invalidate and revoke session cookies, etc.)
  5. Reissue your private key and SSL certificate, revoking the old ones (a quick check for certificates that still predate the disclosure is sketched after this list)
  6. Review and examine your API and third-party connections to confirm these are updated, reset, and secured
  7. Add extra monitoring on the logs for a while
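To help with step 5, here is a minimal sketch in Python (my illustration, assuming only the standard library) that flags hosts whose certificates predate the Heartbleed disclosure date and therefore likely still need reissue; the host list is a placeholder for your own perimeter inventory.

```python
import socket
import ssl
from datetime import datetime, timezone

# Heartbleed was publicly disclosed on 2014-04-07; a certificate served by a
# previously vulnerable host that predates this date was exposed and should
# be reissued.
DISCLOSURE = datetime(2014, 4, 7, tzinfo=timezone.utc)

def cert_needs_reissue(host, port=443):
    """Return True if the server certificate was issued before the disclosure."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # parsed certificate fields
    issued = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notBefore"]), tz=timezone.utc)
    return issued < DISCLOSURE

for host in ["www.example.com"]:  # replace with your perimeter hosts
    print(host, "needs reissue:", cert_needs_reissue(host))
```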

This is all the result of the Heartbleed.com disclosure, but let's not get technical here .. these are good practices regardless, and now that the probability of exposure is above 'unlikely', it is a timely habit to re-embrace.


Stay safe,


James

Big Data is in its early maturity stages, and could learn greatly from Infosec, re: the Google Flu Trends failure

The concept of analysing large data sets, crossing data sets, and seeking the emergence of new insights and better clarity is the constant pursuit of Big Data. Given the volume of data being produced by people and computing systems, stored, and now available for analysis, there are many possible applications that have yet to be designed.

The challenge with any new 'science' is that the path from concept to application is not always a straight line, or a line that ends where you were hoping. Using this technology in business, like using information security, requires an understanding of its possibilities and weaknesses. False positives and exaggerations were a problem in information security's past, and now the problem seems almost understated.

An article from Harvard Business Review details how the Google Flu Trends project failed in 100 out of 108 comparable periods. The article is worth a read, but I wanted to highlight two sections below as they relate to business leadership.

The quote picks up where the author is speaking about the problem of the model:

“The first sign of trouble emerged in 2009, shortly after GFT launched, when it completely missed the swine flu pandemic… it’s been wrong since August 2011. The Science article further points out that a simplistic forecasting model—a model as basic as one that predicts the temperature by looking at recent-past temperatures—would have forecasted flu better than GFT.”

So in this analysis both the model and the Big Data source were inaccurate. There are many cases where such events occur; if you have ever followed the financial markets and their predictions, you see them more often wrong than right. In fact, it is a psychological flaw where we as humans zero in on the predictions that proved right rather than those that proved wrong. This is a risky proposition in anything, and it is important for us in business to focus on the causes of such weakness and not be distracted by false positives or convenient answers.
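The “simplistic forecasting model” from the quote is easy to make concrete. Below is a minimal sketch of such a lag-based baseline in Python, run on synthetic weekly flu-activity numbers (invented for illustration; this is not CDC or GFT data):

```python
import random

random.seed(1)
# Synthetic weekly flu-activity index: a seasonal bump plus noise.
flu = [100 + (60 if week % 52 < 12 else 0) + random.gauss(0, 8)
       for week in range(108)]

def lag_baseline(series, week, window=3):
    """Predict this week as the mean of the last `window` observed weeks."""
    past = series[week - window:week]
    return sum(past) / len(past)

errors = [abs(lag_baseline(flu, w) - flu[w]) for w in range(3, len(flu))]
print(f"naive baseline mean absolute error: {sum(errors) / len(errors):.1f}")
```

The point made in Science is that GFT failed to beat even a baseline this simple.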

The article follows up the above conclusion with this statement relating to the result:

“In fact, GFT’s poor track record is hardly a secret to big data and GFT followers like me, and it points to a little bit of a big problem in the big data business that many of us have been discussing: Data validity is being consistently overstated. As the Harvard researchers warn: “The core challenge is that most big data that have received popular attention are not the output of instruments designed to produce valid and reliable data amenable for scientific analysis.”

The quality of the data is challenged here as being at fault, and I would challenge that ..

The analogy is from information security, where false positives and similar problems were awful in the beginning and have become much better over time. The key data inputs and analysis within information security come from sources that are commonly uncontrolled and certainly not the most reliable for scientific analysis. We live in a (data) dirty world, where systems behave uniquely to the person interfacing with them.

We must continue to develop tolerances in our big data analysis and in the systems we use to seek benefit from it. This must be balanced with criticism to ensure that the source and results are true, and not an anomaly.

Of course, the counter-argument could be: if the recommendation is to learn from information security because it has had to live in a dirty-data world, should information security instead be focusing on creating “instruments designed to produce valid and reliable data amenable for scientific analysis”? Has this already occurred? At every system component?

A grand adventure,

James


How to determine how much money to spend on security…

A question that many organizations struggle with is how much money is appropriate to spend per user, per year on information security. While balancing security, privacy, usability, profitability, compliance, and sustainability is an art, organizations have a new data point to consider.

Balancing – information security and compliance operations

The ideal approach that businesses take must always be based on internal and external factors weighted against the risks to their assets (assets in this case generally include customers, staff, technology, data, and the physical environment). An annual review identifying and quantifying the importance of these assets is a key regular exercise with product leadership; an analysis of the factors that influence those assets can then be completed.

Internal and external factors include a number of possibilities, but key ones that rise to importance for business typically include:

  1. Contractual commitments to customers, partners, vendors, and operating-region governments (regulation)
  2. Market demands (activities necessary to match the market expectations to be competitive)

Based upon the quantitative analysis above, safeguards and practices may be deployed, adjusted, and removed, both in aggregate and distributed across the business. Understanding the economic impact of the assets, and of the tributary assets and business functions that enable the business to deliver services and products to market, allows for a deeper analysis. I find the rate of these adjustments depends on the business's industry and product cycle, and is influenced by operating events. At the most relaxed cadence, these would happen over a three-year cycle, with minor annual analysis conducted across the business.
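To make the quantitative side concrete, here is a minimal sketch using a standard annualized-loss-expectancy style calculation (my illustration, not a method prescribed above); the asset names, values, and likelihoods are all hypothetical:

```python
# asset: (value in USD, exposure factor per incident, incidents per year)
assets = {
    "customer PII database": (5_000_000, 0.6, 0.10),
    "payment platform":      (8_000_000, 0.4, 0.05),
    "field laptops":         (1_200_000, 0.2, 0.30),
}

for name, (value, exposure, rate) in assets.items():
    sle = value * exposure  # single loss expectancy per incident
    ale = sle * rate        # annualized loss expectancy
    print(f"{name}: expected annual loss ${ale:,.0f}")
```

Annual safeguard spend above an asset's expected annual loss is hard to justify economically, which is one way to distribute a budget across assets during the review.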

Mature organizations would continue a cycle of improvement (note: improvement does not mean more $$ or more security/regulation, but improvement based on the internal and external factors, and I certainly see it ebbing and flowing).

Court settlement that impacts the analysis and balance for information security & compliance:

Organizations historically had to rely on surveys and tea-leaf readings of financial reports where the costs of data breaches and FTC penalties were detailed. These collections of figures put the cost of a data breach anywhere between $90 and $190 per user. Depending on the need, other organizations would baseline cost figures against peers (i.e., do we all have the same number of security staff; what percentage of revenue is spent, etc.).

As a result of a recent court case, I envision the figures below being joined into the above analysis. It is important to consider a few factors here:

  1. The data was considered sensitive (an argument that could easily extend to general Personally Identifiable Information, or PII)
  2. There was a commitment to secure the data by the provider (a common statement in many businesses today)
  3. The customers paid a fee to be with the service provider (premiums, annual credit card fees, etc. all seem very similar to this case)
  4. Both those that suffered damages and those that did not were included within the settlement

The details of the court case:

The parties' dispute dates back to December 2010, when Curry and Moore sued AvMed in the wake of the 2009 theft of two unencrypted laptops containing the names, health information and Social Security numbers of as many as 1.2 million AvMed members.

The plaintiffs alleged the company's failure to implement and follow “basic security procedures” led to plaintiffs' sensitive information falling “in the hands of thieves.” – Law360

A settlement at the end of 2013 provides a fresh new input:

“Class members who bought health insurance from AvMed can make claims from the settlement fund for $10 for each year they bought insurance, up to a $30 cap, according to the motion. Those who suffered identity theft will be able to make claims to recover their losses.”

For businesses conducting their regular analysis, this settlement is important because of the math applied here:

$10 x (# of years as a client) x (# of clients) = damages .. PLUS all of the required upgrades and the actual damages impacting the customers.
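As a worked example of that math, here is a minimal sketch; the $10-per-year figure and the $30 cap come from the settlement above, while the client counts are hypothetical:

```python
PER_YEAR = 10  # $10 per year the client bought insurance (from the settlement)
CAP = 30       # capped at $30 per class member (from the settlement)

def baseline_exposure(clients_by_tenure):
    """Sum min($10 x years, $30) over clients, grouped by years of tenure."""
    return sum(min(PER_YEAR * years, CAP) * count
               for years, count in clients_by_tenure.items())

# Hypothetical book of business: 400k one-year, 500k two-year,
# and 300k three-plus-year clients.
exposure = baseline_exposure({1: 400_000, 2: 500_000, 3: 300_000})
print(f"baseline settlement exposure: ${exposure:,}")  # $23,000,000
```

Actual identity-theft damages and the required security upgrades are additive to this baseline figure.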

Finally

Businesses should update their financial analysis with the figures and situational factors of this court case. In some cases this will reduce budgets; in others, where service providers have similar models and data, better security will be needed.

As always, the key is regular analysis against the internal and external factors, staying nimble and adaptive to the ever-changing environment. While balancing these external factors, extra vigilance is needed to ensure the internal asset needs are being satisfied and remain correct (as businesses shift to cloud service providers and through partnering, the asset assumptions change .. frequently .. and without any TPS memo).

Best,

James


Tactical Issue: How to handle Executive Assistants and #infosec

Problem Statement: How have you seen companies handle executive assistants' access to C-level and VP accounts? Our executives rely heavily on their admins but don't realize the risk when we move to single sign-on.

How does this apply to you?

As organizations grow and expand, there is sensitivity around access to data, and especially if a business is in M&A mode, there is much higher sensitivity at the executive level. Data protection and the limitation of access depend on the specific instance.

If an organization, such as the one in the question above, allows (and most do) admins/executive assistants to access senior leadership files, then what do you do?

  1. Trust explicitly: same credentials and access as the executives they represent
  2. Trust per instance: same credentials, but with specific 'special handling protocols' instituted for items that are too sensitive
  3. No trust: this is unlikely to succeed unless there are no admins, given that the sneaker-net still works, beyond the many other cultural and personnel issues at play here

Solution Concepts:

There are many ways to approach this problem statement, but here are a few responses to each of the above (I'll reference each bullet number above for simplicity):

  1. Admins/executive assistants go through the same background security vetting as their assigned executives, and the systems themselves receive escalated monitoring. Essentially deep background checks, ongoing personnel monitoring, and better system security for the end-user devices.
  2. By far the easiest: special handling protocols for executives would mean introducing secure platforms, encrypted containers, electronic document handling authenticated to specific systems, even project code names, etc. (a rough sketch follows this list)
  3. These do happen, but they definitely require a culture that accepts the extreme (social) firewalling of discussions and work. Not appropriate for many organizations today.
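As a rough sketch of option 2, special handling can be as simple as label-based checks layered on top of the assistant's normal delegated access. The labels, roles, and policy below are hypothetical, for illustration only:

```python
# Items carrying any of these labels require the executive's own credentials.
SPECIAL_HANDLING = {"m&a", "board", "legal-hold"}

def may_open(document_labels, actor_role):
    """Allow assistants everything except special-handling material."""
    if actor_role == "executive":
        return True
    if actor_role == "assistant":
        return not (document_labels & SPECIAL_HANDLING)
    return False

print(may_open({"travel", "expenses"}, "assistant"))  # True
print(may_open({"m&a", "term-sheet"}, "assistant"))   # False
```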

Final Thoughts:

I spend most of my time designing, implementing, and operating global security programs for businesses… so this tactical question was fun to receive. Working in the details is where life happens, and it is a proof point for many innovations. Smashing together technology, process, and people is an art .. a journey .. and always unique.

Hope this helps.

James

My RSA Conference Notes and perspective – Tuesday AM 2013

Today kicked off, for me, the RSA conference. The best part of these types of events is the onslaught of ideas shared between peers – generally through networking and random encounters in hallways (such as bumping into Bill Brenner). Thanks first off to RSA for creating the forum for these discussions to occur.

I have the privilege of speaking tomorrow, and look forward to the debate and flow of ideas that will ensue.
While reviewing some of the research provided to attendees, I had the following observations and wanted to share them in their entirety for debate and expansion:

Vendor management by procurement SHOULD include data plus asset chain of custody, and #infosec assurance to YOUR standards #RSAC

So basically – costs per breach are up; # of attacks higher; 6 more days to resolve, & the same forms of attack #rsac http://lockerz.com/s/285234702

Aren’t costs per breach, up to $8.9 million in 2012, the result of our greater leverage of information technology & the resulting value?

Most botnet, malware, & C&C operators manage MORE devices; across WIDER geographies, & generate a positive ROI. How is your information security?

#rsac Art’s presentation was good. Agree with the Taleb perspective, but it must be applied at the Org to match robustness #infosec

Art Coviello gave an impassioned presentation that I thought was very good for a keynote at that level. Typically there is a risk of sales material (which did appear at the end, of course), but he offered a couple of good analogies and solid mental positioning. I thought his analogy to Nassim Taleb's Antifragile was on point (and funny, since I am a third of the way through it, so it is very fresh in my mind) for security operations against cyber threats. I would expand it, though, to include the business process and the information security compliance program. I have found that the blocking and tackling of information security itself needs to be robust and antifragile. Lacking these elements forfeits the benefits of the threat intelligence he describes.

This is especially poignant to me given the relative lack of volatility in the types of attacks that succeed against organizations, and their ongoing effectiveness in breaching our company defenses.

If you are looking to enjoy the keynotes (I would recommend at least Art's and Scott Charney's), you can watch them live or on-demand here.

RSA thoughts and sessions .. to be continued ..

Best,

James DeLuccia

My RSA 2013 Conference Session details


I am looking forward to seeing the world in San Francisco for the RSA Conference this year!  It is always such a rich experience speaking with everyone throughout the week.  I have the privilege of speaking during one of the sessions, and invite all to stop by before and after for greater dialogue.

I am open to all suggestions on new research and new ideas in the ongoing adventure of developing information technology organizations that balance security and compliance. There is a good deal of interest in managing the complexities of the abstraction of services and in challenging the assumptions of our time.

You can reach me @jdeluccia during the event.

Here is the link to my RSA Conference details.

Always seeking,

James DeLuccia IV