
Continuous Improvement, Audit, and the Agile 2014 Conference – My Lessons


Agile 2014 Conference session


Every moment we are learning something new. The greatest challenge is to take advantage of this new information and do something substantial – something real – with it.


As an adventurer in the DevOps / Audit space, I have the privilege of evaluating the opportunities, risks, and future directions of many enterprises. The sophistication of these enterprises spans far and wide – from Fortune 20 companies and 700-person agile teams to small startups and even smaller teams of five. These companies have one thing in common: a desire to create a business partnership that will accomplish secure, privacy-minded, and compliant operations. To put it simply, these companies have the passion and rigor to overcome a Big 4 audit.

On Wednesday I spoke at the Agile 2014 conference with esteemed author and innovator Gene Kim. Our session was titled “Keeping the Auditor Away: DevOps Compliance Case Study.” Attendees benefited from an open collaboration and sharing of ideas among the 90 people in the room. A few points resonated with me.

On Leadership:
To lead a product development team requires skill beyond balancing the needs and output of the teams; it requires the talent of connecting development activities to the governance of the business at the highest control level. The ability to serve the customer is only half of the job description. The other half consists of considering internal business stakeholders (internal audit, marketing, information security, and compliance).

On Execution:

  • As soon as an efficient and effective process is identified, automate as much of it as possible
  • Set automated gates throughout the testing process, checked against configurable standards
  • Leverage these application gates with configurable standards to conduct repeatable, verifiable, and scalable operational testing
  • Operational testing must be complete and performed inline
  • Centrally manage versioning of the configurations and deployments
  • The testing executed should reflect internal and external requirements: general security, information security, compliance, development, and audit-designed safeguards
  • The output of this testing and the automated gates should result in hard evidence that can be easily presented during audits (i.e., logs) – a minimal sketch of such a gate follows this list
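
To make the gate concept concrete, below is a minimal sketch of one automated gate in Python. It is an illustration only – the file names, standards, and result fields are invented for this example, not taken from any particular toolchain:

```python
#!/usr/bin/env python3
"""Minimal sketch of an automated pipeline gate (illustrative only)."""
import json
import sys
from datetime import datetime, timezone

# Hypothetical, centrally versioned standards file,
# e.g. {"min_test_pass_rate": 1.0, "max_high_vulns": 0}
with open("gate-standards.json") as f:
    standards = json.load(f)

# Hypothetical results emitted by earlier pipeline stages,
# e.g. {"test_pass_rate": 0.98, "high_vulns": 2}
with open("pipeline-results.json") as f:
    results = json.load(f)

failures = []
if results["test_pass_rate"] < standards["min_test_pass_rate"]:
    failures.append("test pass rate below standard")
if results["high_vulns"] > standards["max_high_vulns"]:
    failures.append("high-severity vulnerabilities exceed standard")

# Hard evidence for auditors: a timestamped, append-only record.
evidence = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "standards": standards,
    "results": results,
    "failures": failures,
    "verdict": "fail" if failures else "pass",
}
with open("gate-evidence.log", "a") as log:
    log.write(json.dumps(evidence) + "\n")

# A non-zero exit code stops the pipeline at this gate.
sys.exit(1 if failures else 0)
```

Because the standards live in a version-controlled file and every verdict is logged, a gate like this is repeatable and verifiable, and it produces exactly the kind of audit evidence described above.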

Startups and enterprises alike have the opportunity to be more secure, deploy better products, and achieve a balance of controls and audit safeguards beyond those of traditional brick-and-mortar development shops. The basic attributes of success are highlighted above. Add exceptional development talent and tight integration with security, compliance, and marketing, and success is easily obtainable!

Thank you to everyone who attended and contributed. It was a truly outstanding experience and I look forward to continuing the collaboration. The slides from our presentation are available here.


To see the shared toolkit that is being developed for DevOps and Auditors – visit our shared public work in progress at http://bit.ly/DevOpsAudit.


A special thank you to Gene Kim and all those in the space who welcome everyone with a passion and desire to be a part of something great.

Best,

James DeLuccia


How do you decide what is Critical vs. Important – Battlefield Leadership series

The Difference Between Critical and Important

The understanding of self and team dynamics is paramount to success in the business world. The definition of success is ‘the achievement of the general objective.’ All too often individuals, teams, and companies lose focus and become distracted during action. Knowing what is important, being able to recognize a distraction, and refocusing resources on what is most critical are the best steps to success under fire.

Hillman Battery

Even today, a walk through Hillman Battery shows the defensive position of the Germans in the immediate path of the British infantry. The Allies’ most critical task was to liberate Caen after the invasion, but the Allied (British) unit became distracted destroying a defensive obstacle and was stalled for an entire day. Ultimately, the Allies were forced to repel counterattacks by the Germans along their flanks, which delayed the liberation of Caen until July.

If you are unaware of this part of D-Day, you can check out Stephen Ambrose’s book D-Day, which provides some rich details.

Business Reflections…

In business the correlation of ‘team’ and ‘self’ is critical. Oftentimes, important resources are lost when the team is disjointed. For example, wasting time (our most valuable resource!) can occur when you lose sight of the bigger picture. Thus, breaking down the big picture and defining what is important to you and your team allows for the clear establishment and allocation of resources.

How does one avoid distractions? How can these be identified, measured, managed, and pushed off? Is the philosophy of saying ‘NO’ to everything but that which serves the ultimate goal valuable? How does one position teams to understand the big picture and their critical objectives? Is a communication chain with choke points necessary, or can these decisions be empowered within the teams?

  • Myself: The ‘big picture’ is being a parent, directly and in the presence of my daughter. My secondary tasks are racing, training, and writing to better myself and others.
  • At Ernst & Young: Our big picture is realizing Vision 2020, the creation of a Better Working World. My teams are constantly seeking to create the best security and compliance programs based on global standards that are realized through the eyes of practitioners.
  • What are yours?


What is Battlefield Leadership and what is this series about … 

This is the fourth paper in this series. As part of my pursuit to learn and grow, I sought out the excellent management training team at Battlefield Leadership. I am professionally leveraging this across multi-million dollar projects I am overseeing (currently I am the lead executive building global compliance and security programs, specifically in the online services / cloud leader space). Personally, I am bringing these lessons to bear within my pursuits to cross the chasm. Too often I see brilliant technical individuals fail to communicate with very smart business leaders and with the common person on the street. My new book, How Not To Be Hacked, seeks to be a first step in bringing deep information security practices beyond the technologist.

Most exciting, the Battlefield group placed this training in Normandy, France. This allowed senior executives to be trained in a setting where serious decisions were made by both sides, each providing a lesson. This series represents my notes (those I could take down) and takeaways. I share them to continue the conversation with the great individuals I met, and with the larger community.

Kind regards,

James

Review – Fmr. CIA Dir. Jim Woolsey warns of existential EMP threat to America

I have been studying first-world worst-case scenarios where cyber and life intertwine, and was recommended this session. It is a panel discussion that included the former CIA Director on the threat of an EMP to U.S. national infrastructure.

Mr. Woolsey takes roughly the first 10 minutes to set the stage, and it is worth listening to in order to anchor why the NERC/FERC CIP requirements, the Executive Order, and the betterment initiatives led by private industry are so important.

It is a bit of an extreme scenario and not something many ‘concern themselves’ with, but it is important to start translating what information security and cyber mean in a tangible fashion. Too often we deal only in probabilities and numbers and forget all else.

Fmr. CIA Dir. Jim Woolsey warns of existential EMP threat to America – YouTube.

Change all your passwords, now.. it is that simple

There are many reasons to change passwords, and in most business settings passwords must be changed every 90 days. This usually applies to end users and rarely to the system-to-system accounts. A recent vulnerability creates the possibility that the password of any account that accesses a system on the internet (specifically one using HTTPS with OpenSSL, but let’s not complicate the clarion call here) is exposed and known by someone other than the owner.

By that very condition the password should be changed, and now.

So if you are a person reading this …

  1. Pull up your accounts and begin methodically changing each password to a fresh new version (there is a condition here that the site you are updating has already fixed the vulnerability and has internally followed good practices, but let’s presume the best scenario here)
  2. Add a note to your calendar for 3–4 months from now to change the passwords again

If you run a technology environment that had OpenSSL installed and was vulnerable, grab a cup of coffee and a sandwich, then…

  1. Begin the methodical (perimeter first, working your way in through the layers) and careful task of updating all of the certificates, credentials, and end-user accounts
  2. Write amazing and clear explanations of the need, value, and importance of this process for your users
  3. Force a password reset for every user account that accesses your services
  4. Log out (invalidate) all app and online cookie sessions (revoke tokens, etc.) – a small sketch of steps 3 and 4 follows this list
  5. Reissue your private key and SSL certificate
  6. Review and examine your API and third-party connections to confirm these are updated, reset, and secured
  7. Add a bit of extra monitoring on the logs for a while
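
For steps 3 and 4, here is a minimal, self-contained sketch of the idea in Python. The in-memory stores and field names are invented for illustration; a real environment would apply the same two writes against its own user database and session backend:

```python
"""Illustrative sketch of steps 3 and 4: force password resets and
invalidate live sessions. All names and data here are invented."""
from datetime import datetime, timezone

# Hypothetical in-memory stores standing in for a real user database
# and session/token backend.
users = {
    "alice": {"must_reset_password": False},
    "bob": {"must_reset_password": False},
}
sessions = {
    "alice": ["cookie-1", "api-token-2"],
    "bob": ["cookie-3"],
}

forced_at = datetime.now(timezone.utc).isoformat()

for username, account in users.items():
    # Step 3: flag the account so the next login demands a new password.
    account["must_reset_password"] = True
    account["reset_forced_at"] = forced_at
    # Step 4: revoke every live cookie, app session, and token.
    sessions[username] = []

print(users)
print(sessions)
```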

This is all the result of the Heartbleed.com disclosure, but let’s not get technical here .. these are good practices, and now that the probability is above ‘unlikely’, it is a timely habit to re-embrace.


Stay safe,


James

Big Data is in its early maturity stages, and could learn greatly from infosec, re: the Google Flu Trends failure

The concept of analysing large data sets, crossing data sets, and seeking the emergence of new insights and better clarity is the constant pursuit of Big Data. Given the volume of data being produced by people and computing systems, stored, and now available for analysis, there are many possible applications that have not yet been designed.

The challenge with any new ‘science’ is that the path from concept to application is not always a straight line, or a line that ends where you were hoping. The implications for businesses using this technology, as with information security, require an understanding of its possibilities and weaknesses. False positives and exaggerations were a problem in information security’s past, and now the problem seems almost understated.

An article from Harvard Business Review details how the Google Flu Trends project failed in 100 out of 108 comparable periods. The article is worth a read, but I wanted to highlight two sections below as they relate to business leadership.

The quote picks up where the author is speaking about the problem of the model:

“The first sign of trouble emerged in 2009, shortly after GFT launched, when it completely missed the swine flu pandemic… it’s been wrong since August 2011. The Science article further points out that a simplistic forecasting model—a model as basic as one that predicts the temperature by looking at recent-past temperatures—would have forecasted flu better than GFT.”

So in this analysis both the model and the Big Data source were inaccurate. There are many cases where such events occur; if you have ever followed the financial markets and their predictions, you have seen them wrong more often than right. In fact, it is a psychological flaw (habit) that we as humans zero in not on the predictions that were wrong, but on those that were right. This is a risky proposition in anything, and it is important for us in business to focus on the causes of such weakness and not be distracted by false positives or convenient answers.
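
To make the “simplistic forecasting model” comparison concrete, here is a minimal sketch of such a lag-based baseline in Python – the weekly flu-activity numbers are invented for illustration:

```python
"""Sketch of the naive baseline the Science article describes: predict
this period from recent-past observations. Data below is invented."""

# Hypothetical weekly flu-activity index values.
observed = [2.1, 2.4, 2.9, 3.5, 3.1, 2.8, 2.6]

# Baseline: forecast each week as the mean of the two prior weeks.
forecasts = [(observed[i - 2] + observed[i - 1]) / 2
             for i in range(2, len(observed))]

for week, (actual, forecast) in enumerate(zip(observed[2:], forecasts), start=3):
    print(f"week {week}: forecast={forecast:.2f} "
          f"actual={actual:.2f} error={forecast - actual:+.2f}")
```

The point is not that such a baseline is good, but that, per the article, even a model this trivial would have forecasted flu better than GFT.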

The article follows up the above conclusion with this statement relating to the result:

“In fact, GFT’s poor track record is hardly a secret to big data and GFT followers like me, and it points to a little bit of a big problem in the big data business that many of us have been discussing: Data validity is being consistently overstated. As the Harvard researchers warn: ‘The core challenge is that most big data that have received popular attention are not the output of instruments designed to produce valid and reliable data amenable for scientific analysis.’”

The quality of the data is blamed here as being at fault, and I would challenge that ..

The analogy is from information security, where false positives and similar problems were awful in the beginning and have become much better over time. The key data inputs and the analysis within information security come from sources that are commonly uncontrolled and certainly not the most reliable for scientific analysis. We live in a (data) dirty world, where systems behave uniquely to each person interfacing with them.

We must continue to develop tolerances in our analysis within big data and in the systems we use to seek benefit from it. This must be balanced with criticism, to ensure that the sources and results are true and not an anomaly.

Of course, the counterargument .. could be: if the recommendation is to learn from information security because it has had to live in a dirty-data world, should information security instead be focusing on creating “instruments designed to produce valid and reliable data amenable for scientific analysis”? Has this already occurred? At every system component?

A grand adventure,

James