Tag Archives: audit

Copying access control cards is easier with a $10 device being released at BlackHat 2015

Proximity access cards are no more secure than a standard key, and they are easily replicated with a (soon-to-be-released) $10 tool. This was shared on ZDNet and Motherboard. I have highlighted two key sections below; for those interested in greater detail, definitely check out the articles. If you are lucky enough to see the presentation live at BlackHat, that will surely be better.

While RFID technology can help secure enterprise offices in this way, the ease with which these access controls can be hacked has hit the spotlight in the form of a tiny device which costs only $10 to make.

Researchers Mark Baseggio from security firm Accuvant and Eric Evenchick from Faraday Future are the developers of the Bluetooth Low Energy device (BLEKey), a coin-sized device which skims RFID cards, allowing users to clone items such as access cards. The team says the release of the tool is “valuable for understanding the risks associated with insecure access controls and what steps companies can take to lower the risk of access control attacks.” – ZDNet Article

I would raise the point that these attacks can now be done so easily that the physical “control” of access control can no longer be fully trusted – from a third-party assurance perspective, an industry perspective such as PCI, or a risk management perspective. One could argue that cameras support this protection, but those are only employed after damage has been discovered, and they are insufficient for all of the stakeholders involved.

“We wanted to create a device that would concretely and absolutely show and hopefully put the final nail in the coffin that is HID prox and Wiegand. These devices are no more secure than a standard key.” – Motherboard, Baseggio

The difference with a ‘standard key’, though, is that it takes some crafty spy work to make a copy without the owner being aware. Copying a HID card would take only seconds – at a gym, from a lanyard left at a desk, etc.

Glad the research cycle is exposing these risks, and I am looking forward to creative approaches to counter them.


p.s. My new book – How not to be hacked is available and is PERFECT for your family and friends who keep getting smashed by online criminals, malware, and account hijacks!

Continuous Improvement, Audit, and the Agile 2014 Conference .. My lessons


Agile 2014 Conference session

Every moment we are learning something new. The greatest challenge is to take advantage of this new information and do something substantial – something real – with it.

As an adventureman in the DevOps / Audit space, I have the privilege of evaluating the opportunities, risks, and future directions for many enterprises. The sophistication of these enterprises spans far and wide – from Fortune 20 companies and 700-person agile teams to small startups and even smaller teams of five. These companies have one thing in common: a desire to create a business partnership that will accomplish secure, privacy-minded, and compliant operations. To put it simply, these companies have the passion and rigor to overcome a Big 4 audit.

On Wednesday I spoke at the Agile 2014 conference with esteemed author and innovator Gene Kim. Our session title was Keeping the Auditor Away: DevOps Compliance Case Study. Attendees at this lecture benefited from a 90-person open collaboration and sharing of ideas. A few points resonated with me.

On Leadership:
To lead a product development team requires skill beyond balancing the needs and output of the teams; it requires the talent of connecting the development activities to the governance of the business at the highest control level. The ability to serve the customer is only half of the job description. The other half consists of considering internal business stakeholders (internal auditing, marketing, information security, and compliance procedures).

On Execution:

  • As soon as an efficient and effective process is identified, automate as much of it as possible
  • Automatically set gates throughout the testing process, against the configurable standards
  • Leverage the application gates with configurable standards to conduct repeatable, verifiable, and scalable operational testing
  • Operational testing must include complete and inline testing
  • Centrally manage versioning of the configurations and deployments
  • The testing executed should reflect internal and external requirements, general security, information security, compliance, development, and audit designed safeguards
  • The output of this testing and the automated gates should result in hard evidence that can be easily presented during audits (ie: logs)
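The gate-and-evidence pattern in these bullets can be sketched in a few lines. This is a minimal illustration, not anyone's production pipeline: the gate names, thresholds, and results below are all invented, and real standards would live in a centrally managed, versioned configuration store rather than in the script itself.

```python
import json
import sys
import time

# Hypothetical configurable standards -- in practice pulled from a
# centrally versioned configuration store, not hard-coded here.
STANDARDS = {
    "static_analysis_high_findings": 0,  # ceiling: no high-severity findings
    "dependency_known_cves": 0,          # ceiling: no known CVEs
    "test_coverage_pct": 80,             # floor: minimum coverage
}

# Illustrative results, as if produced by earlier pipeline stages.
RESULTS = {
    "static_analysis_high_findings": 0,
    "dependency_known_cves": 0,
    "test_coverage_pct": 91,
}

def evaluate_gates(standards, results):
    """Return a list of (gate, threshold, actual) for every failed gate."""
    failures = []
    for gate, threshold in standards.items():
        actual = results[gate]
        # "_pct" gates are floors; finding counts are ceilings.
        ok = actual >= threshold if gate.endswith("_pct") else actual <= threshold
        if not ok:
            failures.append((gate, threshold, actual))
    return failures

failures = evaluate_gates(STANDARDS, RESULTS)

# Emit one timestamped log line per run -- this is the hard evidence
# (per the last bullet) that can be presented during audits.
evidence = {
    "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "results": RESULTS,
    "failures": failures,
    "passed": not failures,
}
print(json.dumps(evidence))

if failures:
    sys.exit(1)  # block the release: the gate failed
```

Because the gate logic and the standards are both in version control, every run is repeatable and verifiable, and the emitted log line doubles as audit evidence.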

Startups and enterprises alike have the opportunity to be more secure, deploy better products, and achieve a balance of controls and audit safeguards beyond those of traditional brick-and-mortar development shops. The basic attributes of success are highlighted above. Add some extreme talent in development, integrated with security, compliance, and marketing, and success is easily obtainable!

Thank you to everyone who attended and contributed. It was a truly outstanding experience and I look forward to continuing the collaboration. The slides from our presentation are available here.

To see the shared toolkit that is being developed for DevOps and Auditors – visit our shared public work in progress at http://bit.ly/DevOpsAudit.

A special thank you to Gene Kim and all those in the space who welcome everyone with a passion and desire to be a part of something great.


James DeLuccia



How to do DevOps – with security not as a bottleneck

As on any good morning, I read a nice article written by George Hulme that got me thinking on this topic; that led to a discussion with colleagues in the Atlanta office, and resulted in me drawing crazy diagrams on my iPad trying to explain sequencing. Below I share my initial thoughts and diagrams for consumption and critique, to improve the idea.

Problem statement: Is security a bottleneck to development – and likely more so in a continuous delivery culture?

Traditional development cycles look like this …

  1. A massive amount of innovation and effort occurs by developers
  2. Once everything works to spec, it is sent to security for release to Ops
  3. In most cases security “fights”, and in a few cases fails the release and asks developers to patch (patching itself implies a fix, not a real solution), and then
  4. a final push through security to Ops
There are many problems here, but to tackle the first myth – security is a bottleneck here because that is how it is structurally placed in the development cycle.
On a time basis (man-days, duration, level of work), security is barely even present on the product develop-to-deploy timeline – this is akin to noting that man has been on Earth for a long time, yet that is a mistake when taken relative to the age of the planet … but I digress.
Solution: In a continuous delivery environment – iterate security cycles

As Mr. Hulme highlighted in the article, integration of information security with development through automation will certainly help scale #infosec tasks, but there is more. Integrate through rapid iterations and a feedback loop (note the attempt at coloring the feedback from infosec & ops, joined and consistent with the in-flight development areas).
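To make the iteration idea concrete, here is a toy sketch of that feedback loop: instead of one security review at the end of the cycle, every in-flight change triggers the same automated checks, and findings flow straight back to the work item. The check, the change records, and the "hard-coded credential" rule are all invented for illustration.

```python
# Toy per-iteration security feedback loop (all data invented).

def security_checks(change):
    """Stand-in for automated scans (SAST, secret scanning, dependency checks)."""
    findings = []
    if "password=" in change["diff"]:
        findings.append("hard-coded credential")
    return findings

def integrate(changes):
    """Run security checks against every in-flight change, not a final batch."""
    feedback = {}
    for change in changes:
        findings = security_checks(change)
        if findings:
            # Feedback returns to the specific work item while the
            # context is still fresh for the developer.
            feedback[change["id"]] = findings
    return feedback

changes = [
    {"id": "c1", "diff": "refactor auth module"},
    {"id": "c2", "diff": "debug: password=hunter2"},
]
print(integrate(changes))  # only c2 comes back with a finding
```

The structural point is the placement: the same checks that once formed a gate at the end of the cycle now run inside each iteration, so security stops being the last stop before Ops.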

While high level, I find as I work with leadership within organizations that clearly communicating and breaking out the benefits – to their security posture, their ability to hold market launch dates, and the clarity of their technology attestations – is equally as important as the code base itself. Awareness and comprehension of the heavy work being done by developers, security, Ops, and compliance audit teams allows leadership to provide appropriate funding, resources, governance, monitoring, and timelines (time is always the greatest gift).

How have I developed my viewpoint?

I have been spending an increasing amount of time these past few years working with service provider organizations and the global F100. The common thread I am finding is the accelerating dependency on third-party providers (Cloud, BPO, integrated operators, etc.), and in the process I have had an interesting role with continuous delivery and high-deploy partners. Specifically, my teams have audited and implemented global security compliance programs for those running these high-deploy environments, and sought to establish a level of assurance (from the public accounting audit perspective) and security (actually being secure, with better sustainable operations).




What do major developments in big data, cloud, mobile, and social media mean? A CISO perspective..


Tuesday afternoon the CISO-T18 session – Mega-Trends in Information Risk Management for 2013 and Beyond: CISO Views – focused on the results of a survey sponsored by RSA (link below).  It provided a backdrop for some good conversation, but more so it gave me a nice environment to elaborate on some personal observations and ideas.  The first tweet I sent hammered the main slide:

“Major developments with Big Data, Cloud, Mobile, and Social media” – the context and reality here is cavernous.

My analysis and near-random break down of this tweet are as follows with quotes pulled from the panel.

First off – be aware that these key phrases / buzzwords mean different things to different departments and at each level (from strategic executives through tactical teams). Big Data analytics may not be a back-end operational pursuit but a revenue-generating front-end activity (such as executed by WalMart). These different instantiations are likely happening at different levels with varied visibility across the organization.

“Owning” the IT infrastructure is not a control to prevent the different groups from launching into these other ‘major developments’.

The cost effectiveness of the platforms designed to serve businesses (i.e., Heroku, Puppet Labs, AWS, etc…) is what is defining the new cost structure. CIO and CISO must

“The cloud is not cheaper if it does not have any controls. This creates a risk of the data being lost due to ‘no controls’” – highlighted by Melanie from the panel.  <– I don’t believe this statement is generally true; it is generally FUD.

Specifically – there is a service-level expectation that cloud service providers compensate for the lack of auditability of those “controls”. There are motions to provide a level of assurance for these cloud providers beyond the ancient method established through the ‘right to audit‘.

A method of approaching these challenging trends, specifically Big Data, is highlighted below by one of the CISOs (apologies, I missed his name), with my additions:

  • Data flow mapping is a key to providing efficient and positive ‘build it’ product development. It helps understand what matters (to support and have it operational), but also see if anything is breaking as a result.
  • Breaking = violating a contract, breaking a compliance requirement, or negatively affecting other systems and user requirements.
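One way to operationalize the data-flow-mapping bullet above is to record flows as edges and check them against contract and compliance constraints before anything breaks in production. This is a toy sketch: the system names, data classes, and restrictions are all invented.

```python
# A minimal data-flow map: each edge records which system sends which
# data class to which system (all names illustrative).
FLOWS = [
    {"src": "checkout", "dst": "analytics", "data": "order_totals"},
    {"src": "checkout", "dst": "marketing", "data": "cardholder_data"},
]

# Data classes that contracts or compliance requirements restrict,
# mapped to the set of systems allowed to receive them.
RESTRICTED = {"cardholder_data": {"payment_processor"}}

def breaking_flows(flows, restricted):
    """Return every flow that sends restricted data to a non-approved system."""
    return [
        f for f in flows
        if f["data"] in restricted and f["dst"] not in restricted[f["data"]]
    ]

for f in breaking_flows(FLOWS, RESTRICTED):
    print(f"{f['src']} -> {f['dst']}: restricted data '{f['data']}'")
```

Kept current, the same map shows what matters operationally (which flows must stay up) and flags a "breaking" flow – a contract or compliance violation – as soon as a new edge is proposed.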

Getting things Done – the CISO 

Two observations impacting the CISO and information technology organization include:

  1. The Board is starting to become aware and seeking to see how information security is woven within ERM
  2. Budgets are not getting bigger, and likely shrinking due to expectations of productivity gains / efficiency / cloud / etc…

Rationalization of direction, controls, and security responses must be fast, for making decisions and executing…

Your ability to get things done has little to do with YOU doing things, but with getting others to do things. Enabling, partnering, and teaming is what makes the business move. CIO and CISO must create positive build-it inertia.

Support and partner with the “middle management” – the API of the business, if you will.

  • We too often focus on “getting to the board” and on deploying / securing the “end points”. Those end points are the USERS, and between them and the Board is your API for achieving your personal objectives.

Vendor Management vs procurement of yester-year

Acquiring technology and services must be done through a renewed and redeveloped vendor management program. The current procurement team’s competencies are inadequate, and it lacks the toolsets to ensure these providers are meeting the existing threats. To be a risk-adaptive organization you must tackle these vendors with renewed rigor. Buying the cheapest parts and services today does not mean what it meant 10 years ago. Today the copied, reverse-engineered Cisco router alternative comes with an impressive number of problems immediately after acquisition. Buying is easy – it is the operational continuance that is difficult. This is highlighted by the 10,000+ vulnerabilities that exist in networked devices that will never be updated within corporations, whose risks must be mitigated at a very high and constant cost.

Panel referenced the following report:

Thank you to the panel for helping create a space to think and seek answers, or at least more questions!

James DeLuccia IV

Top 3 attributes for businesses to benefit from Data Analytics – an Information Security & Business process perspective

Big Data introduces an opportunity that organizations see when merging siloed product operations together, forming a service layer or an enhanced hybrid product. Big Data also requires exceptional enterprise intelligence from the perspective of establishing the scaffolding for enterprise growth. That scaffolding requires advanced visibility into the information technology systems and business process matrix.  My thesis … let me elaborate below on a single thread here, given this is a subject I have been developing recently…

In order for Big Data to work it requires abundant access to systems, data repositories, and the merging and tweaking of data beyond original data owner expectations or comprehension. The enterprise that balances the advantage of Big Data analytics with superior scaffolding will appreciate higher run rates and profitability without unfunded cost centers and above trend OpEx generally. The opportunity of Big Data without this business intelligence will be squandered and the benefits not realized as a direct result.

The CIO has this ownership, and it is the purview of the Audit Committee to ensure that these risks are understood and tackled. The Board of Directors has proven to value the aggressiveness of Data Analytics equally with the ongoing re-evaluation of the risk tolerance and acceptance points of the business. As one can imagine, this is a familiar yet distinct activity within the executive structure, but three key attributes / activities that indicate a successful approach are as follows:

  1. Vertical awareness – product awareness, strategy, and full line of sight for each major revenue center
  2. Scrum topical teams – risk assessments and activities linked to the product market research initiatives
  3. Senior strategy alignment – what does the Board seek in this DA movement; what do the CEO/CIO envision for these product expansions; what are the audit committee’s observations (meaning that they must have visibility and mindfulness of the impact)

Think Big Data is not huge business? … consider these figures:

  • Gartner: Big Data Market is Worth $3.7 Trillion, Generating Over 4 Million Jobs by 2015 – article
  • Good short presentation on value of pattern based strategies, by Gartner
  • $29B will be spent on big data by IT departments throughout 2012 (Forbes)

Or a classic business case example:

“The cornerstone of his [Sam Walton’s] company’s success ultimately lay in selling goods at the lowest possible price, something he was able to do by pushing aside the middlemen and directly haggling with manufacturers to bring costs down. The idea to “buy it low, stack it high, and sell it cheap” became a sustainable business model largely because Walton, at the behest of David Glass, his eventual successor, heavily invested in software that could track consumer behavior in real time from the bar codes read at Wal-Mart’s checkout counters.

“He shared the real-time data with suppliers to create partnerships that allowed Wal-Mart to exert significant pressure on manufacturers to improve their productivity and become ever more efficient. As Wal-Mart’s influence grew, so did its power to nearly dictate the price, volume, delivery, packaging, and quality of many of its suppliers’ products. The upshot: Walton flipped the supplier-retailer relationship upside down.” – Changing The Industry Balance of Power

A good (no paywall) article on Forbes here breaks down the IT spend related directly to Big Data and compares it against prior years up to 2012, and by industry.

Also check out this MIT Sloan article co-developed with IBM entitled Big Data, Analytics and the path from Insight to Value  – most interesting for me was page 23 relating to Analytics trumping intuition.  This relates to EVERY business process, product, sales opportunity, accounting, fraud detection, compliance initiative, security analytics, defense and response capabilities, power management, etc …  A worthwhile read for each executive.

Think strategically, act vertically, and influence horizontally – scale!

James DeLuccia IV

*See me speak at RSA 2013 on the topic – Passwords are Dead

Connected Systems of PCI – Identifying; Securing; Attesting

The payment card industry standard articulates very prescriptively what should be done for all system components that are within the payment card process.  An area of usual confusion is the depth of abstraction that should be applied to the “connected system” element of the standard.  Specifically, the standard states the following:

The PCI DSS security requirements apply to all system components. In the context of PCI DSS, “system components” are defined as any network component, server, or application that is included in or connected to the cardholder data environment. “System components” also include any virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors. The cardholder data environment is comprised of people, processes and technology that store, process or transmit cardholder data or sensitive authentication data. Network components include but are not limited to firewalls, switches, routers, wireless access points, network appliances, and other security appliances. Server types include, but are not limited to the following: web, application, database, authentication, mail, proxy, network time protocol (NTP), and domain name server (DNS). Applications include all purchased and custom applications, including internal and external (for example, Internet) applications.

– PCI DSS 2.0 page 10

To simplify – there are the system components that are involved with the payment card process, and then there are the supporting systems (connected systems) that also are in scope of PCI DSS.  An example would be the patch server where the in-scope PCI system is receiving patches (but there are dozens).

So a rule of thumb on scope most often offered in the industry is:

If you can digitally communicate with the system (this includes UDP, TCP, etc.), it is a connected system and it is in scope.

A nice write up by Jeff Lowder referring to specifically the security system components can be found here written in 2010.

A Korzybski abstraction problem:

How many levels of abstraction should one undertake?  Meaning – should that same patch server then be examined to see what systems are connecting to it and thus also be included in the ‘connected system’ web?

The answer here is generally no – the abstraction is only one level deep.  That doesn’t mean best-practice risk and security measures evaporate, so no leaving that server unpatched on the internet or anything.
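The one-level-deep rule can be expressed as a small scoping computation over a connectivity map. This is a sketch with invented system names, offered to illustrate the abstraction question above – it is no substitute for a QSA's scoping judgment.

```python
# Which systems can talk to which (illustrative names only).
CONNECTIONS = {
    "pos_terminal": {"payment_switch"},
    "payment_switch": {"pos_terminal", "patch_server"},
    "patch_server": {"payment_switch", "build_server"},
    "build_server": {"patch_server"},
}

# Systems that store, process, or transmit cardholder data (the CDE).
CDE = {"pos_terminal", "payment_switch"}

def pci_scope(cde, connections):
    """CDE plus everything one hop away -- and only one hop (no recursion)."""
    connected = set()
    for system in cde:
        connected |= connections.get(system, set())
    return cde | connected

# The build server talks to the patch server but not to the CDE, so it
# stays out of scope under the one-level rule.
print(sorted(pci_scope(CDE, CONNECTIONS)))
# -> ['patch_server', 'payment_switch', 'pos_terminal']
```

If the function instead followed edges transitively (a full closure), the build server would be dragged in too – which is exactly the unbounded abstraction the question above is asking about, and which the industry rule of thumb avoids.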

What Requirements of PCI DSS should be applied to these ‘connected systems’?

The standard makes it clear at the beginning: “The PCI DSS security requirements apply to all…”  So every PCI control applies to the connected system under discussion, identified through the abstraction of the services supporting the CHD environment itself.  Limitations can be applied to the core system components that make up this “connected system”.  For example, in the virtualization space, the hypervisor risks and controls are applied differentially from the rest of the standard.  These exceptions to fully applying the PCI standard directly to the connected system must be limited and made with clear awareness.  [Updated: All requirements should be considered … each QSA is different, but addressing the risk with an eye towards compliance is the best and safest bet.  Shopping for someone to accept a state of controls is a painful road]

An “Open PCI DSS Scoping Toolkit“ (pdf), published on August 24, 2012, provides an excellent structure for methodically determining scope and the controls that would be applicable.  While not a product of the PCI SSC or a technical group, there is good content here that should be carefully considered as part of every security and compliance strategy.  Thanks to Eric Brothers for the note! [Updated 12/5/2012]

Another good write-up is offered here, where the two-factor authentication exception, the file integrity monitoring exception, and a few other practices are nicely elaborated by Andrew Plato (plus check out the comment threads – very informative, though proceed with caution as this is one QSA’s interpretation).  Definitely worth a review, though sadly there appear to be no other elaborations from the PCI SSC on this topic.

This write-up is an exploratory effort to seek clarity by consolidating thoughts of leading individuals in the payment card space and client environment realities.  Any additional insights or perspectives you have are welcomed and requested in the comments below!  I’ll update as anything is shared.


James DeLuccia

How to improve the maturity of your security program – Learn from mistakes made by others!

Organizations struggle with the complex information security compliance demands placed upon them.  Mature organizations participate in regular self-review and improvement activities on an annual basis – in some organizations, as regularly as monthly.  These organizations are fortunate to have larger security teams that reflect their global (think Fortune 500) deployment of assets.  This network provides an immensely valuable feedback loop on the following, among many others:

  • What are effective practices
  • What policies are great for the business, and where are exceptions being raised frequently that may indicate unknown business requirements
  • Attack patterns and weaknesses in the security program based on statistical review of events within the business
  • Where are programs meeting customer / client requirements – based on sales attributions and audit findings, respectively.

For organizations of this sophistication – and those of all other sizes – there is an additional input that raises the overall efficiency and effectiveness of the security compliance program: self-comparison against public data.  Specifically, data released by government audits, intelligence committee reports, and guidance / complaints issued by government enforcement agencies.  These are immensely helpful in providing businesses across all sectors insight into security threats, trends, shifting perceptions of “due care”, and areas where risks are ebbing and flowing.

A simple set that an organization may consider includes:

The takeaway here is that every organization should regularly identify these sources, consolidate them in a manner that can be analyzed, and develop an intelligence report on any gaps in practice and security controls as documented by these organizations.  These apply to every organization and not simply those in the government space.  The process of careful analysis against the organization’s strategy combined with the rote knowledge of the practitioners internally can support realizing these benefits.
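The consolidate-and-analyze step described above reduces, at its simplest, to a set difference per public source. A toy sketch – every control name and source label below is invented, and a real intelligence report would of course carry far more context than a gap list:

```python
# Controls the organization has implemented (illustrative names).
implemented = {"mfa_remote_access", "log_retention_1yr", "quarterly_vuln_scans"}

# Controls called out in (hypothetical) public sources -- government
# audits, enforcement complaints, guidance -- tagged by origin so the
# resulting report can point back to its evidence.
public_findings = {
    "gao_audit_2013": {"mfa_remote_access", "vendor_risk_reviews"},
    "ftc_complaint_x": {"log_retention_1yr", "encrypted_backups"},
}

def gap_report(implemented, sources):
    """Per source, list the cited controls the organization lacks."""
    report = {}
    for source, controls in sources.items():
        gaps = controls - implemented
        if gaps:
            report[source] = sorted(gaps)
    return report

print(gap_report(implemented, public_findings))
```

The mechanical comparison is the easy part; the value comes from the careful analysis against the organization's strategy, combined with practitioners' internal knowledge, that turns each gap into a decision.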

The genesis of this article was inspired through close workings with Fortune 50 organizations and developing leading global security programs.  A nice article illuminating this and other opportunities for improvements to security compliance programs is by Adam Shostack, in “The evolution of information security“.  A very good read.

Thoughts .. and expansions of idea are always welcome!

James DeLuccia IV