
Attribution & Intent challenges: Comparing Regin module 50251 and “Qwerty” keylogger

Kaspersky Lab (a pretty wicked good set of researchers) published an analysis of source code from the Snowden disclosures and found it identical in part to a piece of malware known as Regin. Regin has been in the digital space for nearly 10 years and has been identified on a number of infected systems globally.

I would encourage everyone to read and understand the analysis, as it is quite thorough and interesting … go ahead, I’ll wait … Comparing the Regin module 50251 and the “Qwerty” keylogger – Securelist.

While I cannot speak to the source and reasoning behind this tool, beyond the obvious conjectures, I would stress one critical point: attribution and intent.

Attribution is hard and of little value

As we find with other digital attacks, attribution is very difficult, and I often tell clients not to make it the basis of their analysis and response. The challenge lies not only in the difficulty of attributing such attacks, but also in the problems created by making such assertions incorrectly. For example, JP Morgan’s “Russian attack on the bank due to their activities” during the Ukraine incident turned out to be a breach caused by simple human error in configuring a server.

Intent

We as observers do not know the intent of the operators behind the malware. In this case with the NSA we have identified malware in various locations, but as we all know … malware code spreads pretty freely without much direction. The likelihood that a given system was infected unintentionally, or without purpose on the part of the operators, is quite high.

This comes to the forefront in our own internal analysis of attacks and breaches in our corporate environments. We must seek out all of the possible vectors, and not allow our biases or the evidence immediately at hand to sway us incorrectly.

Spiegel.de article on Kaspersky report and other thoughts

Thoughts?

James

Overcoming team, enterprise, and self analysis paralysis – Battlefield Leadership series

The Only Thing Wrong with Nothing Happening is the Fact that Nothing is Happening

A leader must be effective in the following tasks:

  • Invigorating a unit with disparate needs.
  • Managing time. There is always something a leader can do. Always.
  • Self-confidence. Leaders must trust their instincts and previous experiences.
  • Innovation. When confronted with a situation different from what was planned, a leader needs to devise a new plan of attack.

The battle at Utah Beach demonstrates this with Brigadier General Theodore Roosevelt Jr.’s commands upon landing in the first wave. Roosevelt succeeded by leading the troops in person and deciding quickly on the next actions based on the conditions of the moment.

Port en Bessin

Business Reflections…

As a leader of self, family, and business, one must adopt these principles. The ability to positively affect these three spheres is paramount to success. To succeed in life, one must develop the following capabilities:

  1. Recognition of scenarios.
  2. Energy to execute.
  3. No hesitation; avoidance of analysis paralysis.
  4. Foresight: having vision of the second step and continuing forward.
  5. Escaping the echo chamber of the mind and protocol.


What is Battlefield Leadership and what is this series about … 

As part of my pursuit to learn and grow, I sought out the excellent management training team at Battlefield Leadership. Professionally, I am leveraging this across the multi-million dollar projects I am overseeing (currently I am the lead executive building global compliance and security programs, specifically in the online services / cloud leader space). Personally, I am bringing these lessons to bear in my pursuit to cross the chasm. Too often I see brilliant technical individuals fail to communicate with very smart business leaders and with the common person on the street. My new book, How Not To Be Hacked, seeks to be a first step in bringing deep information security practices beyond the technologist.

Most exciting, the Battlefield group placed this training in Normandy, France. This allowed senior executives to be trained in a setting where serious decisions were made by both sides, each providing a lesson. This series represents my notes (those I could take down) and takeaways. I share them to continue the conversation with the great individuals I met, and with the larger community.

Kind regards,

James

RSA 2014 – 2 themes from Tuesday

A fresh post after a long while …

So, after writing for clients and my research being all-consuming this past year, I am re-focusing time in my day to share observations and thoughts. Why? Quite simply, I learn more when I write, share, and get feedback than when I live in an echo chamber. How will this benefit the world and you? Simple: you will share in the knowledge I gain from sweat and toil and learn through the same iteration cycle as I do. I also will begin focusing my posts on my dedicated portal for such topics and (attempt to) limit my writings here to on-topic material. I hope you will continue to join me on the new(er) site and the other media platforms.

Also, I am aiming for a high-iteration format instead of the long form of old. Meaning shorter (I hope) posts that are succinct on ideas, without the typical preamble and postscript common in most write-ups. My ask: please share, challenge, and seek to understand my perspective – as I will do for you.

Onward then …

Today is RSA day, and two themes stand out as evident and most important based on several large client discussions, analyst discussions, and a few researchers I had the privilege of speaking with today:

  1. Communicating the WHY is of paramount importance today (WHY are we spending security budgets on X assets? WHY are our practices for managing enablement between development, operations, and security out of sync? Etc.)
  2. Passive Resistance (my phrase, but after a day of hearing about the NSA, RSA, crypto architects disowning responsibility for operational deployment, and calls to “enable” privacy and security, this is where I landed) is the idea of persons and organizations being asked to respond to these threats in a manner that impinges on their capabilities. There are many problems with this stated position, but I shall leave that for another day and your own pondering.

Businesses must address #1 and be extremely cautious with #2; #2 will also be a heavy discussion during my RSA session on Thursday for all who are present. If you are unable to attend, I will as usual post my work and research in note form online. Looking forward to learning and expanding my thinking with you.

Best,


James


Verizon Data Breach Report 2009: Exposed

The fine folks at Verizon Incident Response put out their annual report last week on evidence and trends drawn from their forensic efforts. This year the statistics focus on approximately 90 cases they served over the calendar year. The report is quite good, and at a short 52 pages it is a worthwhile read cover to cover. Readers of this site know I won’t reprint their report here, but I will highlight the areas that jumped out.

These tidbits, and the report under discussion, should be employed to apply true numbers to your security calculations; as a wake-up call for a ‘back to the basics’ call to arms; and as a magnifying glass showing that what is often published (data breach reports, for example) does not always tell the whole tale.

“Target of Choice or Target of Opportunity. If the former, expect and prepare for determined and sophisticated attacks. If the latter, minimize the opportunities presented so as to become less of a beacon for attack. At the very least, make sure your beacon shines less brightly than everyone else’s.”

An interesting highlight by the authors of the report: minimize the perceived value of compromising an enterprise and increase the perceived difficulty of doing so.

Among the numerous thoughtful recommendations provided within the report, the authors call out an area that is of extreme importance to me and deserves emphasis:

“Control data with transaction zones: Based on data discovery and classification processes, organizations should separate different areas of risk into transaction zones. These zones allow for more comprehensive control implementations to include but not be limited to stronger access control, logging, monitoring, and alerting. ”
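
To make the transaction-zone idea concrete, here is a minimal sketch in Python. The zone names, roles, and policy table are my own illustrative assumptions, not anything prescribed by the report; the point is simply that data classification can drive both stricter access control and mandatory logging in the higher-risk zones.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("zones")

# Hypothetical zone policy keyed by data classification (names are illustrative only).
ZONE_POLICY = {
    "public":     {"zone": "web",        "roles": {"any"},         "monitor": False},
    "internal":   {"zone": "app",        "roles": {"employee"},    "monitor": True},
    "cardholder": {"zone": "restricted", "roles": {"payment-svc"}, "monitor": True},
}

@dataclass
class Access:
    principal: str        # who is asking
    role: str             # role asserted by the caller
    classification: str   # classification of the data being touched

def authorize(req: Access) -> bool:
    """Allow or deny based on the transaction zone the data lives in,
    logging every attempt against the monitored, higher-risk zones."""
    policy = ZONE_POLICY.get(req.classification)
    if policy is None:
        log.warning("DENY %s: unclassified data", req.principal)
        return False
    allowed = "any" in policy["roles"] or req.role in policy["roles"]
    if policy["monitor"]:
        log.info("zone=%s principal=%s role=%s allowed=%s",
                 policy["zone"], req.principal, req.role, allowed)
    return allowed

# A batch job reaching into the restricted zone is denied (and the attempt is logged);
# the payment service is allowed (and still logged).
print(authorize(Access("batch-job", "employee", "cardholder")))   # False
print(authorize(Access("pay-api", "payment-svc", "cardholder")))  # True
```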

“81 percent of organizations subject to PCI DSS had not been found compliant prior to the breach.”

– a surprising figure, but one that has been pushed around in the media.  The reality behind this quote is that these businesses had NOT been found compliant prior to the breach, meaning they were not compliant with the standard.  That leaves the 19% of organizations validated as compliant to have been either mistakenly approved; approved and then fallen off the wagon; compliant, but attacked via a vector outside the scope of the PCI controls; or compliant, but facing attackers good enough to defeat said controls.

Vulnerability Patching vs. Configuration Setup/Mgmt:  “2008 continued a downward trend in attacks that exploit patchable vulnerabilities versus those that exploit configuration weaknesses or functionality. Only six confirmed breaches resulted from an attack exploiting a patchable vulnerability.”  In other words, a “…focus on coverage and consistency…” is more important, more effective, and more statistically relevant than chasing individual patches.

“…four of 10 hacking-related breaches, an attacker gained unauthorized access to the victim via one of the many types of remote access and management software”

This is continued support for protecting the systems designed to provide remote access, as they tend to offer a simple avenue into the business.

“During 2008, malware was involved in over one-third of the cases investigated and contributed to nine out of 10 of all records breached. In years past, malware was generally delivered in the form of self-replicating email viruses and network worms. “

Custom malware is increasing, and the old standbys of anti-virus and signature-based solutions are falling further and further behind.  Additional controls are necessary in both the preventive and detective areas of a business risk program.

“Most application vendors do not encrypt data in memory and for years have considered RAM to be safe. With the advent of malware capable of parsing a system’s RAM for sensitive information in real-time, however, this has become a soft-spot in the data security armor. “

This was an interesting highlight that brought forward an emerging area of assault, and one that is becoming more of a concern.  Capturing data while in transit – on any channel, including memory – is now at risk as other aspects of business functions and digital processing become more sophisticated and better secured.  An additional reality is that, as complexity increases within business technology environments, attackers will seek out the areas that tend to be more stable and consistent, allowing for automated and subtle means of intrusion, capture, and release of targeted information.
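
To illustrate why plaintext data sitting in RAM is such a soft spot, here is a small sketch of my own (not from the report) showing the kind of scan a memory-scraping tool – or a defensive data-discovery check – performs: walk a raw byte buffer, pull out 13-16 digit runs, and keep only the runs that pass the Luhn check used for payment card numbers.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Standard Luhn mod-10 check used for payment card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pan_candidates(buffer: bytes):
    """Scan a raw byte buffer (e.g. a process memory dump) for 13-16 digit
    runs that pass the Luhn check - the same pattern RAM scrapers hunt for."""
    text = buffer.decode("latin-1", errors="ignore")
    for match in re.finditer(r"(?<!\d)\d{13,16}(?!\d)", text):
        if luhn_valid(match.group()):
            yield match.group()

# A well-known test number (4111 1111 1111 1111) sitting unencrypted in "memory".
sample = b"\x00\x00cardholder=JANE DOE;pan=4111111111111111;\x00\x00"
print(list(find_pan_candidates(sample)))   # ['4111111111111111']
```

Encrypting or tokenizing the value before it ever lands in process memory is what takes it out of reach of this kind of scan.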

“85 percent of the 285 million records breached in the year were harvested by custom-created malware”

The relative difficulty of attacks leading to data compromise:

  • “Category: None. No special skills or resources required. The average user could have done it.”
  • “Category: Low. Basic methods, no customization, and/or low resources required. Automated tools and script kiddies.”
  • Combined, NO skill and LOW skill make up 52% of all breaches; however, “…few highly difficult attacks compromised 95 percent of the 285 million records…”  The end result: no organization is immune to these tools, and it is financially and technologically feasible to attack even single-IP-address businesses across the globe.

“Targeted attacks (The victim was first chosen as the target and then the attacker(s) determined a way to exploit them.) are at a five-year high and accounted for 90 percent of the total records compromised. “

As always, review the data, draw your own conclusions, and plan accordingly.  The costs and impacts of these attacks are staggering: to businesses ($14.6 billion for new account fraud), to individuals ($1.8 billion to U.S. citizens), and to the world at large (the total amount of fraud in 2008 equals the aggregate sum of 79 industrialized nations).

I spoke at RSA 2009 this year on PCI and Beyond – great crowd, and a post-conference write-up will follow soon.
Best regards,

James DeLuccia