Thursday, December 1, 2016

Top 10 Rock and Roll Cybersecurity Predictions for 2017


It's that time of year again: time for information security predictions for 2017. This year we have a twist, tying each prediction to classic rock lyrics. It's striking how prescient the lyrics turn out to be.

Monday, November 7, 2016

Hacking the Elections

Quick key takeaways from Hacking the Elections by Ian Gray

-- The U.S. election landscape is made up of approximately 9,000 different state and local jurisdictions, providing a patchwork of laws, standards, processes, and voting machines. This environment is a formidable challenge to any actor -- nation-state or not -- who seeks to substantially influence or alter the outcome of an election. Doing so would require mastering a large number of these disparate cyber environments and finding a multitude of ways to manipulate them. An operation of this size would require vast resources over a multi-year period -- an operation that would likely be detected and countered before it could come to fruition.

-- WikiLeaks founder Julian Assange continues to claim objectivity and transparency in his reporting; however, recent events have shown that WikiLeaks may be a pawn -- witting or unwitting -- that has been leveraged by the Russian government as an outlet for stolen information damaging to the Democratic National Committee.

-- While Guccifer 2.0’s sources are debatable, the hacker has indeed been effective in launching an information and propaganda campaign that has, at least to some degree, disrupted the course of the U.S. election.

-- Aside from the various political-influence campaigns, the FBI has confirmed that malicious actors have been scanning and probing state voter databases for vulnerabilities. Though the actors were operating on servers hosted by a Russian company, those attacks are not, for the moment, being attributed to an actual Russian state-sponsored campaign.

Click here to read the entire article.

Thursday, October 27, 2016

New Stats on Dyn DDoS Attack Size



Imperva Releases More Information on the Dyn Attack

Ofer Gayer, product manager at Imperva for the Incapsula product line, explains:

“There is still quite a bit of speculation swirling on the size of the DDoS attack on Dyn last Friday. We know there were 100,000 Mirai botnet nodes – which is not especially large in our experience. So, in our estimation, there are two likely causes. The attack may have been a high-volume attack – over 500 million packets per second – that overwhelmed the Dyn infrastructure. Or, the attack may have been relatively small – 50-100 million packets per second – and the attack itself was “amplified” by what is known as a retry storm from their millions of legitimate users, making the job of differentiating between good and bad traffic very hard.”
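For a rough sense of how a retry storm can inflate a comparatively small attack, here is a back-of-the-envelope Python sketch; the traffic figures and retry behavior are illustrative assumptions, not measurements from the Dyn incident.

```python
# Back-of-the-envelope model of a DNS "retry storm".
# All numbers are illustrative assumptions, not measured Dyn figures.

attack_pps = 75e6          # assumed malicious queries per second (within the 50-100 Mpps range quoted)
legit_qps = 20e6           # assumed legitimate queries per second under normal load
drop_rate = 0.8            # assumed fraction of legitimate queries that go unanswered during the attack
retries_per_timeout = 3    # assumed retransmissions a typical resolver sends before giving up

# Unanswered legitimate queries are re-sent, adding to the load the provider
# must absorb and making good and bad traffic harder to tell apart.
retry_qps = legit_qps * drop_rate * retries_per_timeout
total_pps = attack_pps + legit_qps + retry_qps

print(f"malicious traffic:            {attack_pps / 1e6:.0f} Mpps")
print(f"legitimate traffic + retries: {(legit_qps + retry_qps) / 1e6:.0f} Mpps")
print(f"total load on the provider:   {total_pps / 1e6:.0f} Mpps")
```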
 
Additional Information:

Q. Is a 100,000-node botnet big?
A. Not really. For example, Incapsula mitigated a 180,000-node botnet: https://www.incapsula.com/blog/headless-browser-ddos.html

Q. Are DNS services especially vulnerable?
A. They do suffer from being open systems:

"Effective DDoS mitigation is synonymous with accurate traffic filtering. For that reason DNS amplification attacks are actually easier to deflate as all uninitiated DNS responses are highly suspect and could be filtered on-edge, without any impact on the regular traffic flow. For example, one could categorically drop all unexpected DNS responses to port 53.

However, this isn’t the case for seemingly legitimate DNS flood queries, which cannot be dismissed before they are individually processed at the server level.

With on-edge filtering bypassed, and the path to the server CPU cores laid wide open, DNS floods have the potential to bring down even the most resilient of networks."
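To illustrate why uninitiated DNS responses can be filtered on-edge while flood queries cannot, here is a minimal Python sketch of a stateful filter that only admits responses matching a previously seen outbound query; the packet representation and function names are hypothetical, not Incapsula's implementation.

```python
# Minimal sketch of stateful on-edge filtering of DNS responses.
# Packets are plain dicts here; field names and addresses are hypothetical.

outstanding = set()  # (client, resolver, txid) for queries seen leaving the edge

def note_outbound_query(pkt):
    """Record an outbound DNS query so its eventual response can be matched."""
    outstanding.add((pkt["src"], pkt["dst"], pkt["txid"]))

def admit_inbound_response(pkt):
    """Admit a DNS response only if it answers a query we actually sent."""
    key = (pkt["dst"], pkt["src"], pkt["txid"])
    if key in outstanding:
        outstanding.discard(key)
        return True
    return False  # uninitiated response: highly suspect, drop on-edge

# A response that matches an earlier query is admitted...
note_outbound_query({"src": "198.51.100.7", "dst": "203.0.113.53", "txid": 0x1A2B})
print(admit_inbound_response({"src": "203.0.113.53", "dst": "198.51.100.7", "txid": 0x1A2B}))  # True

# ...while a reflected or unsolicited response with no matching query is dropped.
print(admit_inbound_response({"src": "192.0.2.66", "dst": "198.51.100.7", "txid": 0x9999}))    # False
```

A well-formed flood query offers no equivalent state to match against, which is why it must be processed at the server before it can be judged.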


Q. How can companies prevent attacks on their DNS infrastructure?

Q. Is Mirai that sophisticated?

Q. Has the Incapsula network been hit with Mirai?

Q. What’s a big DDoS attack measured in million packets per second (Mpps)?


Wednesday, October 26, 2016

Corero Warns of Powerful New DDoS Attack Vector with Potential for Terabit-Scale DDoS Events




New zero-day attack vector has significant amplification factor and could be used to enhance effectiveness of botnet tools used to launch recent attacks on Dyn, Krebs on Security and OVH
Marlborough, MA and London, UK – October 25, 2016 –  Corero Network Security today disclosed a significant new zero-day DDoS attack vector observed for the first time against its customers last week.  The new technique is an amplification attack, which utilizes the Lightweight Directory Access Protocol (LDAP). LDAP is one of the most widely used protocols for accessing username and password information in databases like Active Directory, which is integrated in most online servers. 

While Corero’s team of DDoS mitigation experts has so far observed only a handful of short but extremely powerful attacks against their protected customers originating from this vector, the technique has the potential to inflict significant damage by leveraging an amplification factor seen at a peak of as much as 55x. In terms of potential scale, if combined with the Internet of Things botnet that was utilized in the recent 655 Gbps attack against Brian Krebs’s website, we could soon see new records broken in the DDoS attack landscape, with attacks potentially reaching tens of Terabits per second in the not too distant future. The DDoS landscape has been extremely volatile in recent weeks, particularly since the release of the Mirai code and the subsequent wave of Mirai-infected Internet of Things (IoT) devices, and we expect this trend to continue for the foreseeable future.

Dave Larson, CTO/COO at Corero Network Security, explains: “This new vector may represent a substantial escalation in the already dangerous DDoS landscape, with potential for events that will make recent attacks that have been making headlines seem small by comparison. When combined with other methods, particularly IoT botnets, we could soon see attacks reaching previously unimaginable scale, with far-reaching impact. Terabit-scale attacks could soon become a common reality and could significantly impact the availability of the Internet, at least degrading it in certain regions.”

Reflection and Amplification Attacks
In this case, the attacker sends a simple query to a vulnerable reflector supporting the Connectionless LDAP service (CLDAP) and, using address spoofing, makes the query appear to originate from the intended victim. The CLDAP service then responds to the spoofed address, sending unwanted network traffic to the attacker’s intended target.

Amplification techniques allow bad actors to intensify the size of their attacks, because the responses generated by the LDAP servers are much larger than the attacker’s queries. In this case, the LDAP service responses are capable of reaching very high bandwidth and we have seen an average amplification factor of 46x and a peak of 55x. 
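To put those amplification factors in perspective, the short Python sketch below estimates the reflected bandwidth an attacker could generate; only the 46x and 55x factors come from the article, while the query size and aggregate query rate are illustrative assumptions.

```python
# Rough estimate of reflected CLDAP attack bandwidth.
# Only the amplification factors (46x average, 55x peak) come from the article;
# the query size and aggregate spoofed-query rate are illustrative assumptions.

query_bytes = 52                      # assumed size of a small CLDAP query on the wire
avg_amplification = 46                # average factor reported by Corero
peak_amplification = 55               # peak factor reported by Corero
spoofed_queries_per_sec = 10_000_000  # assumed aggregate query rate from a large botnet

def reflected_gbps(amplification):
    """Bandwidth arriving at the victim, in gigabits per second."""
    response_bytes = query_bytes * amplification
    return response_bytes * spoofed_queries_per_sec * 8 / 1e9

print(f"average case: ~{reflected_gbps(avg_amplification):.0f} Gbps")
print(f"peak case:    ~{reflected_gbps(peak_amplification):.0f} Gbps")
```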

Dave Larson explains: “LDAP is not the first, and will not be the last, protocol or service to be exploited in this fashion. Novel amplification attacks like this occur because there are so many open services on the Internet that will respond to spoofed record queries. However, a lot of these attacks could be eased by proper service provider hygiene, by correctly identifying spoofed IP addresses before these requests are admitted to the network. Specifically, following Best Current Practice 38 (BCP 38), described in the Internet Engineering Task Force (IETF) RFC 2827, which recommends router configurations designed to eliminate spoofed IP address usage through meaningful ingress filtering, would reduce the overall problem of reflected DDoS by at least an order of magnitude.

“Today’s DDoS attacks are increasingly automated, meaning that attackers can switch vectors faster than any human can respond. The only effective defense against this type of DDoS attack vector requires automated mitigation techniques. Relying on out-of-band scrubbing DDoS protection to stop these attacks will cause significant collateral damage. Given the short duration and high volume attacks, legacy solutions simply cannot identify and properly mitigate in time to protect network availability.”
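As a rough sketch of the ingress-filtering idea behind BCP 38 / RFC 2827 that Larson references, the Python example below forwards a packet only if its source address falls within the prefixes assigned to the customer-facing interface; the prefixes, addresses, and function names are hypothetical.

```python
# Sketch of BCP 38 / RFC 2827 style ingress filtering at a provider edge.
# Prefixes and addresses are hypothetical examples.
import ipaddress

# Prefixes legitimately assigned to the customer behind this ingress interface.
ASSIGNED_PREFIXES = [ipaddress.ip_network("198.51.100.0/24")]

def permit_ingress(src_ip):
    """Forward a packet only if its source address belongs to the customer's prefixes."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in prefix for prefix in ASSIGNED_PREFIXES)

print(permit_ingress("198.51.100.42"))  # True: legitimate customer source
print(permit_ingress("203.0.113.9"))    # False: spoofed source, dropped before it leaves the provider
```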

Operational Auditing: Principles and Techniques for a Changing World

Internal auditors are expected to perform risk-based audits, but often do so only partially because they focus on financial and compliance risks at the expense of operational, strategic, and technological ones. This limits their ability to evaluate critical risks and processes. Operational Auditing: Principles and Techniques for a Changing World by Hernan Murdock merges traditional internal audit concepts and practices with contemporary quality control methodologies, tips, tools, and techniques. It helps internal auditors perform value-added operational audits that produce meaningful findings and useful recommendations, helping organizations meet their objectives and improving the perception of internal auditors as high-value contributors, appropriate change agents, and trusted advisors.

Tuesday, October 25, 2016

Introduction to Behavioral Biometrics

New Directions in Behavioral Biometrics presents the concept of behavioral biometrics on the basis of selected traits such as signature, keystroke dynamics, gait, and voice. This excerpt from the book provides a brief overview of behavioral biometrics.

Monday, October 24, 2016

Risk and Trust Assessment: Schemes for Cloud Services


Both risk and trust have been extensively studied in various contexts for hundreds of years. Risk management, and specifically risk assessment for IT, has also been a hot research topic for several decades. On the other hand, modeling risk and trust for cloud computing has attracted researchers only recently. This chapter from Cloud Computing Security: Foundations and Challenges provides a survey on cloud risk assessments made by various organizations, as well as risk and trust models developed for the cloud.