Crawlector: A Threat Hunting Framework

Compromised websites can be used for drive-by-download attacks, watering-hole attacks, social engineering, web skimming, ad injection, and hosting exploit kits. The volume of malicious traffic from such websites demands an automated approach to finding threat intelligence quickly and efficiently. In this talk, we present a new threat hunting framework called Crawlector (a combination of Crawler and Detector), designed to scan websites for malicious objects in a fully automated manner. Crawlector supports online/offline scanning, spidering websites to discover additional links, Yara as a backend detection engine, digital certificate scanning, and querying URLhaus to find malicious URLs on a page, among other capabilities. The framework’s operations are highly customizable. To demonstrate the framework’s effectiveness and performance, we will highlight some interesting results from scanning the top 700k Alexa websites and the top 100k WordPress sites. The talk will also cover the design processes and decisions made during the development of the framework.
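
As a rough illustration of the Yara-backed detection described above (not Crawlector's actual code), the following minimal C++ sketch compiles a single rule with libyara 4.x and scans an in-memory page buffer; the rule text, page content, and omitted error handling are assumptions made purely for illustration.

```cpp
// Minimal libyara (4.x) sketch: compile one rule and scan an in-memory page.
// Illustrative only; the rule and page buffer are made up, and real code
// would check every return value.
#include <yara.h>

#include <cstdint>
#include <cstdio>
#include <string>

// Invoked by libyara for each rule evaluated against the buffer.
static int on_scan_event(YR_SCAN_CONTEXT* /*ctx*/, int message,
                         void* message_data, void* /*user_data*/)
{
    if (message == CALLBACK_MSG_RULE_MATCHING)
    {
        const YR_RULE* rule = static_cast<const YR_RULE*>(message_data);
        std::printf("matched rule: %s\n", rule->identifier);
    }
    return CALLBACK_CONTINUE;
}

int main()
{
    // Hypothetical rule; a real deployment would load a curated rule set.
    const char* rule_text =
        "rule suspicious_iframe { strings: $a = \"<iframe\" nocase condition: $a }";

    // Hypothetical page body; a crawler would fetch this over HTTP(S).
    const std::string page =
        "<html><body><iframe src=\"http://example.test/x\"></iframe></body></html>";

    yr_initialize();

    YR_COMPILER* compiler = nullptr;
    yr_compiler_create(&compiler);
    yr_compiler_add_string(compiler, rule_text, nullptr);

    YR_RULES* rules = nullptr;
    yr_compiler_get_rules(compiler, &rules);

    yr_rules_scan_mem(rules,
                      reinterpret_cast<const uint8_t*>(page.data()),
                      page.size(),
                      0 /*flags*/, on_scan_event, nullptr, 0 /*timeout*/);

    yr_rules_destroy(rules);
    yr_compiler_destroy(compiler);
    yr_finalize();
    return 0;
}
```

In a crawler setting, the compiled rule set would typically be built once and reused across every fetched page rather than recompiled per scan.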

Crawlector features include:

  1. Supports spidering websites to find additional links for scanning
  2. Integrates Yara as a backend engine for rule scanning
  3. Supports online and offline scanning
  4. Supports crawling for domains’/sites’ digital certificates
  5. Supports querying URLhaus to find malicious URLs on the page
  6. Supports querying the rating and category of every URL
  7. Supports TLSH (Trend Micro Locality Sensitive Hash); a minimal usage sketch follows this list
  8. Supports JARM hashes (TLS server fingerprinting)
  9. Supports expanding on a given site by attempting to find all available TLDs and/or subdomains for the same domain
  10. This feature, along with the rating and categorization, provides the capability to find scam/phishing/malicious domains related to the original domain
  11. Saves scanned website pages for later scanning (optionally as a compressed zip archive)
  12. All of the framework’s settings are controlled via a single customizable configuration file
  13. All scanning sessions are saved into a well-structured CSV file with a plethora of information about the website being scanned, in addition to information about the Yara rules that have triggered
  14. Ships as a single executable
  15. Written in C++
  16. The framework and a transpiler that converts EKFiddle rules to Yara rules will be released on GitHub after the talk.
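
For item 7, the sketch below shows one way a page’s TLSH digest could be computed with the open-source trendmicro/tlsh C++ library; the page buffer is synthetic and this is not taken from Crawlector itself.

```cpp
// Sketch of computing a TLSH digest for fetched page content with the
// open-source trendmicro/tlsh C++ library. Illustrative only; the page buffer
// is synthetic and Crawlector's own handling is not shown in this abstract.
#include "tlsh.h"   // from https://github.com/trendmicro/tlsh

#include <cstdio>
#include <string>

int main()
{
    // Synthetic stand-in for an HTTP response body. TLSH needs a minimum
    // amount of input with enough byte variation, or it yields no digest.
    std::string page;
    page.reserve(4096);
    for (int i = 0; i < 4096; ++i)
        page.push_back(static_cast<char>('a' + (i * 7) % 26));

    Tlsh tlsh;
    tlsh.update(reinterpret_cast<const unsigned char*>(page.data()),
                static_cast<unsigned int>(page.size()));
    tlsh.final();

    const char* digest = tlsh.getHash();
    if (digest != nullptr && digest[0] != '\0')
        std::printf("TLSH: %s\n", digest);
    else
        std::printf("input too small or too uniform for a TLSH digest\n");

    return 0;
}
```

Digests produced this way can later be compared with Tlsh::totalDiff() to cluster similar pages, which is one common reason to record TLSH alongside scan results.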

Mohamad Mokbel

Mohamad Mokbel is a senior security researcher at Trend Micro. He is responsible for reverse engineering vulnerabilities and malware C&C communication protocols, among other things, in order to write custom filters for TippingPoint NGIPS. Before joining Trend Micro, Mohamad worked in the SOC at CIBC, one of the top five banks in Canada, as a senior information security consultant/investigator (L3), where he realized that experience in the operations field is extremely important to understanding the realities of both offense and defense. Before CIBC, Mohamad worked for TELUS Security Labs as a reverse engineer/malware researcher for about five years. He has been doing reverse code engineering for the last 14 years. His research interests lie in the areas of reverse code engineering, malware research, intrusion detection/prevention systems, C++, compiler and software performance analysis, and exotic communication protocols. Mohamad holds an M.Sc. in Computer Science from the University of Windsor.