Grover Mewborn is a senior security engineer at real-estate intelligence firm CoStar. This is the first of a two-part series on how they’re using Red Canary’s open source tool Surveyor.
Surveyor is an open source utility for quickly determining what is normal in an environment and what isn’t. The tool gives security and IT teams the ability to search across their endpoints to take inventory of software usage, validate threats, and determine the frequency of just about anything within an organization.
I became aware of Surveyor during a presentation at Carbon Black Connect 2020. Surveyor is a fairly simple Python script that essentially reads process attributes from definitions files, queries endpoint detection and response (EDR) tools in search of those attributes, and provides results in an easy-to-consume CSV format. Surveyor contains a bunch of pre-built definitions files, but you can easily build your own custom definitions as well.
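To make that concrete, here's a minimal sketch of what a Surveyor-style definitions file might look like and how a simple Python script could read it. The exact schema and attribute names vary by Surveyor version, so treat the keys below (the label and `process_name`) as illustrative assumptions rather than the canonical format:

```python
import json

# Illustrative Surveyor-style definitions file: a JSON map of labels to
# process attributes. The "process_name" key is an assumption; check the
# pre-built definitions shipped with Surveyor for the exact schema.
definition = {
    "RemoteAccessTools": {
        "process_name": ["teamviewer.exe", "anydesk.exe"]
    }
}

with open("remote_access.json", "w") as f:
    json.dump(definition, f, indent=2)

# Reading it back the way a simple script might before querying the EDR
with open("remote_access.json") as f:
    loaded = json.load(f)

print(loaded["RemoteAccessTools"]["process_name"])
```

Because the format is plain JSON, anyone comfortable editing a text file can add or remove attributes without touching the Python code itself.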
I saw immediate value. While the script and its functionality are by no means the HAL9000 of cyber tools, Surveyor offered a simple and efficient way to collect data in my environment and start analyzing it immediately.
That last part is really important: analyzing data. I’m a security engineer, not a developer, quality assurance analyst, or data scientist. I’m defending a corporate environment and need to be delivering measurable security results, not continuously developing and troubleshooting the shoddy code I write.
However, despite its immediate appeal, Surveyor had one minor drawback for me: it is command-line interface (CLI) based. Let’s be clear: I’m not opposed to working in the CLI. It’s actually quite fun, and, as an example, scripting with Tshark, the Wireshark CLI tool, offers similarly amazing opportunities for time savings and consistent analysis of large PCAPs.
But I have a small team. There are about six of us, and we’re all consumed by different aspects of our security architecture. In addition, we’re not all hackers slinging code and awking and seding our way to security glory. I decided that in order to give Surveyor a chance in our environment, I’d need to develop a more efficient way of interacting with Carbon Black beyond what Red Canary had already done.
A little background
For the purpose of understanding the use cases I’m about to describe, all you really need to know (for now) is that we built a custom implementation of Surveyor in AWS that automates, or vastly simplifies, the process for running ad hoc queries. In part two of this blog series, I’m going to show you exactly how we built this, so you can go and develop your own version.
So what value do I get from Surveyor?
What value? My friend… Imagine:
Let’s say our security operations team receives a notification from our SIEM about a filename associated with ransomware on a OneDrive share. The person triaging the notification isn’t familiar with Carbon Black. However, they know the name of the file they’re interested in finding and are comfortable editing a text file. So, they add the unique filename from the alert to our Surveyor definition file and upload it to the S3 bucket that feeds our custom Surveyor implementation. If, after a few minutes, our query returns no matches for the ransomware filename across our environment, then we can all breathe easy.
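The triage step above could even be scripted rather than done by hand. Here's a hypothetical helper that drops an alerted filename into a Surveyor-style definition; the label, schema, and function name are all illustrative assumptions, not part of Surveyor itself:

```python
import json

# Hypothetical helper: add a suspicious filename from a SIEM alert into a
# Surveyor-style definition dict. The "process_name" key and the label are
# illustrative assumptions about the definition schema.
def add_ioc_filename(defs: dict, label: str, filename: str) -> dict:
    entry = defs.setdefault(label, {"process_name": []})
    if filename not in entry["process_name"]:
        entry["process_name"].append(filename)
    return defs

defs = {}
add_ioc_filename(defs, "RansomwareIOCs", "locky_payload.exe")
print(json.dumps(defs, indent=2))
# From here, the JSON would be written to a file and uploaded to the S3
# bucket that feeds the custom Surveyor implementation (e.g., via the
# AWS CLI or boto3).
```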
Maybe that’s not a terribly exciting example, but, as a security engineer, exciting isn’t really something I crave.
Here’s another non-exciting example that management really liked: We’re migrating our VPN solution and trying to determine connection quality by issuing surveys to employees, but it’s hard to know who’s actually using which VPN client or what its deployment status looks like. We created a list of VPN clients that we know are being used across our environment and used our custom Surveyor implementation to enumerate which users were using what VPN client across the entire organization. Within a few minutes we had an easy-to-read and easy-to-access CSV file in an S3 bucket.
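Once the CSV lands in S3, summarizing it takes only a few lines. Here's a sketch of counting distinct users per VPN client from Surveyor-style output; the column names (`username`, `process_name`) and the inline sample data are assumptions, since real Surveyor output columns may be named differently:

```python
import csv
import io

# Sample Surveyor-style CSV output (illustrative data and column names)
sample_csv = """username,hostname,process_name
alice,wks-01,openvpn.exe
bob,wks-02,anyconnect.exe
alice,wks-01,openvpn.exe
carol,wks-03,anyconnect.exe
"""

# Count distinct users per VPN client process
reader = csv.DictReader(io.StringIO(sample_csv))
users_per_client = {}
for row in reader:
    users_per_client.setdefault(row["process_name"], set()).add(row["username"])

counts = {client: len(users) for client, users in users_per_client.items()}
print(counts)  # {'openvpn.exe': 1, 'anyconnect.exe': 2}
```

In practice you'd read the CSV straight from the S3 bucket instead of an inline string, but the aggregation logic is the same.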
Alert validation… again
Kicking this up a notch, we use Proofpoint Threat Response, which has a minimal SOAR component. By modifying an existing script that detonates malicious URLs in our sandbox, we can create a definition file for Surveyor that looks for that same URL in command lines or for network connections to that domain from anywhere in our environment. In this case, by the time we receive malicious email alerts for review, we’ve already searched our environment for the URL. Thus, we’ve reduced the mean time to resolution for that alert: either we close it out as benign because no IOCs were observed in the environment, or we’re already a step further down our response plan thanks to simple automation.
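The URL-to-definition step might look something like the sketch below: extract the domain from the alerted URL and emit a definition that watches for it in command lines and network connections. The `cmdline` and `netconn` attribute names are assumptions about the definition schema, and the URL is a made-up example:

```python
import json
from urllib.parse import urlparse

# Hedged sketch: turn a URL from a malicious email alert into a
# Surveyor-style definition. Attribute names ("cmdline", "netconn")
# are assumptions; adjust to match the schema your Surveyor build uses.
def url_to_definition(url: str) -> dict:
    domain = urlparse(url).netloc
    return {
        "SuspiciousURL": {
            "cmdline": [domain],
            "netconn": [domain],
        }
    }

definition = url_to_definition("http://evil.example.com/payload.php")
print(json.dumps(definition, indent=2))
```

Hooking a function like this into the existing detonation script means the Surveyor search kicks off automatically, with no analyst in the loop.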
We’re also using our Surveyor implementation to perform continual baselining across our environment. The most obvious reason for using Surveyor is to figure out what’s running in an environment and who’s running it. However, things change every day, so you can’t just check your baseline once and pat yourself on the back. You have to do it periodically to make sure that nothing out of the ordinary has popped up. In a way, Surveyor serves as an anomaly detection mechanism for software usage across our business.
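The baselining idea boils down to a set difference: compare today's Surveyor results against a stored baseline and flag anything new. In practice both sets would be loaded from CSV files in S3; the inline data below is purely illustrative:

```python
# Minimal sketch of periodic baselining: flag (user, process) pairs seen
# today that were absent from the stored baseline. Inline data is
# illustrative; real runs would load both sets from Surveyor CSVs in S3.
baseline = {("alice", "chrome.exe"), ("bob", "outlook.exe")}
today = {("alice", "chrome.exe"), ("bob", "outlook.exe"), ("bob", "nmap.exe")}

new_activity = today - baseline
for user, process in sorted(new_activity):
    print(f"anomaly: {user} ran {process}")
```

Anything in `new_activity` is worth a look; anything that checks out gets folded into the baseline for the next run.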
As you can see, we’re getting a lot of bang for our buck with Surveyor, in part because it’s free, but also in part because we’ve found a bunch of novel ways to use it. Further, the AWS implementation makes it incredibly easy to use, which allows basically anyone on our team to run queries against Carbon Black whenever they need.
That said, while using our Surveyor build is easy, developing it was not. In the second part of this blog series, I’ll explain exactly what we changed in the Surveyor code, how we deploy this custom Surveyor implementation into AWS, and how we connect it to the Carbon Black API.