Linux may be the most widely deployed operating system in the world, and yet, despite its massive presence, it remains perhaps the most enigmatic of the big three.
The reasons for that are many. For one, Linux is actually numerous operating systems spread across countless distributions and versions, all serving slightly different needs. To complicate that further, Linux systems may look and feel like a regular laptop or take the form of physical or virtual servers, virtual machines (VMs), or containers, to name just a handful of computer-like things. Just as important, the vast majority of Linux systems are humming along in the background, quietly powering the software, services, and applications we rely on. Linux is ubiquitous, running much of our critical infrastructure, but, unlike with Windows and macOS, most humans—even those of us in tech circles—never interact with it.
Pulling back the curtain
While Linux is freely available to anyone who wants to use it, you can’t really just buy a Linux computer (for the most part). You have to seek out the distribution you want and then you have to install it yourself, which is a high enough barrier to preclude most users from ever using the operating system in the first place.
Red Canary has produced a lot of great research detailing Linux threats and security controls, including everything from new attack methods to open source testing and validation tools. However, much of it has been esoteric and exclusive, mainly useful to the subset of IT and security professionals who are already very familiar with Linux security and administration.
The last thing we want is a gap in security practice or knowledge simply because an operating system is unfamiliar. This article presents a high-level overview of Linux in its many forms to a broader audience. Our hope is that this can serve as a foundational resource moving forward, as we attempt to describe the many forms of Linux worth securing, the types of threats you might expect to target them, and the artifacts and optics you might leverage to observe those threats.
What is Linux?
Linux is an operating system, like Windows 10 and macOS, used to power everything from day-to-day computing needs all the way to the largest companies and most widely used software in the world. Linux does many of the same things that Windows and macOS do, including running programs and software, handling major business applications, and supporting the latest games (of course).
Linux for business has risen in popularity over the last decade, and enterprises use it frequently to support cloud infrastructure and software-as-a-service (SaaS) applications. While it commands a vanishingly small percentage of the desktop market, Linux dominates cloud and production systems.
Similarities aside, Linux has a lexicon of jargon all its own. Before we can move on, there are some key terms and concepts we need to define or explain.
A distribution (or distro) is effectively a variety of Linux, usually customized to satisfy a specific use case, like hosting high-volume databases, running custom software, or supporting the world’s largest websites and applications. Distributions include common names like Ubuntu, CentOS, or Red Hat Enterprise Linux, and utterly uncommon ones like EasyOS. Unlike Windows and macOS, there are hundreds of different Linux distributions, each with its own unique features, customizations, and maintainers.
Why are there so many distributions? Unlike Windows and macOS, Linux is open source, meaning that anyone can change and modify the core Linux code for their own purposes, as long as they respect the license. That means individuals and businesses can change certain characteristics of Linux or introduce entirely new distributions to suit their needs. Common licenses include the GPL and BSD licenses, which specifically document how, when, or even whether a user can use, modify, or sell Linux distributions or apps.
The kernel is the brains of the operation in Linux, handling all of the core operations across your operating system, including communication with other services and process management.
With each update to the kernel, a new kernel version is created. These take the form of normal numeric versions, such as 1.2.34. Different versions of each Linux distribution rely on different kernel versions, and each kernel version has its own attributes and features.
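As a quick illustration, most distributions ship a standard utility for printing the running kernel version (the exact version string shown in the comment is illustrative; yours will differ):

```shell
# Print the running kernel's version string, e.g. "5.15.0-91-generic"
uname -r

# Print kernel name, release, and build details on one line
uname -srv
```

Knowing which kernel version a system runs matters for security work, since specific kernel versions carry specific features, bugs, and vulnerabilities.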
Administrators use shells and commands to manage Linux. Commands tell Linux what to do, and can be used for everything from starting and stopping programs and scripts to managing users. Shells are programs that take commands and run them, with shells like tcsh commonly used to manage Linux systems.
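As a small example, a session in a common shell such as bash might look like the following (all three are standard utilities, though exact output varies by system):

```shell
whoami      # print the name of the user running this shell
date        # show the system's current date and time
ls /etc     # list the system-wide configuration files
```

Each line is a command; the shell's job is to read it, run the corresponding program, and print the result.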
Linux is deployed in a wide range of ways. Certain Linux distributions, such as Ubuntu, are designed to run on laptops or desktops and look and feel like other operating systems. Other distributions, such as CentOS and Red Hat Enterprise Linux, are specifically designed to run databases, production applications, and other software widely used in cloud applications. Where and how Linux is deployed depends on your unique situation and needs, but it can be deployed across a number of physical and virtual environments. Linux deployments can be broadly organized into the following categories (this list is not exhaustive):
- Servers are machines designed to run tasks and applications and “serve” traffic and data. Servers generally provide both storage and computing power. Servers are usually managed with commands and shells, but can also be managed through cloud provider dashboards or virtual desktops.
- Servers can be physical or virtual. Physical servers—sometimes called “on-premises/on-prem” or “bare metal”—are real machines that you can actually touch. They can be on-site or in a datacenter, and are managed and configured in the same ways as other Linux deployments.
- Virtual servers look and feel like physical servers, but are actually pieces of software (sometimes called instances) that pretend to be a real server. In AWS and other cloud providers, you can choose which size of virtual server (CPU and memory) you want to use, and the software allocates those resources to you within the cloud provider’s infrastructure, alongside other companies’ applications. Each tenant is isolated from the others to prevent data leakage, but you are responsible for maintaining security, installing patches, and monitoring for threats. This method is popular with cloud providers because it allows them to slice up one very powerful physical server into smaller virtual servers with their own applications and resources.
- Desktop Linux is the closest comparison to regular Windows and macOS laptops and desktops you likely use today. These Linux distributions and systems are designed to provide users a normal space to work, run applications, and browse the web. They are often designed to look and feel like the Windows and macOS desktops you are used to, but can also be highly customized.
- Desktop virtual machines act as a virtual Linux computer you can run on any laptop or desktop. Most often, these are run on top of servers and can be created or destroyed frequently. VMs can be used to see how applications run in different environments, to do work across multiple operating systems and versions, or to reduce the number of physical devices companies need to keep on hand. Security teams often use VMs to analyze malware and test their security controls.
- Containers are designed to handle specific tasks like authentication or data routing, and break apart larger applications into small components. This allows developers to use only the containers they need, saving on cost and complexity. Containers can be run by themselves or deployed out in clusters to run web applications, do a specific job, or allow infrastructure to scale up and down quickly based on demand. A container might execute a task like signing in users, and includes all of the features, dependencies, and configuration it needs to do this job.
- Services like Kubernetes help manage and orchestrate container deployments to make it easier to run and scale apps with containers. Containers and Kubernetes are becoming increasingly popular due to their ability to easily scale and save money when compared to traditional infrastructure. While containers can be run without Kubernetes or other management services, development teams will often use both to reduce the complexity of their software deployment process.
- Docker is software that helps containerized applications, built by many developers across many computers, work anywhere. Traditionally, developers design an app or feature that initially “only works on their machine,” acting differently on other servers or computers. With Docker, developers can create their applications to run the same on a much wider range of machines and configurations, reducing development time and bugs. Docker is used with containers and often alongside services like Kubernetes to make development on those platforms more predictable and less complex.
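To make the container idea concrete, here is a minimal, hypothetical Dockerfile sketch for a small Python web service. The base image, file names, and port are illustrative assumptions, not details from any particular deployment:

```dockerfile
# Start from a small, widely used base image (illustrative choice)
FROM python:3.12-slim

# Copy the application and its dependency list into the image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

# Document the port the service listens on and define how it starts
EXPOSE 8000
CMD ["python", "app.py"]
```

Building (`docker build -t myservice .`) and running (`docker run -p 8000:8000 myservice`) this image produces a container that carries its own dependencies and configuration, which is what lets the same application behave consistently on developer laptops, in testing, and in production.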
Why do companies use Linux?
Linux provides a number of benefits that make it an excellent choice for a wide range of applications. From managing business-critical applications like databases and access management tools to supporting massively scaled software, Linux is highly adaptable, customizable, and performant.
Linux is highly configurable by anyone who wants to tweak and tune their operating system. Having a configurable operating system has its perks, allowing developers or administrators to enable or disable certain features on demand, lock down their operating system to only allow specific tasks or access, offer an easy-to-use desktop experience, or even avoid potentially disruptive updates. In cloud Linux deployments, this same configurability allows administrators to limit their Linux servers to only certain key features and access, saving money and reducing attack surface. While the various Windows operating systems and macOS are designed, to some degree, to be general purpose, individual Linux builds—and even entire distributions—can be entirely bespoke.
Start with a blank slate
Many Linux distributions strip away bloat, like unused programs and applications or even the entire graphical interface. While these tools are great when working on a laptop, they waste resources and add complexity on a server. While you may be stuck configuring away certain programs and features on Windows and managing updates that add back consumer features, you can rest easy knowing that your particular Linux build includes exactly what you need and nothing more.
Because Linux is open source, most distributions are free to use. Unlike paid operating systems like Windows, administrators have the freedom to use Linux as widely as they like. For companies that prefer an operating system with more checks and balances on security and updates, paid Linux versions such as Red Hat Enterprise Linux are also available.
More scrutiny, faster updates
No operating system is totally secure, but Linux’s open-source code means that bugs and security issues get attention from the community fast. And because many Linux distributions share a lot of the same code, updates and patches are widely available quickly. Certain Linux distributions, like Ubuntu, also bundle updates monthly, giving you a more regular update window.