Building Cybersecurity for Multi-Domain Operations

June 25, 2024

By David Wagenheim

The Department of Defense is implementing Joint All-Domain Command and Control (JADC2), an approach for developing the warfighting capability to sense, make sense, and act at all levels and phases of war, across all domains, and with partners, to deliver information advantage at the speed of relevance. In doing so, data from multiple sources must be collected, processed, and distributed rapidly to various combatant commands. It is common to use cross domain solutions (CDSs) to transfer data between multiple domains, whether those domains are separated by different:

  • branches of the military 
  • classification levels 
  • needs to know 
  • sharing agreements 

CDSs typically sit at a network boundary and are subject to persistent attacks by adversaries, including well-funded nation-state actors, “beating on the gate” to compromise the Confidentiality, Integrity, or Availability (the CIA triad) of data and connected networks. Therefore, CDSs need to be robust in their ability to transfer data, at times maintaining one or more of data confidentiality, integrity, and availability. Because that is a significant ask for any single system, many CDSs are complex and involve multiple processes that must work in conjunction. Unsurprisingly, the process to ensure that a CDS is suitable for a deployment environment can be lengthy and costly. 

The time to deployment (TTD) for a CDS can be prolonged by the number of gates it must pass before it can be fielded. Each CDS requires numerous design reviews before and during development and testing. Ultimately, a CDS undergoes an extensive Lab-Based Security Assessment (LBSA) and a Site-Based Security Assessment (SBSA) as the final gates before connecting to live networks and processing real data. To pass an LBSA, the CDS must be verified as compliant with various baselines, including the Raise the Bar (RTB) Cross Domain Solution (CDS) Design and Implementation Requirements. Given the critical role that a CDS plays in perimeter defense, the list of requirements and security controls is significant. 

In a world where technological strides arrive at a seemingly exponential pace, the slow pace of CDS approval raises the probability that a solution will never be deployed when it is needed. So, how does one expedite a CDS approval? 

Modularity and Automation 

We can take cues from developments in Security Development Lifecycle (SDLC) practices, Cloud Native computing, and DevSecOps. These developments cover a broad spectrum of methods and technologies. Collectively, they address the automation of constructing, testing, and verifying systems that are secure, modularized, flexible, and resilient. Applying these processes leads to significant time reductions in the building and approval lifecycle. They reduce the risk associated with accrediting solutions and, critically, the risk of operating those solutions in a connected environment. 

So, what does this look like in practice? Let’s assume you follow the SDLC: you have identified your security standards, requirements, and use cases, and you want to build a CDS from modularized components to get the most bang for your development time and budget. If designed well, a single recomposable CDS built from modularized components can address several data transfer needs. Designing and building your DevSecOps infrastructure is equally important, because the complexity of building, testing, and verifying a recomposable system is managed through automation in that infrastructure. 

Time spent building your DevSecOps infrastructure at the beginning will pay off over the long term as you take advantage of the tooling and speed it provides throughout your development lifecycle. Building, deploying, testing, and catching security issues and bugs early in the process leads to tighter feedback loops, while automation reduces human error. The use of a DevSecOps infrastructure can deliver significant time and cost savings. 

How a Data Transfer Pipeline Works 

Data enters the portion of a CDS residing in one domain through a Protocol Adapter. It passes through multiple data filters, then to the portion of the CDS residing in the next domain, where it moves through additional data filters prior to exiting the CDS to its final destination via another Protocol Adapter.  

Protocol Adapters tend to be protocol specific, handling FTP or HTTP, for example. Filters are often crafted or selected specifically for different types of data; examples include XML schema validators and high/low-pass audio filters. For a CDS to manage the transfer of multiple types of data, multiple filters may be required, and some filters may require that other filters occupy specific positions within a data transfer pipeline. 

Building a Recomposable CDS Using Modular Components 

In the simplest of scenarios, building a recomposable CDS from modularized pipeline components requires testing several compositions of Protocol Adapters and data filters, including their placement within a pipeline, to ensure that the CDS meets the necessary functional and security requirements. Add data filter configurations to the mix, and each valid configuration parameter can dramatically increase the number of compositions requiring extensive testing. If that seems daunting, consider the alternative: producing a non-modular CDS that uses one specific set of Protocol Adapters and corresponding filters, then repeating the entire production process for a different set. Each such CDS undergoes a separate accreditation process, at a minimum requiring an LBSA that analyzes the changes between system variants (a delta LBSA). This approach is not efficient for an engineering team or vendor from a time and cost perspective, or from an approval and deployment perspective. Modularity needs to be an engineering requirement, and automation an essential component. 

Containerization of Protocol Adapters and filters provides modularity and easy delivery, and it virtually eliminates software dependency issues. Containerization is consistent with the Cloud Native approach to building and deploying applications and is easily supported by common DevSecOps tooling. It is also reflected in the RTB as an accepted method for building modular pipelines. One significant difference, however, is that server-model container engines that rely on a controlling daemon, such as Docker, along with orchestrators such as Kubernetes, are not permitted within a CDS. The benefits of containerization and DevSecOps can still be realized by using container technologies that do not rely on a controlling daemon. Through the automation inherent to DevSecOps, individual containerized components can be tested and verified to work in a modular environment, and pipelines composed of a variety of containerized components can be constructed and tested just as easily. 

Programmatically, many compositions of modular pipelines can be constructed from containerized components. Through automation, specific configurations can be applied to components, a full battery of tests (unit, functional, integration, security) can be run, and results can be generated in a shorter time frame than without modularization and automation. Testing many compositions and scenarios leads to tighter feedback loops between engineers as they verify that a CDS meets requirements: issues are found, and resolutions produced, earlier in the development cycle. This use of automation increases the likelihood of success when a CDS undergoes an LBSA. To paraphrase something Neil Armstrong once said, we need to fail in here so that we don’t fail out there. 

In summary, CDSs are complex, and the time to develop and deploy a single-purpose CDS is significant. By applying best practices such as containerization and automation, and by building in security from the start and throughout the development lifecycle, it is feasible to produce a modular, recomposable CDS for data transfer between domains that meets the necessary security requirements and still adheres to deployment timelines. 

At SealingTech, our team of experts draws upon their decades of CDS, DevOps, and DevSecOps experience to support the federal government and private organizations with all aspects of CDS design, development, and independent verification. 

Interested in a quote? Contact our team today. 
