Data Compliance Doesn’t Have to Be Slow

By Nancy Patel

Nancy Patel is VP/GM of Public Sector at Immuta. She holds graduate degrees in Systems Engineering from George Washington University and Business Administration and Management from University of Kentucky (UK) as well as an undergraduate degree in Computer Science from UK.

PRIVATE SECTOR PERSPECTIVE — It happens more often than it should – a Pentagon data owner, often a manager of a collection platform, has spent months working with an agency analytics team to complete a data sharing Memorandum of Agreement (MOA). Getting all the rules written down, the regulations cited, the policies documented, reviewed, and approved was a heavy lift. Then, add to that the weeks it took to pass the PDF around to the one individual in each organization who was allowed to approve it. Finally, the process is complete, and the data can be shared, often copied from one repository to another. The new repository might be someone’s hard drive, or it might be a second data platform with no data access controls in place at all – allowing anyone to access everything.

Hopefully, the analytics organization will find the time, developers, and money to implement the protections from the MOA – still a PDF signed in ink that one hopes has been stored somewhere safe and not just lost in an overcrowded inbox. This scenario also assumes the information shared remains internal to an organization. If a data owner now wants to share that data outside of the organization… well, forget about it.

If this anecdote appears dramatized, it is not. In a surprising number of cases, the US Government handles data this way, and even with all of these protections in place, we still witness unauthorized disclosures and data leaks. The policies on data protection for information sharing are right, but their implementation needs further refinement. As data architecture and engineering teams work through the pain of the data sharing compliance process, which often takes longer to complete than building the system itself, many observers are left to wonder if it has to be this hard. Has the pendulum swung too far from implementation to process? Have we regulated ourselves into a corner?

Data access controls and policy enforcement have become a check-in-the-box process with a long list of rules and regulations that often get shoved aside in favor of getting data into the hands of consumers faster. Speed of execution is good, but today, this also means that either the data never sees the light of day in the receiving organization – it’s too risky! – or the data is improperly controlled, and it’s up to the consumer to do the right thing. In the current system of paper, informal agreements, and having to throw decisions over the wall to developer teams to implement, how can data consumers be sure that they can trust the data they access?

The DoD’s Data Strategy, released in September 2020, reiterates the Department’s tenet that, for data to be “trustworthy,” users must “be confident in all aspects of data for decision-making.” This includes knowing that the data they have in their hands is data they are allowed to see and share. For data consumers to be confident in their access to data – as well as in their products – they need to know that the policies outlined in compliance documents, MOUs, and MOAs have been properly implemented. In other words, proper policy implementation and enforcement is a good thing, assuming the US Government has a reusable, automated system for implementing these policies both globally and locally.


Compliance across legacy systems is one of the biggest bottlenecks in the deployment process; it leaves many consumers who should be able to access data in real time completely cut off from that information because they are siloed off from the information sharing environment. To truly make our systems future-proof, we must address this manual process and make collaboration, implementation, and enforcement of data access policies easy to administer, fast, and scalable. Enforcement should not require a development team and many months to implement. It must be dynamic and adaptable in real time as policies and regulations change. It should allow a consumer’s access to change based on their purpose for accessing the data. And it should not require the system to be re-implemented every time a new data source, app, or access pattern is added to the platform. There are two steps we can take to make this goal achievable.

  1. Add compliant access to sensitive data into the data engineering workflow. What this means is adding the “G” to ELT: data engineers Extract, Load, and Transform the data while also implementing Governance, meaning global policy authoring and enforcement across the data platform. Adding the “G” to the “ELT” process enables more rapid development of data products – compliance is no longer a separate process with a long tail, because it is done on the front end instead – and improves interoperability between systems by injecting dynamic policy enforcement directly into the workflow.
  2. Automate policy implementation and enforcement wherever possible. This means building a data governance system that, wherever possible, does not require a human in the loop and manual intervention for every approval. Instead, create systems that dynamically apply policies that have already been agreed upon, automating data access for consumers. This step alone can save analytics and data science teams days of downtime otherwise spent waiting for approvals. (A minimal sketch of both steps follows this list.)
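
To make these two steps concrete, here is a minimal sketch, assuming a Python/pandas pipeline, of what “ELT + G” with automated, purpose-based enforcement might look like. The dataset, purposes, policy object, and masking rule are all illustrative assumptions, not any particular product’s API.

```python
# A minimal sketch of "ELT + G": the policy agreed to in the MOA is
# encoded once as reviewable configuration, then enforced automatically
# inside the pipeline; no human in the loop per request.
import pandas as pd

# Step 2: the agreed policy, expressed as data rather than paper.
# Entitlements depend on the consumer's stated purpose, so access can
# change dynamically without re-implementing the platform.
POLICY = {
    "dataset": "sensor_feed",
    "allowed_purposes": {"mission_planning", "readiness_analysis"},
    # Columns masked per purpose: readiness analysts never see raw SSNs.
    "masked_columns": {
        "mission_planning": set(),
        "readiness_analysis": {"ssn"},
    },
}

def extract() -> pd.DataFrame:
    """Stand-in for pulling records from a source system."""
    return pd.DataFrame({
        "unit": ["A", "B"],
        "ssn": ["123-45-6789", "987-65-4321"],
        "readiness": [0.9, 0.7],
    })

def govern(df: pd.DataFrame, purpose: str) -> pd.DataFrame:
    """Step 1: enforce the policy inline, before load and transform,
    so every downstream consumer sees only compliant data."""
    if purpose not in POLICY["allowed_purposes"]:
        # Denial is immediate and auditable; no PDF routing required.
        raise PermissionError(f"purpose {purpose!r} is not covered by the MOA")
    out = df.copy()
    for col in POLICY["masked_columns"].get(purpose, set()):
        if col in out.columns:
            out[col] = "***MASKED***"
    return out

def pipeline(purpose: str) -> pd.DataFrame:
    df = extract()            # Extract
    df = govern(df, purpose)  # Govern: policy applied in the workflow
    # Load and Transform would follow; the data is already compliant.
    return df

print(pipeline("readiness_analysis"))  # ssn column arrives masked
```

Because the policy lives beside the pipeline as versioned configuration, a change in regulation means editing one object and re-running, rather than renegotiating a document and scheduling developer time.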

In addition to the two steps above, the Department is already implementing a DevSecOps and continuous Authority to Operate (ATO) approach to the security approval process in a move to automate accreditation. Combining these three efforts – alignment of governance with data engineering, automation of data governance, and automation of ATOs – will allow teams to field safe, effective operational capabilities and get the tools the warfighter needs into the right hands faster.
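
The continuous ATO piece can follow the same pattern. As a rough illustration only, and not the Department’s actual tooling, a security control can be expressed as an automated test that runs on every change, so accreditation evidence accumulates as a by-product of normal development. The control name and SSN check below are hypothetical.

```python
# A hedged sketch: a data-masking control written as an automated test,
# in the spirit of continuous ATO. The control and the SSN check are
# illustrative assumptions, not a real accreditation requirement.
import pandas as pd

def masking_control_holds(consumer_view: pd.DataFrame) -> bool:
    """Pass only if no raw SSN-shaped values reach a consumer view."""
    if "ssn" not in consumer_view.columns:
        return True
    raw = consumer_view["ssn"].astype(str).str.fullmatch(r"\d{3}-\d{2}-\d{4}")
    return not raw.any()

def test_masking_control():
    # Run in CI against governed output; any regression fails the build.
    view = pd.DataFrame({"unit": ["A"], "ssn": ["***MASKED***"]})
    assert masking_control_holds(view), "raw SSNs reached a consumer view"
```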

The Department of Defense is all in on automation wherever it’s feasible, and policy implementation and enforcement is a perfect candidate for this approach. Automation could massively and quickly lower the barrier to entry for sharing data, for creating new data platforms, and for keeping those data sharing models and platforms up to date and aligned with the newest, most effective approaches to data processing and management. In taking these steps, platform owners can ensure the right information gets to the right people, faster and more efficiently.

Have a Cyber/Tech national security-related perspective to share? Drop an email to [email protected]

Read more expert-driven national security insights, perspective and analysis in The Cipher Brief

 

