Newsroom

Blog

Key Takeaways from Biden's Sweeping Executive Order on Cybersecurity

On Wednesday, May 12, the Biden administration took a critical step toward addressing security issues that have come to light after several recent, high-profile cyberattacks. The extensive Executive Order (EO) described the government's plan to increase cybersecurity protection across the public and private sectors, as well as to secure the nation's digital infrastructure against the type of attack that recently shut down the Colonial Pipeline, a critical source of fuel for the entire East Coast.

The 30-page Executive Order on Improving the Nation's Cybersecurity covers a plethora of cybersecurity issues. It describes how government agencies should evaluate the software they buy. It mandates that executive branch agencies deploy multifactor authentication, endpoint detection and response, and encryption. And it calls for these agencies to adopt "Zero Trust" architectures and more secure cloud services. Let's take a look at three of the key areas in the Order.

Prioritize Zero Trust

The EO mandates that executive branch federal agencies create "Zero Trust" environments. The administration says this is key to ensuring security when implementing cloud computing environments and services and when modernizing the IT infrastructure of the federal government. The document notes that within 60 days, agencies must update plans to prioritize the adoption and use of cloud technology, as well as develop a plan to implement Zero Trust architecture.

Adopting a Zero Trust mindset is not only a critical element of a robust cybersecurity posture, but also a popular one. This is primarily because it does not innately trust any user or application until verified by multi-factor authentication (MFA), and because it does not require much CapEx to get off the ground. Zero Trust compliance ultimately rests on two main pillars: strong identity and access management, and a mature data identification and classification framework. That means that to implement a true Zero Trust framework, organizations need to know everything about their sensitive data (including personally identifiable information, payment card information, intellectual property, and other sensitive data types): when it is created and by whom, where it is stored, and how and with whom it can be shared. (A minimal sketch of this default-deny access decision appears at the end of this post.)

Related reading: Why Zero Trust is So Hot Right Now - And How Titus Can Make it Happen

Address Supply Chain Risks

The EO notes that the commercial software used by federal agencies often lacks adequate controls to prevent attackers from gaining access, and states that the federal government must take action to rapidly improve the security and integrity of the software supply chain, with a priority on addressing critical software. Within 30 days of the order's signing, the Secretary of Commerce, acting through the director of the National Institute of Standards and Technology (NIST), must solicit input from federal agencies, the private sector, and academia. The government will then use this information to develop guidelines and criteria for evaluating software security, along with the best practices software developers must use.

From recent events we know that no organization is immune to the risk of supply chain cyberattacks and data breaches, and those with especially large and complex supplier ecosystems are even more vulnerable. This has been exacerbated in recent months by the pandemic and the expanded attack surface that comes with a more widely dispersed workforce.
The main challenge here is that smaller organizations have neither the personnel nor the capital to protect themselves, and therefore the other organizations in the chain.

Creation of a Cybersecurity Review Board

The EO calls for establishing a "Cyber Safety Review Board" modeled on the National Transportation Safety Board. The positive is that the board's membership shall include federal officials and representatives from private-sector entities. In theory, then, this board should encapsulate the best of the public and private sectors, be unafraid to ask the 'tough questions' following a significant cyber incident, and make concrete recommendations for improving cybersecurity. The challenge, however, is that the board will have to walk a fine line: complying with the Federal Advisory Committee Act, which requires boards like this to be "objective and accessible to the public," while also keeping the information it collects safe.

Minimizing and Preventing Cyberattacks

The goal of the EO is to modernize the government's IT infrastructure while creating a set of standards to help minimize the damage caused by cyberattacks. With aggressive timelines in tow and a clear directive to move securely to the cloud, this EO is arguably the most important step the President could have taken.

The three areas we have highlighted in the EO all require organizations to take a more robust approach to data security. This is where the Fortra data security platform can help, as our suite of products is designed to bring an organization's data security policy into this modern hybrid reality, with multiple ways of working and a highly distributed workforce. We have data security solutions that help ensure intellectual property and sensitive data are kept safe and secure. Our products span the full range of data protection requirements, from classifying data inside the organization at the outset through to detecting and preventing leaks of sensitive information outside the organization.

As cyberthreats around the world continue to increase, I am sure we will see more legislation and orders like this EO come to the fore. Demonstrating that you have a solid data security foundation in place, and layered security to help mitigate risk, is therefore going to be paramount.
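To make the Zero Trust pillars above concrete, here is a minimal sketch, in Python, of the default-deny access decision they imply: nothing is trusted until the user's identity is verified with MFA, the device is healthy, and the user's clearance covers the data's classification label. The field names and classification levels are illustrative assumptions, not part of the Executive Order or of any specific product.

```python
from dataclasses import dataclass

# Hypothetical classification levels, ordered from least to most sensitive.
CLASSIFICATION_RANK = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

@dataclass
class AccessRequest:
    user_id: str
    mfa_verified: bool            # identity proven with multi-factor authentication
    user_clearance: str           # highest classification the user may access
    device_compliant: bool        # endpoint meets the security baseline
    resource_classification: str  # label applied by data classification tooling

def is_access_allowed(req: AccessRequest) -> bool:
    """Default-deny: every request must prove identity, device health,
    and sufficient clearance for the data's classification."""
    if not req.mfa_verified:
        return False
    if not req.device_compliant:
        return False
    needed = CLASSIFICATION_RANK.get(req.resource_classification)
    held = CLASSIFICATION_RANK.get(req.user_clearance)
    if needed is None or held is None:
        return False  # unknown labels are treated as untrusted
    return held >= needed

# Example: a fully verified user on a compliant device reading confidential data.
print(is_access_allowed(AccessRequest(
    user_id="jdoe",
    mfa_verified=True,
    user_clearance="confidential",
    device_compliant=True,
    resource_classification="confidential",
)))  # True
```

In a real deployment this decision would live in an identity provider or policy engine and be evaluated on every request, not just at login.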
Blog

Top Data Security Challenges Organizations Face Today

Addressing an organization's data security challenges requires some heavy lifting, no question about it. Whether data security worries center on internal security lapses or stem from the harsh reality of being targeted by those with malicious intent, organizations face a constant need to be on the alert and protective of sensitive data. Rather than cobble together a piecemeal strategy, relying on a trusted solutions provider that offers a suite of integrated, scalable data security solutions can provide relief. Knowing what data needs to be protected, classifying that data, applying controls to it without slowing down business processes, and sharing it securely can give IT and security leaders peace of mind.

The Challenge of Gaining Data Visibility

With the massive amount of data exchanged daily, knowing what data exists, where it lives, who can access it, and how it is ultimately sent is critical to organizational data security. Visibility is naturally a concern for CISOs, as a recent Fortra data security study attests, and it is square one when it comes to data security and the policies and solutions needed for a proactive security stance. Achieving true data visibility means defining policies and procedures, ensuring they are working and being used, and then assessing which technologies can be put in place to automatically and efficiently bolster the security around sensitive data.

Related Reading: Data Security Best Practices Every CISO Should Know

The Challenge of Identifying What Data Needs Protection

To keep mission-critical communications flowing without throwing up unnecessary productivity barriers, it is important to first recognize that not all of the vast amount of data exchanged is equal or in need of extensive protection. A data classification solution that applies markers, and halts only the data that meets the protection criteria you set, helps keep business running (minus potential data breaches). Metadata labels allow other security solutions in the environment to understand which data is sensitive and requires further protection along its journey, based on the organizational policy set. With data classification in place, you can identify which data is sensitive and needs protection and which is more mundane and shareable without the more nuanced layers of security, streamlining secure data exchanges.

The Challenge of Data Protection Efficiency

Many traditional data security solutions end up blocking "safe" data alongside the potentially malicious or harmful data they are meant to stop. These false positives and false negatives can quickly spiral out of control, unnecessarily slowing down the flow of business. Traditional solutions focus on tight control, but at a cost: at some point the data handcuffs become too restrictive, and the need to share and access data easily (and securely) becomes a top priority for productivity. Protecting data throughout its lifecycle is not a one-size-fits-all process. Putting an Adaptive Data Loss Prevention (A-DLP) solution in place can take organizations beyond "block everything" mode, detecting and preventing unauthorized sharing before any breach occurs.
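To show how a classification label can drive that kind of adaptive, content-aware decision, here is a minimal Python sketch. The label names, detection patterns, and redaction behavior are illustrative assumptions rather than any particular product's policy engine: instead of blocking an entire message, only the sensitive values it contains are removed.

```python
import re

# Hypothetical detection patterns for common sensitive data types.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def inspect_and_sanitize(text: str, classification: str) -> tuple[str, str]:
    """Return (action, sanitized_text).

    Public content passes untouched; anything labeled higher is scanned,
    and detected sensitive values are redacted rather than blocking the
    whole message outright.
    """
    if classification == "public":
        return "allow", text

    sanitized = text
    found = False
    for name, pattern in PATTERNS.items():
        if pattern.search(sanitized):
            found = True
            sanitized = pattern.sub(f"[REDACTED {name.upper()}]", sanitized)

    if found:
        return "redact", sanitized  # adaptive: remove only the risky content
    return "allow", sanitized

action, body = inspect_and_sanitize(
    "Customer SSN 123-45-6789 attached for review.", "internal"
)
print(action)  # redact
print(body)    # Customer SSN [REDACTED US_SSN] attached for review.
```

Real DLP tooling relies on far richer detection techniques than a pair of regular expressions, but the adaptive principle is the same: sanitize what is risky and let the rest of the business flow continue.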
With DLP in place, organizations gain flexibility and can intelligently inspect and sanitize both structured and unstructured (meta)data within emails, files transferred via the web or cloud, and endpoints, ensuring the specified security policy is applied automatically. This flexibility is particularly important to highly regulated industries and for adhering to data privacy laws and standards such as HIPAA, PCI DSS, CCPA, GDPR, and more, which specify the level of protection that should surround data at all points in its journey.

Related Reading: What is Adaptive Redaction?

The Challenge of Sharing Files Securely and Efficiently

Once data has been classified and sanitized, the challenge of sending it to a third party or internally must be met. A secure managed file transfer (MFT) solution can rise to the challenge while meeting stringent compliance requirements for end-to-end protection. Automated workflows, along with auditing and reporting functionality, add security and transparency around file transfers large and small. This reduces the human-factor risks so often responsible for file transfer errors. Combining MFT with Adaptive DLP can further ensure that any files sent and received do not contain sensitive data (a simple sketch of this combined flow appears below).

Related Reading: 3 Powerful Examples of MFT and DLP Paired in Action

The Challenge Remote Work Poses

As organizations reimagine how and where work gets accomplished, a growing number of workers will continue working from wherever is most convenient, at times on their personal devices. While this flexibility is mostly welcomed, it does not come without data security threats. Employees, of course, are among an organization's most valuable assets, but without education, intelligent technology solutions, and policies and procedures that are easy to follow, they also pose some of the biggest risks to data security. Data is unquestionably more vulnerable in this more flexible work environment, and the human factor continues to pose threats: mistakes around securing data tend to be made when people are busy, tired, or pressured.

Related Reading: Increased Home Working is Recognized by CISOs as Cybersecurity Threat

The need to communicate and collaborate securely remains, and the risk of exposing sensitive data both within and outside the organization grows with more user access points and the ad hoc use of non-approved collaboration and file transfer processes. Organizations need mechanisms that let people work yet provide a safety net to protect them (and their employers) from doing the wrong thing with data. With more demand for functionality comes more risk in sharing data with third parties or via the cloud, raising the likelihood of a data breach or a compliance failure.

The Challenge of Managing Multiple Security Solutions

While it is easy to see that layers of security can help freeze insecure data movement in its tracks, reduce human-error risks, and ensure that even hidden sensitive data is not inadvertently accessed, managing those layers across multiple vendors can create productivity bottlenecks. One way to alleviate pressure on IT staff is to work with a single trusted vendor capable of delivering multiple layers of security for operational simplicity. This can help ensure that your data classification, data loss prevention, and managed file transfer tools are well integrated and scalable.
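As a rough illustration of what "well integrated" can look like in practice, the sketch below strings the three layers together: a classification label gates a DLP scan, only files that pass are handed to the transfer step, and every decision is written to an audit log. The function names, label names, and the stubbed transfer call are hypothetical; a real deployment would invoke the MFT product's own API or an SFTP client at that point.

```python
import logging
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
audit = logging.getLogger("transfer-audit")

SENSITIVE_LABELS = {"confidential", "restricted"}   # assumed label names
BLOCKED_MARKERS = ("SSN", "ACCOUNT NUMBER")         # assumed DLP indicators

def dlp_scan(path: Path) -> bool:
    """Very simplified stand-in for a DLP content inspection step."""
    text = path.read_text(errors="ignore")
    return not any(marker in text.upper() for marker in BLOCKED_MARKERS)

def transfer(path: Path, destination: str) -> None:
    """Stub for the actual managed file transfer call (SFTP, HTTPS, etc.)."""
    audit.info("transferred %s to %s", path, destination)

def send_file(path: Path, label: str, destination: str) -> bool:
    """Classification label gates the DLP scan; every outcome is audited."""
    if label in SENSITIVE_LABELS and not dlp_scan(path):
        audit.warning("blocked %s (label=%s): sensitive content detected", path, label)
        return False
    transfer(path, destination)
    return True

# Example usage with a small file labeled "confidential".
sample = Path("report.txt")
sample.write_text("Quarterly summary for the partner review.")
send_file(sample, "confidential", "partner-sftp")
```

Because each step is a separate, replaceable function, any one layer can be swapped or tightened without disturbing the others, which is the practical benefit of an integrated, modular suite.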
If the elements that make up your data security suite are not easy and intuitive to use, the suite will lose its effectiveness as the last barrier against employees making a data security error.

Webinar: Data Security Challenges: How Our Suite Helps

Facing Data Security Challenges with a Security Suite

A solid security suite is flexible enough to enforce your security policies, rather than forcing your processes into the solution itself. One benefit of a suite-style solution is that it can be implemented in modular fashion: you can deploy a single software solution to address today's specific data security issue, comfortable in the knowledge that you can add additional layers of security as your needs grow and change. In addition, you can take advantage of solution integrations and enjoy economies of scale. Data security can encompass any one or a combination of these technologies:

Digital Guardian Wins SC Media 2021 Trust Award for Best DLP Solution

WALTHAM, Mass. – May 4, 2021 – Digital Guardian, a leader in data loss prevention (DLP) and managed detection and response (MDR), today announced that the Digital Guardian Data Protection Platform has been recognized as a Trust Award winner in the “Best Data Leakage Prevention Solution” category for the 2021 SC Awards. The announcement was made online Monday, May 3, 2021 as part of SC Media’s 2021...