A 2019 report from ESG (Enterprise Strategy Group), commissioned by Digital Guardian and other technology vendors, Trends in Cloud Data Security: The Data Perimeter of Hybrid Clouds, found that 50% of survey respondents know that they’ve lost data residing in the public cloud. With multiple users simultaneously accessing multiple environments from different geographic locations, hybrid cloud environments pose unique security challenges. In fact, three-fourths of respondents (75%) believe that at least 20% of their company's data is insufficiently secured in the public cloud.
While 83% of those surveyed said that their companies intended to increase their data security spend over the subsequent 12-month period, it's clear that many organizations are moving sensitive data to public cloud environments before having adequate security measures like cloud security monitoring and cloud data loss prevention (cloud DLP) solutions in place to protect it.
To help companies better prepare for securing sensitive data in hybrid environments, we reached out to a panel of data security experts and asked them to answer this question:
"What are the most important best practices for data security in hybrid environments?"
Meet Our Panel of Data Security Experts:
Read on to learn what our panel had to say about the data security best practices your business should be following when working in hybrid environments.
Joe Bailey
Joe Bailey is a Business Development Consultant at My Trading Skills.
"The best practices for data security in hybrid environments are…"
Automation of all possible security avenues including coding the infrastructure of the hybrid environment, as well as its security.
Automation of the process of checking security compliance and regulatory baselines. Scanning of the security controls should be automated and carried out using open-source tools and technology.
High-level encryption of all data, whether it is at rest or in transit. This includes full-disk/partition encryption, hardware encryption, as well as IPsec.
Bottom line: Best practices for keeping data secure in hybrid environments include automating all possible security-related processes and regulatory compliance scanning, as well as implementing strong encryption of data both at rest and in transit.
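To make the compliance-scanning point concrete, here is a minimal sketch of an automated check, assuming AWS and the boto3 SDK. It only flags S3 buckets that lack a public-access block or default encryption; a real baseline would cover many more controls and services.

```python
# Minimal compliance-scan sketch (assumes AWS credentials are configured).
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        s3.get_public_access_block(Bucket=name)
    except ClientError:
        # No public-access block is configured for this bucket.
        print(f"{name}: no public-access block configured")
    try:
        s3.get_bucket_encryption(Bucket=name)
    except ClientError:
        # No default server-side encryption is configured for this bucket.
        print(f"{name}: no default encryption configured")
```

A scheduled job like this, run against every account, is the kind of automation Bailey describes: the scan itself is cheap, and the findings feed straight into remediation.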
Chelsea Brown
Chelsea Brown is a CompTIA Security+ certified Family Information Security Consultant and the CEO of Digital Mom Talk.
"The greatest risk in hybrid environments is company employees…"
When you have an environment that is accessible from anywhere and stores and shares data seamlessly across your teams, it also leaves lots of back doors for someone to gain access to the system.
I've seen system backups compromised because an employee forgot to log out of a company computer on public Wi-Fi. I've also seen a company hit by ransomware that jumped through their Telnet connection to their backups and corrupted those too, leaving the company no choice but to pay the ransom. I've had families face criminal charges for sharing an iCloud account to which a child accidentally uploaded criminal material. These are just some of the risks of hybrid environments.
Hybrid environments are best protected by granting employees only the lowest level of security access they need at each level. Companies can fortify these measures by training their employees on proper security practices, like logging in securely off-site and recognizing scams. Encrypting the cloud files as a precaution isn't a bad practice either, in case of a data breach.
Data breaches at cloud companies are another huge threat. If companies go with free public services like Dropbox, they run the risk of everything they're working on being compromised when (not if) Dropbox has a data breach. And they may not even be notified of it for months while Dropbox investigates.
Ilia Sotnikov
Ilia Sotnikov is an accomplished expert in cybersecurity and IT management and VP of Product Management at Netwrix, a vendor of information security and governance software.
"When data is spread across cloud and on-premises repositories, organizations need to take a holistic approach to its security…"
First, data should be managed throughout its whole lifecycle, which includes accurate classification, protection, detection of security threats and timely response to them, recovery, and compliance. To this end, organizations have to 1) automate classification of data across all data sources, and 2) improve data visibility and control with a monitoring solution.
Second, gain tight control over data access rights with an identity and access management solution. This will enable you to ensure that only authorized staff has access to data and decrease security risks.
Third, admit that you probably cannot secure all your data. Therefore, continue taking a holistic approach but keep focused on the most critical data identified with ongoing classification.
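To illustrate the first point about automating classification, here is a minimal sketch that scans plain-text files for a few sensitive patterns. The directory path, patterns, and labels are illustrative; production classifiers handle many more data types and file formats.

```python
# Minimal data-classification sketch over a directory of text files.
import pathlib
import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}


def classify(path):
    """Return the set of sensitive data types found in a file."""
    text = path.read_text(errors="ignore")
    return {label for label, rx in PATTERNS.items() if rx.search(text)}


for path in pathlib.Path("/data/exports").rglob("*.txt"):  # hypothetical path
    labels = classify(path)
    if labels:
        print(f"{path}: {', '.join(sorted(labels))}")
```

Even a crude first pass like this makes Sotnikov's prioritization possible: once files carry labels, monitoring and access reviews can focus on the ones that matter most.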
Michael Schenck
Michael is the Director of Security Services at Kaytuso.
"The most important best practice for data security in hybrid environments is…"
Ensuring that you use equivalent controls for the cloud portion and the on-premises portion of your network. One of the biggest myths about cloud services is that the host will take care of the common security functions – identity and authorization management, firewalls, and backups. The reality is that the cloud provider may offer some tools, but it is on the client's IT provider or their MSP/MSSP to properly configure and secure the cloud as if it were local. This point is very high level and all-encompassing.
At a conceptual level, the cloud and hybrid environment needs to be treated as if it were all in your building. Sure, you don't have to worry as much about the hardware in the cloud, and any cloud vendor worth its price maintains passing results on its SOC 2 audits, so you must focus on the other pieces that directly correspond to in-house networks.
One of the biggest things that go unchecked is backups. Google, Microsoft, and AWS all offer backup options, but backups for these platforms are not automatically included and need configuration and implementation. You wouldn't go without backing up a local file server, so you need to ensure the cloud is configured the same as if it were on site.
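Here is a minimal sketch of verifying that cloud backups are actually turned on, assuming AWS and the boto3 SDK; equivalent checks exist for other platforms. It looks at RDS backup retention and S3 versioning only, so treat it as a starting point rather than a complete audit.

```python
# Minimal backup-configuration check (assumes AWS credentials are configured).
import boto3

rds = boto3.client("rds")
s3 = boto3.client("s3")

# RDS instances with a retention period of 0 have automated backups disabled.
for db in rds.describe_db_instances()["DBInstances"]:
    if db["BackupRetentionPeriod"] == 0:
        print(f"{db['DBInstanceIdentifier']}: automated backups disabled")

# Buckets without versioning lose overwritten or deleted objects for good.
for bucket in s3.list_buckets()["Buckets"]:
    versioning = s3.get_bucket_versioning(Bucket=bucket["Name"])
    if versioning.get("Status") != "Enabled":
        print(f"{bucket['Name']}: versioning not enabled")
```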
Another important piece is data loss prevention. Whatever DLP service is in use, it must be effective at the file level as well as the system level. If I have access to a file on SharePoint or Dropbox, can I download it to a USB drive and then open it on my personal computer without my Microsoft account? What if I copy the contents of a spreadsheet into a new file – does the DLP solution detect the protected information and automatically apply the same protection to the new file? If the answer is no to either of these hypothetical scenarios, then you may have a problem.
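The copy-into-a-new-file scenario is exactly what content fingerprinting in DLP products is meant to catch. Here is a minimal sketch of the idea, assuming plain-text exports; the file names and the 20% threshold are illustrative, and commercial tools add normalization for many formats.

```python
# Minimal exact-match DLP fingerprinting sketch using hashed word shingles.
import hashlib
import re

SHINGLE_WORDS = 8  # size of each overlapping word window


def shingles(text):
    """Yield overlapping word windows from normalized text."""
    words = re.sub(r"\s+", " ", text.lower()).split()
    for i in range(max(len(words) - SHINGLE_WORDS + 1, 1)):
        yield " ".join(words[i:i + SHINGLE_WORDS])


def fingerprint(text):
    """Hash every shingle so protected content can be matched by value."""
    return {hashlib.sha256(s.encode()).hexdigest() for s in shingles(text)}


def overlap(protected_text, candidate_text):
    """Fraction of the candidate's shingles that appear in the protected set."""
    protected = fingerprint(protected_text)
    candidate = fingerprint(candidate_text)
    return len(candidate & protected) / max(len(candidate), 1)


if __name__ == "__main__":
    protected = open("protected_spreadsheet_export.txt").read()  # hypothetical file
    candidate = open("new_file.txt").read()                      # hypothetical file
    if overlap(protected, candidate) > 0.2:
        print("Possible copy of protected content detected")
```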
Markku Rossi
Markku Rossi is the CTO of SSH.COM.
"Configuration management is one of the most critical security requirements when it comes to…"
Managing a company's infrastructure, including on-prem, hybrid, and cloud. If you misconfigure your database (or any other service), or change configuration without understanding the affected components, you might open up your service components (databases, messaging queues, internal APIs, etc.) to the public Internet. And, considering how quickly you can scan the Internet these days, this means that after a configuration error is made, it could take as little as an hour before it's found by security researchers or hackers.
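One way to catch the kind of misconfiguration Rossi describes before the Internet does is to scan your own rules continuously. Here is a minimal sketch, assuming AWS and the boto3 SDK, that flags security group rules open to 0.0.0.0/0; a real configuration-management baseline would also cover databases, queues, storage, and internal APIs.

```python
# Minimal misconfiguration scan: security groups exposed to the whole Internet.
import boto3

ec2 = boto3.client("ec2")

for group in ec2.describe_security_groups()["SecurityGroups"]:
    for rule in group["IpPermissions"]:
        for ip_range in rule.get("IpRanges", []):
            if ip_range.get("CidrIp") == "0.0.0.0/0":
                port = rule.get("FromPort", "all")  # absent when the rule covers all traffic
                print(f"{group['GroupId']} ({group['GroupName']}): "
                      f"port {port} open to the Internet")
```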
Shayne Sherman
Shayne Sherman is the CEO of TechLoris.
"There are a lot of factors that go into keeping data safe in a hybrid cloud situation…"
One of the biggest factors at play is still human error. The best way to stay on top of this is to first make sure that employees go through regular security training. Making sure that everyone is up to date on the latest phishing tactics can be a lifesaver.
You can also take this one step further and ensure that users only have the access they need to successfully do their jobs. Restricting access helps reduce the impact of a phishing attack, should one occur.
Gabe Turner
Gabe Turner is the Director of Content at Security Baron, a website dedicated to cybersecurity.
"Here are some best practices for data security in hybrid environments…"
Use updated anti-virus and anti-malware software so you can be alerted to any security threats. Use a password manager to make sure that everyone has a unique, long, and complicated password for each account. You can also add two-factor or multi-factor authentication for additional security. Two-factor authentication might require the user to enter a passcode from a mobile device, while multi-factor authentication can also involve biometrics like fingerprint or facial recognition. Physically protect your cloud data by keeping backup copies – locally, in a different data center in the same region, or in a different region entirely. Make sure your cloud software encrypts all of your data end to end.
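For readers curious what the "passcode from a mobile device" actually is, here is a minimal sketch of the time-based one-time password (TOTP) algorithm behind most two-factor prompts (RFC 6238), using only the Python standard library. The base32 secret is an example; real deployments use a vetted authentication library or service.

```python
# Minimal TOTP (RFC 6238) sketch: the server and the phone compute the same code.
import base64
import hashlib
import hmac
import struct
import time

SECRET = "JBSWY3DPEHPK3PXP"  # example base32-encoded shared secret


def totp(secret, interval=30, digits=6):
    key = base64.b32decode(secret)
    counter = struct.pack(">Q", int(time.time() // interval))   # 30-second time step
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                   # dynamic truncation
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# The server computes the same code and compares it with what the user typed.
print("Current one-time code:", totp(SECRET))
```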
Uladzislau Murashka
Uladzislau Murashka is an information security expert and a Certified Ethical Hacker at ScienceSoft, an IT consulting and software development company. Uladzislau's spheres of competence include reverse engineering, black box, white box and gray box penetration testing of web and mobile applications, bug hunting, and research work in information security.
"To ensure data security within your hybrid environment…"
Encrypt the data transmitted through the cloud or stored within your network, automate patching and configuration management of the on-premises systems and cloud services in use, and regularly back up your critical data to avoid data loss in case of cyberattacks or system failures.
Doug Howard
Doug Howard is the VP of Global Services at RSA.
"The cloud can be as secure, or insecure, as any traditional IT infrastructure…"
I’m an advocate of digital transformation and of leveraging the cloud within that transformation. However, maintaining good hygiene across digital transformation processes – the cloud, third-party vendor management, and the actual hygiene of your cloud instances – is no less challenging than in traditional IT. Those who say the cloud is inherently more secure are fooling themselves.
Other elements – multi-factor authentication, encryption of data in transit and at rest, proper firewall and edge management – are still important, even in the cloud. Key management helps protect against many hygiene issues. Continuous monitoring and scanning should be layered in where authorized; otherwise, users or systems you never intended to authorize can gain access.
Even when you do everything ‘right’ (invest in security, have great people, and take precautions), a basic hygiene mistake – misconfiguring a firewall, say – can bite you.
Kendal Newman
Kendal Newman, CISSP, is the VP of Infrastructure and Security Operations at Projetech Inc. He manages and oversees the infrastructure and security for all Projetech data centers and systems.
"Hybrid environments give companies the best of both worlds…"
They offer the customizability and flexibility that come with public cloud offerings along with the control found in private cloud environments. Many of the security controls are similar between the public and private cloud. A key concept for hybrid environments is to approach security as a shared responsibility: set clear standards and understand what role each party plays in securing the environment.
Make sure the cloud providers have adequate security frameworks in place and are audited on an annual basis. These frameworks can include items such as ISO 27001 and the NIST Cybersecurity Framework. The cloud provider should be audited against these frameworks yearly to validate that they are following security best practices.
To ease access across hybrid environments, set up an identity and access management framework that protects all environments. Identify where all data is located and protect the data with adequate logging, access management, and encryption.
Make sure all security tools work across on-premises and public cloud environments, and make appropriate configuration changes to adapt them to the public cloud. Many of the remaining security steps are common across all environments, such as enforcing least privilege, enabling encryption to protect all data, and backing up systems on a regular basis.
Reuben Yonatan
Reuben Yonatan is the Founder and CEO of GetVOIP.
"I have a couple of suggestions for securing data transmitted over a network…"
Encrypting each network session is one way to prevent the interception or alteration of data in motion. If your hosts communicate over Internet Protocol (IP), you can use IPsec (IP Security), which leverages cryptography.
If you aren't sure about all the techno-mumbo-jumbo in articles like these, one secure route is to only use products that meet Federal Information Processing Standard (FIPS) Publication 140-2 – the U.S. government standard that cryptographic modules must satisfy to be approved for protecting sensitive data.
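IPsec is configured at the OS and network layer, but the same "encrypt every session" principle applies at the application layer. Here is a minimal sketch, using only the Python standard library, of enforcing a modern TLS session; the hostname is an example.

```python
# Minimal TLS client sketch: verify the certificate and refuse legacy protocols.
import socket
import ssl

HOST = "example.com"

context = ssl.create_default_context()             # verifies certificates by default
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse anything older than TLS 1.2

with socket.create_connection((HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated:", tls.version(), tls.cipher()[0])
```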
Suni Munshani
Suni Munshani joined Protegrity as CEO in May of 2011 to accelerate growth and execute strategies to extend Protegrity’s leadership position in the enterprise data security market. He brings more than 25 years of broad and diverse global business experience to Protegrity.
"Treat cloud vendors like you would any partner…"
If you would not trust your partners to possess your sensitive data in the clear, do not give it to them. If you would not trust them to hold an encryption key, do not give it to them. Address these major questions before moving critical data to the cloud:
- Does the public and/or private cloud provider sufficiently address your enterprise’s security requirements?
- What happens in the event of a data breach?
- What happens if the government subpoenas your data or data co-located with your data?
- Who actually owns the content stored in the cloud?
- If you cancel a cloud subscription, what happens to the data?
- How do we address data residency requirements while using the cloud?
- What are the options relating to BYOS and data-centric security in the cloud?
Implement and manage your own approaches and technology for data security – specifically, before any data is sent to the cloud. These should match at least the level of security that you expect in internal environments, such as databases or file systems.
Make your security systemic. Approach the problem by first selecting technology that can provide the required security services. This also means that, in many cases, you don’t allow the cloud provider to control your data. Instead, you leverage a security approach and technology that spans from your enterprise to the cloud, allowing you to control data security systemically, in any places that the data exists.
The best practices for cloud security are not centered on your public or private cloud provider. They are more about people, processes, and technology.
Shannon Giedieviells
Shannon Giedieviells is the Business Development Manager at Bedrock Cloud Solutions.
"It's important that a business evaluates where it wants its data and applications to live…"
Some businesses go totally off-site with public cloud, and some have a private cloud, but many utilize a hybrid cloud approach.
In terms of data security for highly regulated industries, like healthcare, data can sit on-premises or in a colocation facility where it can be carefully monitored.
It's all about finding a balance: small businesses and enterprises have different needs, and a hybrid cloud solution may not be the right fit for a given company's data; some will need a private cloud instead.
That's why a cloud consultant like Bedrock Cloud Solutions can help businesses navigate the process and evaluate their needs to find a reliable, compliant provider.
Nathanael Coffing
Nathanael Coffing is the Co-Founder of Cloudentity.
"Data security in the hybrid cloud is about access control…"
Why?
In a world of dissolving perimeters, the onus of data security has moved from encryption of data at rest in the soft gooey center of the data center to who can access what and when from a myriad of distributed applications across the partner, customer, and cloud landscape.
Access control has turned from a dirty little secret to a foundational requirement to protect data both at the data store and within the applications themselves.
What’s changed?
Old Model:
- Firewall - Who can talk to this IP?
- Load Balancer - Who can talk to this VIP?
- API Gateway - Is the token presented valid?
- IAM platform - What user/group is the user in?
- Application - What data can I share with the requestor?
Few organizations can present a unified view of the above access controls, but that is what's required to properly secure the hybrid cloud.
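As a rough illustration of what a "unified view" means in practice, here is a minimal sketch of a single policy-decision function that the firewall, gateway, and application layers could all consult instead of each keeping its own rules. The policy entries, groups, and network prefixes are illustrative.

```python
# Minimal unified access-control (policy decision) sketch.
from dataclasses import dataclass


@dataclass(frozen=True)
class Request:
    user: str
    group: str
    resource: str
    action: str
    source_ip: str


# Each policy: (group, resource prefix, allowed actions, allowed network prefix)
POLICIES = [
    ("finance", "reports/", {"read"}, "10.0."),
    ("engineering", "services/", {"read", "deploy"}, "10.1."),
]


def is_allowed(req: Request) -> bool:
    """Allow only if some policy covers the group, resource, action, and network."""
    return any(
        req.group == group
        and req.resource.startswith(prefix)
        and req.action in actions
        and req.source_ip.startswith(network)
        for group, prefix, actions, network in POLICIES
    )


print(is_allowed(Request("alice", "finance", "reports/q3.xlsx", "read", "10.0.4.7")))   # True
print(is_allowed(Request("bob", "finance", "reports/q3.xlsx", "delete", "10.0.4.7")))   # False
```

The value is not the toy logic itself but the single choke point: every layer asks the same question of the same policy, which is what makes the answer auditable.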
Steve Tcherchian
Steve Tcherchian is the Chief Product Officer at XYPRO. He is on the ISSA CISO Advisory Board, the NonStop Under 40 executive board, and is part of the ANSI X9 Security Standards Committee. He is a regular contributor to and presenter at the EC-Council.
The cloud train has already left the station and is well down the track…
The accessibility of the cloud has made it easy for organizations to quickly deploy applications and store data. There are too many market advantages in being able to quickly (and securely) deploy technology to the cloud to ignore it.
This is where the right resources need to be available and the business needs to prioritize the effort. It is the responsibility of every business to follow best practices in terms of security configurations, credential storage, permissions, vulnerability management, monitoring, and more. The same accessibility that many organizations enjoy about the cloud also facilitates crimes of opportunity, as in the Capital One breach. Not following through on this strategy and not having the right skills in place leaves your cloud platform just as insecure as an unsecured on-premises environment.
Storing your data in the cloud does not absolve you of your data security responsibilities or shift them to someone else. If you own data, your business is the one responsible for protecting it, regardless of where it physically resides. The same strategy, controls, and monitoring used to secure your on-premises assets need to be used to secure your cloud infrastructure – but adapted for the cloud.
Dmitry Sotnikov
Dmitry Sotnikov serves as Vice President of Cloud Platform at 42Crunch – an enterprise API security company – and also maintains APISecurity.io, a popular community site with daily API Security news and a weekly newsletter on API vulnerabilities, breaches, standards, best practices, regulations, and tools.
"Data is often said to be the new gold, but even more often, it works as the new uranium…"
It’s extremely valuable for sure, but also potentially highly toxic and outright dangerous when stored in large quantities and poorly guarded. Hybrid environments significantly expand the potential attack surface because in such architectures not only is the data split or replicated between multiple locations and services run by different entities (client, vendor, cloud infrastructure, and other services that the vendor is using), but it’s also exchanged over the internet through publicly available APIs. Below are just a few best practices and potential security aspects to be aware of:
- Figure out where the data resides – Create a full inventory of all the locations and services that get the data. You cannot protect what you are not aware of.
- Assess which data is sensitive – Assign the level of sensitivity to each piece of the data. That can help you in prioritizing and securing the data that is the most critical for your business, and can help make rational decisions on what can go to the cloud or a particular service or location and what cannot.
- How is the data stored and who has access? – How is the data stored in each location? Is it encrypted? Who has access? How is that access provisioned, deprovisioned, regulated, monitored, and audited?
- Are there any services that have API access to the data or its replica? – Don't forget about any auxiliary systems and services that get data access or create replicas of it, such as backup, search, and indexing. Many systems have fallen victim to attackers finding an unprotected backup server or Elasticsearch instance.
- What is the level of API security for the systems with the data access? – Far more often, data gets breached not through direct physical access but via APIs. Hybrid systems rely on APIs to exchange data and invoke the services across their components. APIs give attackers a vastly expanded attack surface and ability to launch their attacks remotely.
- How are API keys stored and protected? – APIs are typically secured and accessed with API keys. These are effectively passwords letting anyone get access to the services and underlying data (!) on your behalf. Beware of API keys leaking through source code repositories, developer workstations, configuration files, reverse-engineered application code, intercepted communications, and so on (a simple repository-scanning sketch follows this list).
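As promised above, here is a minimal sketch of scanning a repository for leaked credentials before they reach a shared remote. The two patterns (an AWS access key ID and a generic hard-coded secret) are illustrative; dedicated secret scanners know many more and add entropy checks.

```python
# Minimal leaked-credential scan over the current working tree.
import pathlib
import re

PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "hardcoded_secret": re.compile(
        r"(?i)(api[_-]?key|secret|token)\s*=\s*['\"][^'\"]{16,}['\"]"
    ),
}

for path in pathlib.Path(".").rglob("*"):
    if not path.is_file() or ".git" in path.parts:
        continue
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        continue  # unreadable file; skip it
    for label, rx in PATTERNS.items():
        if rx.search(text):
            print(f"{path}: possible {label}")
```

Running a scan like this in a pre-commit hook or CI pipeline catches the most common leak path before a key ever needs to be rotated.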
William Taylor
William Taylor is based in California as the Career Development Officer at MintResume.
"In a hybrid environment, full-disk or partition encryption is one of the best ways of protecting your data at rest…"
Opt for an operating system that supports full-disk encryption – for example, Linux with the Linux Unified Key Setup (LUKS) on-disk format. LUKS encrypts your hard drive partitions in bulk so that, while your computer is off, your data is protected.
Bottom line: Full-disk or partition encryption is one of the best ways to secure data at rest.
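LUKS is configured at the operating-system level. As an application-level complement for individual files that leave the encrypted disk, here is a minimal sketch using the third-party cryptography package's Fernet recipe; the file names are illustrative.

```python
# Minimal file-level encryption-at-rest sketch with Fernet (pip install cryptography).
from pathlib import Path

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this in a key manager, never beside the data
fernet = Fernet(key)

plaintext = Path("customer_export.csv").read_bytes()                   # hypothetical file
Path("customer_export.csv.enc").write_bytes(fernet.encrypt(plaintext))

# Later, with the same key:
decrypted = fernet.decrypt(Path("customer_export.csv.enc").read_bytes())
assert decrypted == plaintext
```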
Christopher Gerg
Christopher Gerg is the Vice President of Cyber Risk Management at Gillware. He is a technical lead with over 15 years of information security experience. Christopher has worked as a Systems Administrator, Network Engineer, Penetration Tester, Information Security Architect, Vice President of Information Technology, Director and Chief Information Security Officer.
"At the end of the day, the best practices are unchanged whether the services and data are hosted on-premises, in the cloud, or in a hybrid configuration…"
The challenge is keeping in mind the additional complexity involved while maintaining those best practices: LAN-to-LAN VPN connections, encrypted data transfers, logging and monitoring, and perhaps a separate team that administers the cloud environment.
My advice is mostly to treat it like a hosting provider from which you lease hardware (someone else's computers in someone else's data center, running on someone else's network). Keep things standardized and consolidated if possible – use the same logging and monitoring tools, use existing network interconnection mechanisms (VPN), use the same authentication and authorization mechanisms, integrate the cloud ops team into the central IT team, and so on.
At the same time, it presents an opportunity to leverage some very capable and redundant services hosted in the same cloud for the on-prem servers. It might make sense to have everything log to a central cloud-based logging mechanism, use cloud-based HSM key management tools to handle all encryption keys, use a cloud-based authentication mechanism for everything, or perhaps leverage the database facilities in the cloud instead of hosting them locally.
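As a small illustration of the central-logging idea, here is a minimal sketch that points an application at one log collector over syslog, using only the Python standard library. The collector hostname is hypothetical; production setups typically add TLS and a log-shipping agent.

```python
# Minimal centralized-logging sketch: send application logs to one collector.
import logging
import logging.handlers

handler = logging.handlers.SysLogHandler(address=("logs.internal.example.com", 514))
handler.setFormatter(logging.Formatter("hybrid-app %(levelname)s %(message)s"))

logger = logging.getLogger("hybrid-app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("backup job completed for on-prem file server")
```

The point is that on-prem and cloud hosts end up in the same searchable stream, which is what makes cross-environment monitoring and incident response workable.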
Clever IT departments have found ways to leverage the cloud as a very powerful mechanism for disaster recovery and business continuity: they replicate all of their servers periodically to multiple cloud regions and only turn them on when testing, updating, or recovering from a disaster, paying only for storage while the machines are not running.
My advice is to leverage the cloud for things that it is good for – highly available applications, or demand-based loads (large analysis jobs or websites that have widely varying workloads, for example). At the end of the day, you pay for what you use, so it requires discipline and automation to take advantage of the potentially significant savings (things like the orchestration tool Kubernetes are a great example).
Where it gets a little nuanced is when compliance obligations step in. For example, in HIPAA-relevant organizations hosting services and data in the cloud, encryption becomes vital – data stored and transmitted (even between an organization's internal servers) must be encrypted.
Alberto Pavon
Alberto Pavon owns IT & Data Services, a data recovery service in Mexico.
"When it comes to data security in hybrid IT environments…"
You should always use a different data recovery strategy for your cloud infrastructure than the one you are using for your centralized IT. It is often tempting to use the same legacy recovery method across the board, but those two work differently, and as such they require different methods.
Also, testing is critical. Make sure to test your data recovery plan regularly, and simulate data outages to keep on top of different process changes that naturally occur over time. Finally, failing to plan is planning to fail. This old saying really applies to hybrid IT. Bad things always happen when a proper operational and data recovery plan is not put together. From computers going down, to people losing their data, all the way to privacy violations, you can avoid those nasty things by properly planning data security for your hybrid IT systems.
Sage Driskell
Sage Driskell is an IT Security Engineer at The 20.
"Highly secure cloud environments limit data exfiltration by controlling where the data can actually be saved or accessed…"
A user can theoretically export data, but it is much harder to exfiltrate a large amount of it. These security settings mean absolutely nothing in a hybrid environment where the on-premises side does not match the cloud environment (with no further security measures or policies in place), or where information can flow between the two unobstructed. Securing these environments so that they either match or keep certain data only in certain places reduces what an attacker can get from a given compromise and keeps employee espionage from being as efficient.
Even something as simple as IP whitelisting can make a world of difference. Using a VPN (either site-to-site, or a VPN into the cloud environment) can make things even more secure. Your cloud environment almost assuredly has privileged access to your on-premises piece, and your on-premises appliance or environment almost assuredly has even further privileged access within the site. A dizzying number of organizations try to be secure but fail to implement something this basic, leading to serious compromises.
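For readers who want to see how little code basic IP whitelisting takes, here is a minimal sketch using only the Python standard library. The CIDR ranges are documentation examples; in practice they would be your office, VPN, and on-prem egress ranges.

```python
# Minimal IP allowlist sketch for an application edge or admin endpoint.
import ipaddress

ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),    # e.g., corporate VPN egress range
    ipaddress.ip_network("198.51.100.10/32"),  # e.g., on-prem appliance address
]


def is_allowed(client_ip: str) -> bool:
    """Return True only if the client address falls inside an allowed network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in ALLOWED_NETWORKS)


print(is_allowed("203.0.113.45"))  # True
print(is_allowed("192.0.2.7"))     # False
```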
Hybrid environments also suffer from the fact that you can only fully control one part of the environment. The cloud provider must be properly sourced or else their compromise is your compromise. All the security in the world is completely useless if the backdoor is wide open. You are only as strong as your weakest link.