The Data Lake Hiding in Your Backups
DON 10:30 - 11:00
Data warehouse projects have an alarmingly high failure rate – leading analysts estimate anywhere from 50% to 64% over the last 15 years. These projects don’t fail from a lack of good tools in the market. They fail because of the time, expense, and complexity of centralizing, storing, updating, and transforming enterprise data so that it can be unlocked for other uses.
But what if you had already built a data lake covering all of your production workloads, with data stretching back for months or years and updated daily? What if this data lake was accessible, easily portable, and available for analysis using any set of tools? And what if the expense and staffing to create and maintain this data lake was already part of your budget and had been for years?
Your organization’s backup data represents exactly this resource. We all keep backups to recover from outages, human error, malware, and disasters. But imagine you could take advantage of your backup data to:
- Analyze historical data sets with no impact on production workloads
- Migrate workloads into the cloud to take advantage of cloud-based analytics resources
- Fully automate DevOps processes using daily backup data
- Perform ransomware scans, compliance checks, and audits
- Comply with GDPR data subject access requests (DSARs)
- And numerous other use cases…