
Insights from recent episode analysis
Insights are generated by CastFox AI using publicly available data, episode content, and proprietary models.
Est. Listeners (based on iTunes & Spotify publisher stats)
- Per-Episode Audience (est. listeners per new episode within ~30 days): 10,001–25,000
- Monthly Reach (unique listeners across all episodes, 30 days): 25,001–75,000
- Active Followers (loyal subscribers who consistently listen): 15,001–40,000
Market Insights
Platform Distribution
Reach across major podcast platforms, updated hourly
- Total Followers: —
- Total Plays: —
- Total Reviews: —
* Data sourced directly from platform APIs and aggregated hourly across all major podcast directories.
On the show
Hosts (from 10 episodes)
Recent guests
Recent episodes
Can Your MDM Strategy Survive the Shift to Real-Time AI Decision-Making?
Apr 30, 2026
Unknown duration
Why Is the Semantic Layer Critical for Data Governance, Compliance, and AI at Scale?
Apr 20, 2026
27m 13s
AI Is Replacing BI — Here’s What CIOs Need to Know
Apr 8, 2026
29m 12s
Why Data Quality Makes or Breaks AI Success in Supply Chain and Procurement
Mar 24, 2026
32m 08s
Revenue-Ready Data Is Not Magic, It’s Engineering
Mar 19, 2026
30m 10s
Social Links & Contact
Official channels & resources
Official Website
RSS Feed
| Date | Episode | Topics | Guests | Brands | Places | Keywords | Sponsor | Length | |
|---|---|---|---|---|---|---|---|---|---|
| 4/30/26 | Can Your MDM Strategy Survive the Shift to Real-Time AI Decision-Making? | Podcast: Don’t Panic! It’s Just Data. Guests: Jignesh Patel, Director of Product Strategy at Stibo Systems, and Elsebeth Gundersen Jensen, Product Owner at Nets. Host: Dr Joe Perez, Data Analytics Expert and Amazon Bestselling Author. We’re living in an always-on digital economy with no room for data errors. In a recent episode of the Don’t Panic It’s Just Data podcast, host Dr Joe Perez sat down with Jignesh Patel of Stibo Systems and Stibo Systems’ customer Elsebeth Gundersen Jensen of Nets. Perez pointed out that even the smallest inconsistency can "ripple completely across an entire operation, instantaneously." This reality is prompting enterprise tech leaders to rethink how they manage, govern, and use data, especially with the rapid growth of AI adoption. Overall, the guests send a clear message: trusted, real-time data is now a crucial part of business infrastructure. Also Watch: From Chaos to Launch: Your Product is Ready, Your Data Isn't. What is the Hidden Cost of Untrusted Data? For large enterprises, especially those growing through mergers and acquisitions, fragmented data systems are almost unavoidable. Jensen noted that when combining multiple customer portfolios, inconsistencies often arise in even the simplest fields, like organisation numbers formatted differently in various systems. “When you bring in different customer portfolios, you will also get this scattered data picture that you don’t want in a master data management system,” she explained. According to Patel, the lack of trusted data impacts four key areas: customer experience, revenue growth, decision-making, and operational efficiency. Without a unified customer view, enterprises struggle to offer personalised experiences or spot cross-sell opportunities.
Moreover, analytics based on unreliable data undermine executive confidence and increase compliance risks. These issues are made worse by speed. Jensen told Perez and Patel that modern customers expect contract changes or service interactions to be updated almost instantly. “They don’t want to wait a day,” she stated. “Everything should be faster, better, and accurate.” Also Watch: Why is a Customer Data Strategy a Competitive Edge? How are Enterprises Mastering Intelligence? Traditionally, Master Data Management (MDM) has focused on creating the “golden record,” a single, reliable version of key business entities like customers or products. While this remains important, Patel believes the idea is changing quickly in the AI era. “MDM is moving beyond data correctness towards what I call mastering intelligence,” he said. “AI systems rely on trusted context—understanding what entities are, how they relate, and the business rules that apply.” This change is part of a larger transformation in enterprise architecture. Decision-making is no longer limited to human-driven dashboards; it is increasingly spreading across applications, analytics platforms, and AI agents acting in real time. In such a setup, inconsistent data does not just create errors; it amplifies them. “AI doesn’t eliminate the need for MDM or data governance. It emphasises it,” stated Patel. For enterprises heavily investing in AI, this insight is vital. Without a strong data foundation, AI models might provide insights but not dependable results. As enterprises move toward AI-driven and even agent-based business models, the need for trusted data will only grow. Patel highlights new questions from the C-suite: How will AI agents find my products? Why isn’t my business being recommended? The answer increasingly depends on structured, high-quality data. “AI success is dependent on trustworthy data,” says Patel.
“MDM and governance are the foundation for the next generation of intelligent business systems.” For enterprise leaders, the key directive is this: in the race to implement AI, data trust is the competitive edge, not merely a requirement. Key Takeaways: Real-time trusted data is essential for enterprise AI success and operational resilience. Poor data quality directly impacts customer experience, revenue growth, and compliance. Modern Master Data Management (MDM) is evolving from “golden records” to AI-ready data intelligence. Proactive data governance must replace reactive data cleanup to scale in real-time environments. A unified data model is the foundation for accurate, consistent, and AI-driven business insights. Chapters: 00:00 Introduction to Data Governance and MDM; 02:06 The Shift to Real-Time Data; 05:27 Business Risks of Lacking Trusted Data; 08:20 Growth Through Mergers and Acquisitions; 15:29 The Role of MDM in AI Initiatives; 20:02 Transitioning to Proactive Data Management; 22:01 Advice for CIOs on Managing Product Data. For more information, please visit em360tech.com and stibosystems.com. To learn more about AI in the MDM space, follow: Stibo Systems LinkedIn: @StiboSystems; Stibo Systems X: @StiboSystems; Stibo Systems YouTube: @StiboSystemsGlobal; EM360Tech YouTube: @enterprisemanagement360; EM360Tech LinkedIn: @EM360Tech; EM360Tech X: @EM360Tech. #MDM #DataGovernance #EnterpriseAI #DataQuality #TrustedData #AIStrategy #RealTimeData #DigitalTransformation #StiboSystems #TechPodcast | — | ||||||
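Jensen’s point about organisation numbers formatted differently across source systems is the canonical MDM merge problem. As a rough illustration only (generic field names and example values, not Stibo Systems’ actual matching logic), consolidating duplicates into a “golden record” might look like:

```python
from datetime import date

def normalize_org_number(raw: str) -> str:
    """Canonicalise an organisation number: drop separators, unify case."""
    return "".join(ch for ch in raw if ch.isalnum()).upper()

def golden_record(records: list[dict]) -> dict:
    """Merge duplicate records into one golden record.
    Survivorship rule (illustrative): for each field, keep the latest
    non-empty value, so newer sources win where they have data."""
    merged: dict = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                merged[field] = value
    return merged

# The same customer as seen by two source systems, with the
# organisation number formatted differently in each:
sources = [
    {"org_no": "12-345-678", "name": "Example A/S", "email": "",
     "updated": date(2025, 1, 5)},
    {"org_no": "12 345 678", "name": "", "email": "ops@example.com",
     "updated": date(2025, 3, 1)},
]

# Normalisation reveals that both rows describe one entity...
assert {normalize_org_number(r["org_no"]) for r in sources} == {"12345678"}

# ...and merging yields a single, most-complete record.
golden = golden_record(sources)
```

A real MDM platform layers fuzzy matching, stewardship workflows, and governance on top of this, but the core idea of normalise, match, and survive fields is the same.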
| 4/20/26 | Why Is the Semantic Layer Critical for Data Governance, Compliance, and AI at Scale? | data governance, compliance +2 | Adrian Estala | Starburst, BARC +1 | Europe, Switzerland +1 | data lakes, data warehouses +1 | — | 27m 13s | |
| 4/8/26 | AI Is Replacing BI — Here’s What CIOs Need to Know | AI, Business Intelligence +1 | Adrian Estala | Pathfinder, Starburst +10 | — | CIOs, enterprise tech +1 | — | 29m 12s | |
| 3/24/26 | Why Data Quality Makes or Breaks AI Success in Supply Chain and Procurement | data quality, AI +2 | Pascal Bensoussan | Ivalua, Dare to Data +2 | — | technology, business +3 | — | 32m 08s | |
| 3/19/26 | Revenue-Ready Data Is Not Magic, It’s Engineering | artificial intelligence, machine learning +2 | Paul Brownell, Sergio Morales | machine learning, generative AI +2 | — | AI strategies, data governance +1 | — | 30m 10s | |
| 3/18/26 | How to Navigate the Trust Paradox in AI Adoption: Insights from Informatica’s 2026 CDO Report | AI adoption, data governance +3 | Kevin Petrie, Nathan Turajski | Informatica, BARC +6 | — | CDO Insights 2026, enterprise AI budgets +2 | — | 27m 39s | |
| 3/10/26 | Are You Scaling Intelligence — or Just Scaling Errors? | AI, data management +2 | Herb Blecher | AI, machine learning +5 | U.S. | data analytics, machine learning +2 | — | 27m 51s | |
| 1/30/26 | Is AI Analytics the Missing Link Between Business Users and Data Teams? | AI Analytics, Data Teams +2 | Barry McCardel | Hex, Hex Technologies +2 | — | data trust, self-service +1 | — | 36m 20s | |
| 1/14/26 | How To Scale AI in Digital Commerce Effectively | AI, digital commerce +2 | Jürgen Obermann, Piotr Kobziakowski | Lucene, Vespa +5 | — | scaling AI, real-time +1 | — | 25m 12s | |
| 1/13/26 | The Modern CFO is the Product Owner of Data | CFO, data ownership +3 | Pavel Dolezal, Vineta Bajaj | Keboola, B2B Tech +6 | London | data governance, financial leadership +1 | — | 22m 30s | |
| 12/11/25 | Responsible AI Starts with Responsible Data: Building Trust at Scale | Responsible AI, Data Trust +1 | Amy Horowitz | Informatica, BARC +2 | Redwood City, California | AI, data +3 | — | 26m 00s | |
| 12/11/25 | The Missing Piece: How Data and AI Impact Management Unlocks Business Value | “What is the true value of our data and AI initiatives?” Too often, we pour all our energy into tools, processes, and outputs, but forget to ask how what we build actually makes a difference. For enterprises, this means looking beyond AI models and dashboards to see how data drives real, measurable impact. Understanding the difference between output and outcome is what separates activity from transformation. In this episode of Don’t Panic, It’s Just Data, host Doug Laney and Nadiem von Heydebrand, CEO and Co-founder of Mindfuel, explore how organisations can turn data and AI efforts into actionable business outcomes. They discuss the concept of the “value layer”, a framework connecting data initiatives to business needs, emphasising the importance of understanding business problems before developing solutions. Nadiem stresses that prioritising initiatives and fostering strong collaboration between business and data teams are critical to unlocking maximum value from data and AI efforts. Why Data and AI Impact Management Matters: Many organisations are investing heavily in data and AI, but turning these investments into real business value remains a challenge, because a critical gap exists between technical execution and business outcomes. Data and AI teams work on initiatives without first clarifying what business problems they're solving or how success will be measured. Data and AI Impact Management bridges this gap by establishing the “value layer" between business strategy and technical platforms. This approach starts with structured demand management for use cases, enables systematic prioritisation based on actual value potential, and tracks initiatives throughout their lifecycle to ensure they deliver impact against business goals.
This shift, from building solutions in search of problems to solving qualified business problems with purpose-built solutions, transforms data and AI teams from technical support functions into strategic partners who deliver value, stronger strategic alignment, and lasting competitive advantage. Nadiem says, “Applying a product mindset within data initiatives is key, and it's the foundational effort to be able to drive value.” He also notes that not every use case delivers direct financial impact, and the value layer helps clarify demand, manage use cases effectively, and uncover each initiative’s business value. For more insights and solutions, visit Mindfuel. Takeaways: Organisations struggle to connect data initiatives to business outcomes. The value layer is essential for linking data to business demands. Understanding the actual business problem is crucial for success. Value management encompasses the entire lifecycle of initiatives. A product mindset helps focus on outcomes rather than outputs. Not all data use cases have direct dollar values. Data and AI impact management creates transparency for data teams. Establishing a product mindset is key for data products. Connecting processes to the operating model enhances effectiveness. Collaboration between business and data teams is vital for unlocking value. Chapters: 00:31 Introduction: Don't Panic, It's Just Data; 01:37 The Missing Piece: Introducing the Value Layer; 07:11 Value Management Lifecycle; 10:46 Product Mindset in Data Initiatives; 14:10 Distinguishing Value and Impact; 17:04 Impact Management and Investment Justification; 19:34 Mindfuel's Three-Step Guide to Impact Management; 21:00 Conclusion and Key Takeaways. About Mindfuel: Mindfuel is a data and AI impact management platform that gives data, analytics and AI teams a single source of truth to prioritise high-impact use cases, connect initiatives to business outcomes, and demonstrate ROAI.
It replaces scattered tools and reactive, manual processes with a structured approach to managing use cases and data and AI products. This enables organisations to reduce business case bias, eliminate inefficiencies, and clearly communicate the value of AI initiatives, driving enterprise-wide trust, transparency, and impact. | — | ||||||
| 12/9/25 | The AI-Ready Data Core: Creating the Foundation for Intelligent Systems | As AI becomes a central pillar of business decision-making, enterprises face a new challenge: making their data AI-ready. It’s no longer enough to collect and digitise information. Data must be structured, contextualised, discoverable, and usable—both by humans and intelligent systems. AI can only deliver if your data is truly ready, but most enterprises are drowning in fragmented, incomplete, or slow-to-update data. In this episode of Don't Panic, It's Just Data, host Doug Laney and Sushant Rai, Vice President of Product, AI and Data Strategy at Reltio, explore how modern data unification strategies are changing enterprises, enabling AI to deliver faster, more reliable insights. They focus on the shift from traditional Master Data Management (MDM) to next-generation AI-ready data cores, uncovering the risks of fragmented data and the strategies to overcome them. Why AI-Ready Data Matters: AI, especially large language models (LLMs), is changing how people interact with data. Analysts, executives, and frontline teams now expect natural language queries and instant, actionable insights. Sushant explains: "AI performs at its best when it has full context, empowered with the right data. This allows AI agents to make decisions and take actions on behalf of your business." When you embed intelligence into your data layer, AI can help you manage and scale your data without drowning your teams in manual work. This only works if your data is structured, clean, governed, and constantly updated: everything that makes it truly AI-ready. The Data Scale Challenge: The volume of data generated daily is staggering. As Sushant notes: "The amount of data getting generated every single day is so massive that there’s no way to keep up without AI.
Even the largest organizations, with massive data stewardship teams, can’t catch up manually." This gap is driving change in modern data platforms, where AI automates stewardship, enriches data continuously, detects anomalies, and maintains quality in real time. Want to learn more about modern data unification and AI-ready platforms? Visit Reltio.com for insights, resources, and case studies. Takeaways: Data unification provides a trusted, real-time view of key business elements. Organizations must balance speed and trust in data management. Classic MDM is evolving into modern data unification platforms. Real-time data access is crucial for AI and analytics. AI can enhance data quality and governance processes. Successful data initiatives require clear business outcomes and ownership. Data unification should be viewed as a business platform, not just an IT project. AI agents will play a significant role in automating data governance. Organizations need to focus on both structured and unstructured data. The future of data management involves continuous unification and enrichment of data. Chapters: 00:00 Introduction to Data Unification and AI; 07:52 The Importance of Data Unification in Enterprises; 15:44 AI and Data Quality Management; 23:20 Organizational Success Factors for Data Initiatives; 25:16 Future Trends in Data and AI. About Reltio: At Reltio, we believe data should fuel your success in the enterprise AI era. Reltio Data Cloud™ is the agentic data fabric for the enterprise—powering real-time data intelligence and AI transformation. Reltio’s cloud-native SaaS platform delivers unified, trusted, and context-rich data across domains in real time. With Reltio, organizations gain 360-degree views of customers, products, suppliers, and more—mobilized in milliseconds to any application, user, or AI agent.
Trusted by the world’s largest enterprises across life sciences, financial services, healthcare, technology, and more, we help organizations fuel frictionless operations, drive innovation, and reduce risk. | — | ||||||
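The episode’s claim that AI-era platforms must “detect anomalies and maintain quality in real time” can be made concrete with a tiny, library-free sketch (illustrative only, not Reltio’s implementation): a robust volume check that flags days whose record counts deviate sharply from the median.

```python
from statistics import median

def flag_anomalies(daily_counts: list[int], k: float = 10.0) -> list[int]:
    """Return indices of days whose record volume deviates from the
    median by more than k times the median absolute deviation (MAD).
    A median/MAD test is robust: one broken day cannot drag the
    baseline the way it would with a mean/stdev test."""
    med = median(daily_counts)
    mad = median(abs(c - med) for c in daily_counts)
    return [i for i, c in enumerate(daily_counts) if abs(c - med) > k * mad]

# Day 4's feed collapsed; an automated stewardship check should notice.
counts = [1000, 1015, 990, 1008, 120, 1003]
print(flag_anomalies(counts))  # [4]
```

Production stewardship layers run checks like this continuously across every feed; the point here is only that the underlying logic is small and cheap.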
| 11/18/25 | From Data Steward to AI Strategist: Redefining the Role of the CDO in the Agentic Era | While the role of the chief data officer (CDO) was traditionally focused on regulatory compliance, it has now expanded to empowering the consistent and effective use of data across organizations to improve business outcomes. One of the most effective ways for CDOs to demonstrate their value is by developing a data strategy that is closely aligned with business goals, processes, and outcomes. In the latest episode of Tech Transformed, host Kevin Petrie, VP of Research at BARC, speaks with Brett Roscoe, Senior Vice President and GM of Cloud Data Governance and Cloud Ops at Informatica, about the evolving role of CDOs. Their conversation explores how CDOs are transitioning from data stewards to strategic leaders, the importance of data governance, and the challenges of managing unstructured data. The Role of the CDO in the Agentic Era: As Roscoe notes, “CDOs are now pivotal in AI strategy,” reflecting how the role has grown from compliance oversight to guiding enterprise initiatives that directly support organizational goals. Today, CDOs are tasked with ensuring that data is both accessible and reliable, providing a foundation for informed decision-making across business units. This includes establishing policies for data quality, access, and governance, which Roscoe highlights as essential: “data governance is foundational for AI.” At the same time, unstructured data, ranging from documents and emails to multimedia, adds complexity that requires careful management to make it useful while minimizing risk. “Unstructured data presents challenges,” he adds, emphasizing the need for structured oversight to fully leverage these assets. AI Strategy: Although technology and analytics are evolving rapidly, the CDO’s role in aligning data with strategic initiatives is critical.
By connecting data assets to business processes, CDOs help ensure that initiatives are informed by reliable, well-governed information and can deliver measurable results. For anyone looking to understand the evolving responsibilities of CDOs, the importance of governance, and strategies for handling unstructured data, this episode of Tech Transformed provides a detailed and practical discussion. For more insights, follow Informatica: X: @informatica; Instagram: @informaticacorp; Facebook: https://www.facebook.com/InformaticaLLC/; LinkedIn: https://www.linkedin.com/company/informatica/. Takeaways: CDOs are now central to shaping AI strategies and driving business growth. Robust data governance is crucial for the successful deployment of AI technologies. Unstructured data presents unique challenges and opportunities for AI development. A balance between centralized governance and federated operations is essential. Securing executive support is vital for the success of CDO-led initiatives. Engaging business stakeholders enhances the impact of AI projects. Demonstrating ROI through clear metrics is key to sustaining AI investments. AI governance must extend beyond data to include models and agents. New measures are needed to ensure the quality and governance of unstructured data. CDOs must navigate the tension between fostering innovation and maintaining governance standards. Chapters: 00:00:00 Introduction to the Podcast and Guests; 00:03:00 Brett Roscoe's Background and Role; 00:06:00 The Evolving Role of CDOs; 00:09:00 Data Governance as a Foundation for AI; 00:12:00 Challenges with Unstructured Data; 00:15:00 Governance Frameworks for AI and Data; 00:18:00 Centralization vs. Decentralization in Data Governance; 00:21:00 CDO Strategies for Success; 00:24:00 Conclusion and Future Outlook. About Informatica: Informatica, founded in 1993, is an enterprise data management company headquartered in Redwood City, California.
The company provides software products for data integration, data quality, master data management, and data governance. With approximately 9,000 global customers across various industries, Informatica has positioned itself as a significant player in the data management market. | — | ||||||
| 11/6/25 | Is Your Financial Reporting Ready for the Future? | The challenge all organisations, big and small, face is answering one key question and implementing solutions for it: How can finance and accounting teams work faster, smarter, and more accurately? In a recent episode of the Don’t Panic It’s Just Data podcast, host Scott Taylor, The Data Whisperer and Principal Consultant at MetaMeta Consulting, speaks with Kevin Gibson, CPA and Principal Solutions Engineer at insightsoftware. They talk about the constantly changing nature of financial reporting, discuss the pros and cons of modern financial reporting, and cover the importance of connecting financial data with familiar tools like Excel. The conversation also touches on the future of financial reporting technology and the need for organisations to adapt to changing data access needs. Uncertainty in a Data-Driven World: “With all this uncertainty, companies are being asked to look at their data in different ways. They want to pivot it, slice it, and dice it,” Gibson tells Taylor, encapsulating the theme of this episode. “They’re being told to do more with the data — what does it mean, how do we read it, how do we understand it, how do we analyse it?” The issue is that, as enterprises invest in digital transformation, finance teams struggle most with limited access to the data they need to support their analysis. “The ideal state,” Gibson adds, “would be: I can get what I want, when I want, and how I want it — without asking questions. But let’s be honest — that doesn’t exist today.” The good news, Gibson says, is that the data exists. The ugly part is that organisations can’t get to it. Many data accessibility issues have been attributed to cloud migration. “When you move your data to the cloud, you think: it’s cheaper, it’s more secure, it’s easier to maintain. But here’s the problem: you don’t control it anymore. Some cloud providers make access difficult or costly.
So finance teams feel stuck,” he explains. Also Watch: Stop Fighting Excel: How to Turn Your Spreadsheets into a Real-Time Reporting Powerhouse? Real-Time Access in Excel: For decades, finance professionals have relied on Excel, which Gibson refers to as the “largest data warehouse in the world.” “There are 1.1 billion users of Excel today,” he says. “And let’s be honest, I haven’t met an accountant yet who says they hate it.” Finance thrives in Excel, but IT often views it as a risk, leading to a constant back-and-forth between usability and control. Gibson believes the solution is to equip both sides, finance and IT, with real-time, governed data inside Excel. That’s where insightsoftware comes in. “We can connect directly to these systems and give finance teams back their real-time access — not just to pieces of data, but all of it,” says Gibson. “Literally every piece of data can be accessed.” With tools like Spreadsheet Server, finance professionals can work in Excel — their “comfort food,” as Gibson calls it — while drawing directly from live ERP data in the cloud. “We give them insight — that’s what our software does. It gives them visibility into their data. Excel isn’t going away, and our job is to make it work even better.” To learn more, watch or listen to the podcast on EM360Tech. Also Watch: Struggling with ERP Data?
How to Get Real-Time Reporting in Excel. Takeaways: Finance professionals are facing increased pressure to analyse data amidst uncertainty. The ideal state for finance teams is immediate access to data without barriers. Modern financial reporting has its good, bad, and ugly aspects, primarily revolving around data accessibility. Different industries have unique data needs and KPIs that impact financial reporting. Excel remains a critical tool for finance professionals despite the rise of cloud-based solutions. Organisations must find ways to connect their financial data with familiar reporting tools. The future of financial reporting will continue to evolve around Excel and data accessibility. Companies are beginning to realise the limitations of cloud-based systems and may revert to on-prem solutions. AI is emerging as a significant factor in data access and reporting. C-level executives should engage with finance teams to ensure they have the tools they need. Chapters: 00:00 Introduction to the Podcast and Guest; 00:59 Understanding the Role of Finance in Uncertain Times; 04:00 The Good, Bad, and Ugly of Financial Reporting; 08:13 Types of Data and Industry-Specific Challenges; 10:00 Connecting Financial Data with Reporting Tools; 13:59 The Future of Financial Reporting and Technology; 17:58 Key Takeaways for C-Level Executives | — | ||||||
| 10/29/25 | How enterprises can enable the Agentic AI Lakehouse on Apache Iceberg | "A flaw of warehouses is that you need to move all your data into them so you can keep it going, and for a lot of organisations that's a big hassle,” says Will Martin, EMEA Evangelist at Dremio. “It can take a long time, it can be expensive, and you ultimately can end up ripping up processes that are there." In this episode of the Don’t Panic It’s Just Data podcast, recorded live at Big Data LDN (BDL) 2025, Will Martin joins Shubhangi Dua, Podcast Host and Tech Journalist at EM360Tech. They talk about how enterprises can enable the Agentic AI Lakehouse on Apache Iceberg and why query performance is critical for efficient data analysis. "If you have a data silo, it exists for a reason—something's feeding information to it. You usually have other processes feeding off of it. So if you shift all that to a warehouse, it disrupts a lot of your business," Martin tells Dua. This is where a lakehouse comes into play. Organisations can federate their access through a lakehouse approach: they centralise access through the lakehouse while keeping data in its original location, which helps people get started quickly. In terms of data quality, if you access everything from one location, even with separate data silos, you can see all your data. This visibility allows you to identify issues, address them, and enhance your data quality. That’s beneficial for AI, too, Martin explains. Is the Lakehouse Key to AI Infrastructure? The lakehouse has been recognised for unifying and simplifying governance. An imperative feature of a lakehouse is the data catalogue, which helps an organisation browse and find information.
It also secures access and manages permissions. "You can access in one place, but you can do all your security and permissions in one place rather than all these individual systems, which is great if you work in IT,” reflects Martin. "There are some drawbacks to lakehouses. A big component of a lakehouse is metadata. It can be quite big, and it needs managing. Certain companies and vendors are trying to deal with that." With AI and AI agents, it has become even harder to optimise analytics on a lakehouse. However, this is improving as technical barriers disappear. Martin explains that anyone can now prompt a question; for instance, an enterprise CEO could ask questions about the data and demand justifications directly. In the past, a request would have to be submitted, and then a data scientist or engineer would create the dataset and hand it over. Now, engineers' roles have shifted to focus on better optimisation: helping queries run smoothly and ensuring tables are efficient. Agents cannot assist with that. Also Listen: Dremio: The State of the Data Lakehouse. Optimising the Lakehouse: Vendors such as Dremio provide services to manage and optimise lakehouses, offering autonomous features to help set up the workflow. Martin says that in many cases, Dremio learns from clients’ actions and improves their system. “This is evident in our reflections, which are optimised datasets that speed up performance,” he added. “In other situations, we handle tasks like file compaction and garbage collection, which are often less exciting for engineers. Now, there’s no need for engineers to manage those tasks, which benefits everyone.” As a lakehouse provider, Dremio is Iceberg native. They began their journey as a lakehouse provider and continue down this road.
Now the industry has shifted to focus on lakehouses too, first with Snowflake and now even Databricks, which developed its own format with Delta Lake. The ultimate goal is to incorporate more features: permissions, governance, and fine-grained access control. “These capabilities are things vendors typically sell, but they will soon become widely available for free,” Martin tells Dua. Learn More: Visit dremio.com for more information on open data lakehouse technology. Key Takeaways: Agentic AI and Apache Iceberg are current hot topics. Lakehouses offer quicker, less disruptive data access for AI compared to data warehouses. Centralised access in a lakehouse improves data quality and simplifies AI integration. Lakehouses, with their data catalogues, ease governance and permission management for AI agents working with sensitive data. Apache Iceberg is resolving metadata format issues, though metadata management remains an overhead. Dremio, an Iceberg-native provider, champions open source and interoperability, offering autonomous optimisation features to free engineers from mundane tasks. Beyond technology, a robust data strategy is crucial for organisational data improvement. Agentic AI will evolve to handle more delegated, multi-step tasks with less supervision. The open-source ecosystem will see consolidation and improved features, making advanced catalogue and governance tools widely available. Ultimately, for IT decision-makers, the quality of data is paramount for all analytical endeavours, including AI. Chapters: 0:00 Introduction to Agentic AI; 0:35 Discussing Big Data London, Hot Topics: Agentic AI and Apache Iceberg; 1:37 Data Lakehouse vs.
Data Warehouse for AI; 2:30 Data Quality and AI with a Lakehouse; 3:18 AI Agents and Sensitive Data: Governance with a Lakehouse; 4:19 Challenges and Solutions in Lakehouse Technology (Apache Iceberg); 5:47 Dremio's Use Cases and Interoperability; 7:40 Dremio's Standout Features and Autonomous Optimisation; 9:39 The Importance of Data Strategy; 10:29 Future of Agentic AI; 11:34 Future of the Open-Source Ecosystem; 12:51 Final Takeaway for IT Decision Makers: Data Quality is Critical; 13:51 Conclusion | — | ||||||
| 10/27/25 | The Real Future of Data Isn’t AI — It’s Contextual Automation | At Big Data LDN (BDL) 2025, Keboola CEO Pavel Dolezal presented a new data agent designed for all business users, not just engineers. With a mission to make AI, automation, and data easy to access, relevant, and useful across the organisation, Dolezal revealed that the data agent is embedded with contextual intelligence and generative AI. “While we typically assist data engineers with building the pipeline, we took the same data agent and built a different environment for it — a chat-like environment. By default, the chat has context, knows what to do, knows where not to go,” the Keboola CEO explained on the Don’t Panic It’s Just Data podcast. In the episode, recorded live at BDL, Dolezal spoke with Christina Stathopoulos, Founder of Dare to Data. They talked about the new Keboola Data Agent and how it plays a key role in AI-backed change and the growth of large language models (LLMs) in business. Context Matters More for an AI-Backed Data Strategy: “Anyone can be technical now. It’s context that matters,” stated Dolezal, also the co-founder of Keboola. He presented a strong argument for why enterprise data strategies are falling short and how a new wave of smart tools will change that. “The pipeline of what you can do is limitless if you build it for people in business,” he added. “You can't keep data and AI just in the hands of engineers anymore. That model doesn’t scale.” As businesses face a growing number of data sources — sometimes over 300 SaaS platforms and more than 80 departments — managing, governing, and activating that data has become a challenge. The appealing promise of AI often adds another layer of complexity. When AI Adds More Complexity: Enterprise leaders were told that AI would simplify data workflows.
Instead, many found themselves managing disconnected tools and failed pilot projects. “We all read the MIT study. 95 per cent of AI proofs of concept don’t make it to production,” the Keboola CEO highlights. “Why? Because large language models (LLMs) need context. And enterprise data environments are anything but simple.” At Keboola, context is crucial, he emphasised. It includes not just metadata but also event logs, debug trails, and orchestration details, including the complete story behind every data product. “LLMs thrive on context. The more relevant context you provide, the better the outcome. But in today’s data stack, where your context is spread across 15 tools, that's nearly impossible.” This is where Keboola’s new Data Agent comes in: a generative AI interface built directly into data workflows, capable of understanding and acting on both the structure and the state of a company’s data. Watch the podcast for further insights on EM360Tech. Key Takeaways: Focus on context and domain knowledge rather than technical skills. Technical skills are now easy to acquire; understanding business processes matters more. Agents will run business processes both internally and externally. Infrastructure is needed to support agents. Provisioning ad hoc environments remains a challenge. Use agents to automate business processes. Data governance is important. Chapters: Introduction & Keboola's Mission: 0:00 · Challenges in Modern Data Management: 2:23 · Impact of Complexity on Teams: 4:04 · LLMs in Data: Potential and Pitfalls: 5:50 · Keboola Data Agent: Bridging the Gap: 8:12 · Evolution of Keboola Data Agent: 13:00 · Future Vision for LLMs & Agents: 15:05 · Key Takeaway for Leaders: 17:12 | — | ||||||
| 10/24/25 | Why Unstructured Data Governance is the Key to Scaling AI | In an ever-changing business climate, companies have begun to shift their focus to unstructured data. In the past, unstructured data was challenging to deal with because of its volume and its governance and compliance demands, so organisations mainly focused on structured datasets. However, with the rise of generative AI and large language models (LLMs), Reece Williams Griffiths, Field CTO of Collibra, says that we can no longer overlook 80 per cent of enterprise content, from transcripts and PDFs to emails and images. In this episode of the Don't Panic It's Just Data podcast, host John Santaferraro, CEO and Head Research Analyst at Ferraro Consulting, talks with Griffiths, also Co-Founder and CEO of Deasy Labs (acquired by Collibra), about how the Deasy Labs acquisition has changed Collibra. Governing Structured & Unstructured Data: Following Collibra’s acquisition of Griffiths’ firm, Deasy Labs, he explains how the merger is making AI truly achievable for businesses. Deasy became known for its goal of simplifying data preparation; with Collibra, it is leading the development of the tools needed to create order from the chaos and build a unified AI enterprise. Together, they created the first unified governance and catalogue platform for both structured and unstructured data. This single-hub approach is vital for a future where AI agents treat all data types equally. Griffiths tells Santaferraro that, historically, Collibra, like others, focused only on structured data. Now, by combining Deasy’s capabilities, the platform provides a single entry point and a smooth experience for all data assets. One outcome of a unified data strategy is simplified AI use cases.
Since AI applications often need to access both tabular data (structured) and documents (unstructured) to give complete answers, unification offers the necessary routing and flexibility, the Field CTO explains. Preparing Unstructured Data for AI: To use a huge quantity of unstructured content effectively, it must first be prepared. Griffiths describes a four-layer data preparation funnel that goes beyond simple classification to deep semantic embedding, ultimately creating a Knowledge Product. The concept of a data product is familiar in the structured data world, the Collibra speaker says, but far less so for unstructured data. “We define a knowledge product with four elements – sensitivity, unstructured data quality, metadata for humans, and metadata for AI tools.” "One key difference to note about them is that in the structured data world, data products are typically consumed by analytics, data and AI teams. Knowledge products, conversely, I think will be consumed by everyone." Overall, Collibra’s system offers a range of capabilities, including AI-based taxonomies that automatically build meaningful taxonomies and segments directly from the data. This dramatically cuts down the lengthy manual mapping effort otherwise required from subject matter experts. The shift lets companies focus on areas where effort brings real value: for example, scanning supplier contracts to identify auto-renewal clauses that trigger in 30 days, or prioritising high-value, validated items that need human review.
As technology evolves, this entire system is moving toward AI-managed workflows, a significant step toward the autonomous enterprise. Key Takeaways: 85 per cent of enterprise content is unstructured (documents, transcripts). Unstructured data is the new foundation for scalable AI. Governance must be unified (structured + unstructured) to simplify tooling and serve AI agents effectively. Manual data labeling is impossible at enterprise scale; AI/LLMs must automate metadata generation via a continuous tagging system. Leaders should adopt the Knowledge Product idea: a governed, AI-ready asset for unstructured data consumed by the entire enterprise. Chapters: 0:00 Intro: The Exciting Moment in Tech · 1:42 Unified Governance via the Deasy Acquisition · 4:27 The 4-Layer Unstructured Data Funnel · 7:35 Automation: Continuous AI Tagging · 12:55 The Value of Knowledge Products. Collibra Bio: Collibra frees your data from the constraints of silos by unifying data and AI governance across your entire ecosystem, regardless of source or compute engine, for ultimate flexibility in how you manage data. The Collibra Platform gives you automated visibility, control and tracing from input through output, and it automates documentation and data traceability for AI use cases to power speed, data observability and safety. Its enterprise metadata graph enriches data context with every use, and an intuitive UX brings technical and business users into the fold to access and steward data. | — | ||||||
| 10/15/25 | BARC DATA festival online: Practical AI Case Studies for Professionals | Organizations are increasingly exploring new technologies to improve their operations, but adoption comes with real challenges. In the latest episode of Don’t Panic, It’s Just Data, host Trisha Pillay speaks with Kevin Petrie, VP of Research at BARC, about the practical realities of integrating these emerging technologies into business operations, particularly when it comes to data stewardship, strategy, and operational oversight. Real-World Applications and Governance: Petrie begins by noting that these technologies are already delivering real value today, particularly in areas like software development and customer service. The key, he emphasizes, is starting with strong oversight. “Governance starts with data,” he says, pointing out that reliable, well-managed data is the foundation for successful adoption. Human oversight is equally important; automation alone cannot replace careful monitoring and decision-making. Effective governance also needs to extend beyond structured data to include unstructured information such as text, images, and other content types. As organizations adopt new models, they must be aware of the risks these systems introduce and put controls in place to mitigate them. BARC DATA festival online: Building on these themes, BARC will host the DATA festival online on October 21, a virtual gathering designed to help leaders turn insight into action. The free event will bring together data professionals, decision-makers, and industry experts to share real-world use cases, operational frameworks, and lessons learned from practical implementations.
The festival provides a clear roadmap for organizations seeking to make technology adoption both effective and sustainable. Takeaways: Emerging technologies are already delivering measurable value in business operations. Governance begins with reliable data and extends to all types of information, including unstructured formats. Human oversight is essential to maintain accountability and manage risks. Practical, real-world use cases illustrate successful adoption strategies. Cost efficiency, culture, and cross-functional programs are critical for sustainable implementation. Responsible, accountable practices are a must-have for long-term success. Chapters: 00:01 Introduction to AI Adoption · 01:00 Real-World AI Applications · 03:03 Balancing Automation and Human Oversight · 05:19 Governance and Data Strategy · 07:02 BARC DATA festival online preview · 09:54 Audience and Takeaways · 12:13 Rapid Fire Questions · 14:02 Closing Thoughts. About BARC: BARC is a leading analyst firm for data and analytics and enterprise software, recognized for providing trusted, unbiased insights. Its expert analysts deliver in-depth research, advisory services, and industry events to help organisations make informed decisions about technology, data strategy, and analytics. With over 25 years of experience in data strategy, governance, architecture, and software selection, BARC empowers clients to become truly data-driven organizations. Its research highlights market trends, evaluates software and vendors rigorously, and provides actionable guidance to help enterprises innovate with data, analytics, and AI. | — | ||||||
| 10/13/25 | From Chaos to Launch: Your Product is Ready, Your Data Isn't | "Most companies still juggle with multiple different platforms; the communication between these tools and these platforms is happening in spreadsheets, and that is tedious, it's error-prone,” states Søren Lundtoft, Sr. Director of Product Management at Stibo Systems. On the Don't Panic, It's Just Data podcast, host Doug Laney and Søren Lundtoft dive into how business leaders can overcome digital-first challenges, especially when managing product information across a fragmented landscape. They discuss how PXDC, the Product Experience Data Cloud, provides a unified, cloud-native solution for centralizing, governing, and syndicating product data. PXDC eliminates manual bottlenecks, automates data onboarding, and ensures every stakeholder works from a single source of truth, enabling faster launches and more consistent brand experiences. The conversation explores the future of product experience, highlighting how AI and agentic workflows are transforming data management. PXDC leverages AI to automate content creation, localization, and compliance, while its composable architecture adapts to rapid market changes. The ultimate advice for data leaders: design systems for change, build on a solid data foundation, and embrace PXDC to unlock agility and growth in the age of AI. Errors Caused by Multiple Platforms: The Sr. Director of Product Management at Stibo Systems identified the main issue as the use of multiple platforms and internal systems. He said that communication between these tools often occurs in spreadsheets, which are tedious and prone to errors. This leads to a situation where the last mile of data management becomes the longest mile, as companies struggle to share information with third parties. He also alluded to the downsides of unstable downstream channels, where requirements change frequently.
As he explained, a company might download a spec sheet from a retailer, but by the time they complete it and are ready to go live, a lot has changed. This constant state of change makes it nearly impossible to avoid mistakes and delays. Lundtoft ended with an important piece of advice for C-suite leaders. He urged them to "decide for change" and shape their data systems so they can adjust to quick shifts in the market. He believes data has never been more important than it is right now, and that it serves as the true foundation for the current AI wave. To learn more about managing product data, AI, and its future, tune into the podcast on YouTube or Spotify at @EM360Tech. Takeaways: Brand owners face significant challenges in a digital-first world. Many companies struggle with multiple data platforms. Communication between tools often relies on spreadsheets. The last mile of data delivery is the most challenging. Consistency in product information is crucial for trust. AI can optimise data management processes. A single source of truth is essential for data governance. Data inconsistency can lead to costly errors. Future data management will involve agentic workflows. Adaptability in data models is necessary for future changes. Chapters: 00:00 Introduction to Data Challenges in a Digital World · 01:31 Navigating Complex Data Systems · 05:46 Ensuring Consistency in Product Information · 07:44 Streamlining Data Management for Efficiency · 11:35 The Future of Product Experience and AI · 15:04 Key Advice for Data Management Leaders. About Stibo Systems: Stibo Systems is a global leader in Master Data Management (MDM) solutions, dedicated to helping businesses achieve data transparency and operational efficiency.
With a rich history dating back to 1794, Stibo Systems empowers companies to connect, govern, enrich, and syndicate their data across various domains, ensuring a single, accurate view of their information. Headquartered in Aarhus, Denmark, Stibo Systems is a trusted partner for many of the world’s largest and most innovative companies. Their solutions enhance the quality and value of master data, driving informed decision-making and business success. | — | ||||||
| 10/9/25 | Stop Fighting Excel: How to Turn Your Spreadsheets into a Real-Time Reporting Powerhouse? | In an erratic, fast-moving business environment, finance teams are under high pressure to produce reports. The main challenge lies in how mid-market firms achieve digital transformation, not by abandoning familiar tools but by making them more effective. In this episode of the Don't Panic It's Just Data podcast, host Kevin Petrie, Vice President, Research and Head of Data Management Practice, BARC, speaks with Maeghan Carriere, Divisional Vice President, Software Sales NA, insightsoftware; and Nate Cook, Director of Product Marketing, insightsoftware. They discuss the importance of automation, the role of CFOs as strategic leaders, and, most importantly, how mid-market companies can leverage tools like Spreadsheet Server for digital transformation. The conversation also spotlights the need for finance professionals to upskill in AI and data analysis to remain competitive. All speakers agree that the key is equipping finance professionals with direct, real-time access to their data in the environment they know and trust – Excel. CFOs as Strategic Leaders: Cook believes that the role of the Chief Financial Officer (CFO) and their team has changed massively. "We really expect them to look back only as it helps them find a path forward.” This new role requires finance teams to become effective data teams, but how can they do that? A recent Gartner study found that 75 per cent of CFOs "said they own or co-own enterprise data and analytics at their organisations." Unfortunately, these teams "sometimes struggle to have access to the data that they need." Time wasted on tedious data entry and report generation is time lost for important analysis and strategic thinking. "The short answer is Spreadsheet Server," Cook told Petrie, explaining that better decisions rely on better data, and Spreadsheet Server provides the latter. Also Watch: Struggling with ERP Data? How to Get Real-Time Reporting in Excel. Excel with AI Upskills Finance Teams: Automating reporting doesn't just save time; it allows finance professionals to develop skills in areas like Excel, data analysis, and AI. Carriere points out that AI is an important tool that every professional needs to learn to use effectively. "Once you understand how data flows and connects in real time, you're better positioned to use AI tools because you can quickly tell if the results make sense," she explains. The key message for IT decision-makers is clear, as per insightsoftware’s divisional VP: "Stop fighting Excel and make it more powerful instead." By embracing the tool that finance teams prefer, organisations can achieve quicker results, faster adoption, and ultimately free their financial experts from manual tasks. Cook also notes that Spreadsheet Server is "one way to help remove a lot of that toil and refocus the time that your folks are spending on the more strategic parts of analysis and decision-making that can help drive your organisation forward." Takeaways: Finance teams are facing scaling pressures and resource constraints. The need for speed in decision-making is critical for finance leaders. Automation can save finance teams significant time and improve productivity. CFOs are increasingly expected to be strategic leaders within organisations. Excel remains a preferred tool for finance teams and should be leveraged in digital transformation. Mid-market companies should identify specific areas for technology to deliver efficiencies. Real-time data access is essential for better decision-making. Upskilling in AI is necessary for finance professionals to remain competitive. Finance teams can develop analytical skills while automating reporting processes. Building trust in data is crucial for effective AI implementation. Chapters: 00:00 Introduction to the Podcast and Guests · 02:22 Current Challenges in Financial Reporting · 05:18 Transforming Finance Teams with Automation · 11:21 Digital Transformation for Mid-Market Companies · 13:43 Accessing ERP Data for Better Decision Making · 18:04 Upskilling Finance Teams for AI Integration · 20:45 Key Takeaways for CXOs | — | ||||||
| 9/10/25 | Data Experts Question: Is Data Infrastructure Ready for Responsible AI? | Welcome back to Meeting of the Minds, a special podcast episode series by EM360Tech where we talk about the future of tech. In this Big Data special episode, our expert panel – Ravit Jain, podcast host; Christina Stathopoulos of Dare to Data, a data and AI evangelist; Wayne Eckerson, data strategy consultant and president of the Eckerson Group; and Kevin Petrie, VP of Research at BARC – comes together again to discuss the key data and AI trends, particularly data ethics. They discuss ethical issues related to using AI, the need for data governance and guidelines, and the essential role of data quality in AI success. The speakers also look at how organisations can measure the value of AI through different KPIs, stressing the need to balance technical achievements with business results. Our data experts examine the changing role of AI across various sectors, with a focus on success metrics, the effects on productivity and employee stress, changes in education, and the possible positive and negative impacts of AI in everyday life. They highlight the need to balance productivity with quality and to consider the ethics of autonomous AI systems. In the previous episode, the panel discussed new challenges and opportunities in data governance, regulatory frameworks, and the AI workforce, examining the delicate balance between innovation and ethical responsibility and how companies are handling these issues. Tune in to gain fresh insight into the future of data and AI and how your enterprise can adapt to the upcoming changes and challenges.
Hear how leaders in the field are preparing for a future that is already here. Also watch: Meeting of the Minds: State Of Cybersecurity in 2025. Takeaways: Generative AI is creating a supply shock in cognitive power. Companies are eager for data literacy and AI training. Data quality remains a critical issue for AI success. Regulatory frameworks like GDPR are shaping AI governance. The US prioritises innovation, sometimes at the expense of regulation. Generative AI introduces new risks that need to be managed. Data quality issues are often the root of implementation failures. AI's impact on jobs is leading to concerns about workforce automation. Organisations must adapt to the probabilistic nature of generative AI. The conversation around data quality is ongoing and evolving. AI literacy and data literacy are crucial for workforce success. Executives are more concerned about retraining than layoffs. Younger workers may struggle to evaluate AI-generated answers. Incremental changes in productivity are expected with AI. Job displacement may not be immediate, but could create future gaps. Human empathy and communication skills remain essential in many professions. AI will augment, not replace, skilled software developers. Global cooperation is needed to navigate the evolving AI landscape. Data quality is critical for mitigating risks in AI applications. Organisations must prepare for the proliferation of AI models. Chapters: 00:00 Introduction to the Future of Data and AI · 03:30 Significant Shifts in Data and AI Landscape · 06:52 The Role of Education and Training in Data Literacy · 09:59 Regulatory Perspectives: GDPR vs. US Approaches · 13:06 Risks and Challenges in Implementing Generative AI · 16:53 Data Quality: The Foundation of AI Success · 21:59 The Impact of AI on Jobs and Workforce Dynamics · 27:49 The Future of Workforce and AI Literacy · 30:15 Job Displacement and the Importance of Junior Roles · 32:05 AI's Impact on Professional Roles · 34:26 The Role of AI in Software Development · 39:54 The Necessity of Human Involvement in AI · 41:10 Data Governance and Global Cooperation | — | ||||||
| 9/4/25 | How RAG and Graph RAG Take Generative AI to the Next Level | Generative AI has captured global attention, powering everything from chatbots to intelligent assistants. Yet in the enterprise, its promise often hits a dead end. According to Gartner, 80 per cent of enterprise data remains unused or “dark” because conventional AI struggles to interpret complex, domain-specific information. In this episode of the Don't Panic It's Just Data podcast, EM360Tech host Trisha Pillay speaks with Andreas Blumauer, Senior Vice President at Graphwise, about how retrieval-augmented generation (RAG) and its advanced application, Graph RAG, are levelling up enterprise AI. Together, they explore the limitations of traditional AI, the critical role of knowledge graphs in improving data accuracy, and what it takes for organisations to adopt these technologies successfully. Why Graph RAG Matters: While RAG enhances generative AI by enabling it to retrieve relevant data from large knowledge bases, Graph RAG takes it further. By integrating knowledge graphs, Graph RAG preserves the relationships, sequences, and meaning inherent in enterprise data. This ensures AI outputs are not just collections of facts but structured insights that reflect the logic of an organisation’s knowledge. The advantages include: higher accuracy, with retrieval precision increasing from 80 per cent to 95 per cent, reducing errors in AI outputs; trustworthy results, with outputs that are explainable and traceable, providing the transparency enterprises require; and scalable integration, connecting data across silos and departments to make AI adoption enterprise-ready. “Graph RAG respects the structure of enterprise data instead of flattening it. That’s what makes it trustworthy,” explains Blumauer. Generative AI opened the door to possibilities. RAG made it actionable. Graph RAG takes it to the next level.
By transforming dark, siloed data into structured, actionable knowledge, Graph RAG helps organisations achieve the accuracy, trust, and scalability essential for navigating the next frontier of enterprise intelligence. Takeaways: 80 per cent of enterprise data remains unused or dark. Traditional AI struggles to interpret complex enterprise data. RAG retrieves information from within the enterprise data landscape. Graph RAG improves the accuracy of AI outputs. Knowledge graphs link data points across different silos. Building a knowledge graph is a strategic investment. Incremental growth is possible with knowledge graphs. Graph RAG can increase accuracy from 80 per cent to 95 per cent. Data quality and governance are essential for AI success. The future of enterprise AI relies on effective knowledge management. Chapters: 00:00 Introduction to RAG and Graph RAG · 03:04 Understanding the Importance of Knowledge Graphs · 05:46 Adopting RAG: Organisational Readiness and Strategic Investment · 08:51 Real-World Applications and Benefits of Graph RAG · 11:56 The Evolution of Knowledge Graphs in AI · 14:46 Future of GraphRAG and Enterprise AI · 17:36 Rapid Fire Questions and Closing Thoughts. About Graphwise: Graphwise is a leading enterprise AI company specialising in knowledge graph technologies. By combining retrieval-augmented generation (RAG) with advanced graph-based approaches, Graphwise helps organisations turn siloed, complex data into accurate, actionable insights, enabling smarter decisions, scalable integration, and trustworthy AI outcomes. | — | ||||||
| 9/2/25 | Winning with Data: Inclusion, Innovation and Community at Big Data LDN | Inclusivity and accessibility remain some of the biggest challenges for data events. True inclusion is not the result of a single initiative, but of continuous effort, honest reflection, and a willingness to listen to the community. At Big Data LDN, these values are rooted in the event’s mission. By highlighting both established leaders and emerging voices, the conference creates a stage where diverse perspectives can shine, while fostering a welcoming environment where every attendee feels they belong. In this episode of the Don’t Panic, It’s Just Data podcast, host Trisha Pillay is joined by Andy Steed, Event Director at Big Data LDN, and Roisin McCarthy, Founder of Women in Data. The conversation examines how Big Data LDN has become a cornerstone of the data community over the past decade and why advancing inclusivity is crucial to the conference’s continued impact in its 10th year. A Decade of Big Data LDN: Over the past 10 years, Big Data LDN has grown beyond a traditional tech conference. As Andy explains, keeping the event relevant means constantly adapting to the changing needs of the data community. From established thought leaders to rising stars, the agenda is designed to showcase the breadth of talent driving innovation across industries. Roisin highlights how Women in Data has grown to an incredible 95,000 members worldwide, becoming an anchor of community engagement in the data industry. At Big Data LDN, this partnership translates into a stronger focus on inclusivity, mentorship, and ensuring that diverse voices are not only present but heard. The Women in Data Lounge at this year’s event will be a dedicated hub for networking, support, and inspiration. Inclusion and Accessibility in Practice: Both guests stress that real inclusion doesn’t happen by accident. It requires intention, continuous reflection, and a willingness to address barriers head-on.
Big Data LDN is taking deliberate steps to create a welcoming environment, one where accessibility, representation, and community-building are central to the experience. What Attendees Can Expect: Big Data LDN brings together world-class keynotes, hands-on workshops, and community-driven sessions, offering both deep technical insights and opportunities to connect. Networking is built into the event, with spaces designed for conversations that continue well beyond the conference floor. Taking place September 24–25, this year’s edition will be the largest yet, including a new conference exploring AI agents, AI governance and data products on September 23 for those seeking more focused, technical exploration. Listen to the full episode of Don’t Panic, It’s Just Data to hear more about how Big Data LDN is shaping the future of data events. Takeaways: Big Data LDN has evolved significantly over the past decade. Women in Data has grown to 95,000 members, enhancing community engagement. Inclusivity and accessibility are ongoing challenges in data events. Real inclusion requires continuous effort and reflection. Big Data LDN showcases both established and rising stars in data. The event aims to create a welcoming environment for all attendees. Expect exceptional content and interactive community events at Big Data LDN. Chapters: 00:00 Introduction to Big Data LDN and Women in Data · 02:37 Evolution of Big Data LDN and Its Impact · 04:39 Challenges of Keeping Events Fresh and Relevant · 08:36 Inclusivity and Accessibility in Data Events · 09:01 Creating Real Inclusion in the Data Space · 11:19 Opportunities and Challenges for Big Data LDN · 16:17 What to Expect at Big Data LDN · 18:37 Women in Data Lounge and Community Engagement · 20:32 Final Takeaways and Closing Remarks. Also available on YouTube, Apple Podcasts and EM360tech.com. | — | ||||||
| 8/28/25 | The 10-Year Journey: AI Madness, Data Governance, and the Future of Big Data | Big Data LDN (BDL), the ultimate data event of the year, celebrates its 10th anniversary this year. The event is scheduled for September 24 and 25, with a brand-new deep-dive conference held on September 23. In this episode of the Don't Panic, It's Just Data podcast, host Shubhangi Dua, Podcast Producer and B2B Tech Journalist at EM360Tech, speaks with Michael Ferguson, CEO of Intelligent Business Strategies and BDL Conference Chair, and Palesa Amadi, BDL Conference Manager. The speakers provide a sneak peek of the upcoming Big Data LDN conference. They discuss how AI affects data and analytics, spotlight the importance of data governance, and talk through the different theatres and sessions available at the conference. The conversation also shines a light on keynotes and notable speakers, the changes in data engineering and architecture, and the learning opportunities for attendees. AI Agents, Data Products in the Limelight: Get ready for "AI madness", says the Conference Chair. This year's event will feature AI in many forms, from AI agents walking the aisles to its integration into almost every software product on the exhibition floor. "It's going to be hectic," Ferguson says. He expects a strong push toward AI-driven data governance and data management, changes that aim to simplify complex tasks and make tools accessible to more users. Instead of just seeing copilots, attendees will experience multi-agent capabilities and even "AI apps." With 16 theatres and numerous speakers, the conference focuses on learning. The Conference Manager describes how the content is tailored for everyone, from engineers and analysts to students. "You're going to come to this to learn and to add to your knowledge, and of course, network," says Amadi.
The show has broadened its focus with new theatres like Data Products and Streaming Analytics, along with Decision Automation. Some theatres have been renamed to highlight the growing influence of AI, such as the Data and AI Strategy Theatre. A new "Data for Good" theatre will feature sessions on sustainability and on using data for unconventional purposes, like tackling global food needs. Unmissable Keynotes and Speakers: The speaker lineup is a big draw, featuring well-known experts and thought leaders. Amadi is particularly excited about keynote speaker Zhamak Dehghani, who will discuss the "apparent redundancy" of the term data mesh. Dehghani, who wrote the viral white paper on the topic, will advocate for her views, as Amadi puts it. Another highlight will be a session on the key role of big data and AI in the gaming industry, led by Carly Taylor. The event's closing keynote will be delivered by the world-renowned physicist Professor Brian Cox, who will talk about quantum computing and the universe as a quantum computer. Beyond the Main Event: For those looking for more, Big Data LDN is hosting a new paid pre-show conference on September 23rd called Data-Driven LDN. This event is designed mainly for attendees in high-level enterprise roles, like CDOs and CTOs, and will be led by Mike Ferguson.
It’s a chance to explore topics such as agentic AI, data governance, and data products in depth. Whether you’re an industry expert or a curious newcomer, the 10th anniversary of Big Data London promises to be an engaging and transformative experience, demonstrating how quickly data and AI are changing our world. For further information, tune into the podcast. Takeaways: Big Data LDN is celebrating its 10th anniversary. AI will dominate this year's conference with new technologies. Data governance now includes privacy, security, and compliance. The conference will feature various theatres focusing on specific topics. Sustainability will be a key theme in the Data for Good theatre. Speaker quality is ensured through a rigorous selection process. The event aims to cater to all levels of expertise. Real-world case studies will provide practical guidance. Data architecture is evolving to lower latency and improve access. Networking opportunities will be abundant at the conference. Chapters: 00:00 Introduction to Big Data London · 01:48 AI's Dominance at Big Data London · 06:15 Navigating the Conference: Theatres and Content · 15:02 Data Governance Challenges for AI · 20:00 Improving Speaker Quality and Content · 32:35 Key Technical Lessons from the Conference | — | ||||||
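Dolezal's central point in the Keboola episode above is that an LLM-backed data agent only answers well when metadata, event logs, and orchestration details are assembled into a single context. A rough sketch of that idea, with entirely hypothetical function names and sample data (this is not Keboola's actual API):

```python
# Illustrative only: gather the fragments an LLM needs before it can
# reason about a pipeline, instead of leaving context scattered across tools.

def build_agent_context(question, metadata, event_log, orchestration):
    """Concatenate table metadata, recent events, and orchestration
    details into one prompt context for an LLM-backed data agent."""
    sections = [
        "## Table metadata\n" + "\n".join(f"{k}: {v}" for k, v in metadata.items()),
        "## Recent events\n" + "\n".join(event_log[-3:]),  # keep only the last few events
        "## Orchestration\n" + orchestration,
        "## Question\n" + question,
    ]
    return "\n\n".join(sections)

# Hypothetical sample data standing in for real pipeline state
metadata = {"table": "orders", "rows": 120000, "last_loaded": "2025-10-26"}
event_log = ["job 1 ok", "job 2 ok", "job 3 failed: schema drift", "job 4 retried ok"]
orchestration = "orders flow runs daily at 02:00, feeds the revenue dashboard"

prompt = build_agent_context(
    "Why is the revenue dashboard stale?", metadata, event_log, orchestration
)
print(prompt)
```

The point of the sketch is the assembly step itself: the failure clue ("schema drift") reaches the model only because the event log was pulled into the same context as the question.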
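The four-element "knowledge product" Griffiths describes in the Collibra episode (sensitivity, unstructured data quality, metadata for humans, metadata for AI tools) can be pictured as a small data structure. This is a minimal sketch under stated assumptions: the class, field names, and readiness thresholds are invented for illustration and are not Collibra's schema.

```python
# Hypothetical model of a governed, AI-ready unstructured asset with the
# four elements named in the episode. Field names are illustrative.

from dataclasses import dataclass, field

@dataclass
class KnowledgeProduct:
    name: str
    content: str                # the raw unstructured asset (e.g. contract text)
    sensitivity: str            # e.g. "public", "internal", "restricted"
    quality_score: float        # unstructured data quality, 0.0 to 1.0
    human_metadata: dict = field(default_factory=dict)  # descriptions for people
    ai_metadata: dict = field(default_factory=dict)     # tags for AI tools

    def is_ai_ready(self) -> bool:
        """Governed enough for an AI agent to consume? (Illustrative rule.)"""
        return (self.sensitivity != "restricted"
                and self.quality_score >= 0.8
                and bool(self.ai_metadata))

kp = KnowledgeProduct(
    name="supplier-contracts-2025",
    content="contract text placeholder",
    sensitivity="internal",
    quality_score=0.92,
    human_metadata={"owner": "procurement"},
    ai_metadata={"topics": ["auto-renewal", "notice-period"]},
)
print(kp.is_ai_ready())
```

Packaging the asset with all four elements is what distinguishes a knowledge product from a bare document: a consumer (human or agent) can check governance status before touching the content.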
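The accuracy gain Blumauer attributes to Graph RAG in the episode above comes from expanding retrieved chunks along knowledge-graph edges instead of treating them as isolated text. A toy sketch, with naive keyword scoring standing in for a real embedding search and entirely hypothetical data:

```python
# Illustrative contrast: plain RAG returns an isolated chunk, while Graph RAG
# follows knowledge-graph edges to pull in explicitly related context.

def score(query, text):
    """Naive relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def plain_rag(query, chunks, k=1):
    """Baseline RAG: return the top-k chunk ids by keyword overlap."""
    return sorted(chunks, key=lambda c: score(query, chunks[c]), reverse=True)[:k]

def graph_rag(query, chunks, edges, k=1):
    """Graph RAG sketch: retrieve top-k chunks, then expand one hop
    along knowledge-graph edges so relationships are preserved."""
    expanded = plain_rag(query, chunks, k)
    for seed in list(expanded):
        for subject, _relation, obj in edges:
            if subject == seed and obj not in expanded:
                expanded.append(obj)
    return expanded

# Toy enterprise "knowledge base": chunk id -> text
chunks = {
    "contract42": "supplier contract with auto renewal clause",
    "supplierX": "supplier X based in Hamburg, critical for procurement",
    "policy7": "procurement policy requires 30 day renewal notice",
}
# Knowledge-graph edges: (subject, relation, object)
edges = [
    ("contract42", "signed_with", "supplierX"),
    ("contract42", "governed_by", "policy7"),
]

print(plain_rag("auto renewal contract", chunks))
print(graph_rag("auto renewal contract", chunks, edges))
```

Plain retrieval surfaces only the contract chunk; the graph expansion also brings in the supplier and the governing policy, which is exactly the "structure instead of flattening" property the episode highlights.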
Chart Positions
1 placement across 1 market.