Ghost in the data
You Don't Need Permission to Fix Your Data

Let me tell you about a junior engineer called Sam. Sam had been on the team about four months when I noticed something in a pull request. Tucked between two routine model changes was a new schema.yml entry — five accepted_values tests on a column called customer_status that had been silently accumulating fourteen different spellings of “active” for the better part of a year. Nobody asked Sam to do this. It wasn’t in a sprint. There was no Jira ticket. Sam had just been working in that part of the warehouse, noticed the mess, and decided to clean it up on the way through.
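For anyone who hasn't written one, the check itself is tiny. This is roughly the query a dbt accepted_values test compiles down to; the model name and value list here are illustrative, not Sam's actual ones:

    -- Roughly what dbt generates for an accepted_values test:
    -- any rows returned mean the test fails.
    select customer_status, count(*) as occurrences
    from analytics.dim_customers  -- illustrative model name
    where customer_status not in ('active', 'inactive', 'suspended', 'churned', 'pending')
    group by customer_status;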

  • Data Quality
  • dbt
  • SQL
  • Testing
  • Documentation
  • Junior Engineer
  • Career Growth
  • Psychological Safety
Saturday, March 21, 2026
Your Data Model Isn't Broken, Part I: Why Refactoring Beats Rebuilding

In the early 2000s, Netscape’s decision to rewrite their browser from scratch was already being called the single worst strategic mistake a software company could make. At the time the decision was made, Netscape was winning. They had the dominant browser. They had market share. They had momentum. And then they decided the codebase was too messy, too tangled, too hard to work with, so they threw it all away and started over. They scrapped the Navigator 4.0 codebase and began a ground-up rewrite that would eventually ship as version 6.0. There was no 5.0. Three years of development. No shipping product. And while Netscape’s engineers were busy building their beautiful new browser in a vacuum, Internet Explorer ate their lunch, their dinner, and most of their market share.

  • Data Engineering
  • Refactoring
  • Data Warehousing
  • Technical Debt
  • Snowflake
  • dbt
  • Legacy Systems
  • Data Quality
Saturday, March 14, 2026
12 Steps to Better Data Engineering

Let me tell you about the moment I stopped trusting architecture diagrams. I was three days into a new role, getting up to speed with the data team. Smart people. Modern stack. On paper, everything looked right. They walked me through a beautiful data platform diagram: clean lines, labelled layers, colour-coded domains. It looked like something you’d see at a data conference. Then I asked a question that changed everything: “Can you rebuild your finance table from scratch right now?”

  • Data Engineering
  • dbt
  • Snowflake
  • GitHub Actions
  • AWS
  • Data Quality
  • CI/CD
  • Data Contracts
Saturday, March 7, 2026
The CSV Test Suite Nobody Writes

In October 2020, roughly 16,000 positive COVID-19 test results vanished from the UK’s public health reporting for nearly a week. Not because the tests weren’t run. Not because the labs didn’t report them. The results were collected, transmitted, and received — inside CSV files. The problem? Public Health England was importing those CSV files into Microsoft Excel’s legacy .xls format. The format has a hard row limit of 65,536. When the files grew past that limit, Excel didn’t throw an error. It didn’t warn anyone. It just silently dropped the extra rows. Sixteen thousand people who tested positive for a deadly virus during a second wave went untraced. An estimated 50,000 of their contacts were never notified. And the system this happened in? Part of a £12 billion Test and Trace programme.
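The unglamorous defence is to reconcile row counts at every hop, so a truncating importer fails loudly instead of silently. A minimal sketch, assuming a hypothetical manifest table recording how many rows each sender shipped:

    -- Hypothetical reconciliation: rows the sender shipped vs. rows we loaded.
    -- Any result here means something was dropped along the way.
    select m.file_name,
           m.rows_sent,
           count(s.file_name) as rows_loaded
    from source_manifest m          -- illustrative: counts reported by the sender
    left join staging_test_results s
      on s.file_name = m.file_name
    group by m.file_name, m.rows_sent
    having count(s.file_name) <> m.rows_sent;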

  • CSV
  • Data Quality
  • Testing
  • RFC 4180
  • Python
  • Data Engineering
  • Data Pipelines
Wednesday, March 4, 2026
Write-Audit-Publish with Iceberg Tables in Snowflake

It was a Tuesday afternoon when the analyst pinged me on Microsoft Teams: “Hey, the Total Portfolio numbers just jumped 40% overnight. Did we land a whale?” We hadn’t. What actually happened was more mundane and significantly more painful. A schema change in the source system introduced a currency conversion bug. Our pipeline dutifully loaded the corrupted data into production at 3 AM, the dashboards updated by 6 AM, and the Department Head opened her morning report to numbers that looked like champagne-worthy growth.

  • Apache Iceberg
  • Snowflake
  • WAP Pattern
  • Data Quality
  • SQL
  • Lakehouse
  • Data Pipelines
  • Best Practices
Friday, February 27, 2026
Healing Tables: When Day-by-Day Backfills Become a Slow-Motion Disaster

It was 2 AM on a Saturday when I realized we’d been loading data wrong for six months. The situation: a customer dimension with three years of history needed to be backfilled after a source system migration. The previous team’s approach was straightforward—run the daily incremental process 1,095 times, once for each day of history. They estimated three weeks to complete. What they hadn’t accounted for was how errors compound. By the time I looked at the data, we had 47,000 records with overlapping date ranges, 12,000 timeline gaps where customers seemed to vanish and reappear, and an unknowable number of missed changes from when source systems updated the same record multiple times in a single day.
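Once you know to look, overlaps and gaps like these fall out of a single window function. A sketch of the check against a hypothetical SCD2 table, assuming the convention that a healthy timeline has each version’s valid_to equal to the next version’s valid_from:

    -- Compare each version's window with the start of the next version.
    with ordered as (
      select customer_id,
             valid_from,
             valid_to,
             lead(valid_from) over (
               partition by customer_id order by valid_from
             ) as next_valid_from
      from dim_customer_history       -- illustrative table name
    )
    select *
    from ordered
    where valid_to > next_valid_from   -- overlap: two versions valid at once
       or valid_to < next_valid_from;  -- gap: the customer 'vanishes' in between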

  • SCD
  • Historical Load
  • dbt
  • SQL
  • Data Quality
  • Dimensional Modeling
  • Delta Lake
  • Best Practices
Saturday, February 7, 2026
Context Engineering: The New Must-Have Skill for Data Engineers

Last year I watched a colleague ask AI to help write a dbt model. The AI spit out perfectly functional SQL—clean syntax, proper CTEs, the works. Looked great. Then I noticed the table would eventually hold 800 million rows. No partitioning. No clustering. Just a raw, unoptimised heap waiting to turn into a query performance nightmare (that would likely become my nightmare to fix). The engineer wasn’t at fault. The AI wasn’t at fault either, really. The AI simply didn’t know that our environment clusters large tables by date. It didn’t know our team’s conventions around incremental models. It couldn’t know, because nobody had told it.
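The missing context is small enough to write down. A hypothetical dbt model config that would have encoded those conventions for the AI (all names here are illustrative):

    -- Illustrative dbt model: the conventions the AI never saw,
    -- captured as config rather than tribal knowledge.
    {{ config(
        materialized = 'incremental',
        unique_key   = 'event_id',        -- hypothetical key
        cluster_by   = ['event_date']     -- convention: cluster large tables by date
    ) }}

    select event_id, event_date, payload
    from {{ source('raw', 'events') }}    -- hypothetical source
    {% if is_incremental() %}
    where event_date > (select max(event_date) from {{ this }})
    {% endif %}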

  • AI
  • dbt
  • Data Quality
  • SQL
  • Productivity
  • VSCode
  • Claude
Saturday, January 31, 2026
That Tuesday Morning When I Finally Fixed Our Ten-Minute Queries

The Ten-Minute Query

I’m sitting at my laptop on a Tuesday morning, waiting. The progress bar on my screen says ‘Query running… 4 minutes, 37 seconds.’ I lean back in my chair and let out this long sigh that probably says more than I intended. My manager walks past my desk. She glances at my screen, and I can see that look, the one that says she already knows what I’m about to tell her. I don’t need to explain.

  • AWS Glue
  • Dimensional Modeling
  • Kimball Methodology
  • Data Quality
  • ETL
  • Write-Audit-Publish
  • Apache Iceberg
  • Step Functions
  • SCD Type 2
Friday, January 16, 2026
The Guerrilla Guide to Data Engineering Interviews

The Scenario That Changes Everything

Picture this: you’re sitting in an interview room, or more likely these days staring at a Zoom window with your carefully curated bookshelf background, and the interviewer asks you about data quality. “Tell me about your experience with data quality,” they say. You have two choices. Choice A: “Data quality is really important in data engineering. It involves ensuring data is accurate, complete, consistent, and timely. I believe strongly in implementing data quality checks throughout the pipeline.”

  • Interviews
  • Career Growth
  • Technical Assessment
  • SQL
  • Data Modeling
  • Problem Solving
  • Delta Lake
  • dbt
  • Data Quality
Sunday, January 11, 2026
The Four Stages of Data Quality: From Hidden Costs to Measurable Value

This is the fundamental problem with data quality. You know it matters. Everyone knows it matters. But until you can quantify the impact, connect it to business outcomes, and build a credible business case, it remains this abstract thing that’s important but never urgent enough to properly fund. I wrote a practical guide to data quality last week that walks through hands-on implementation—the SQL queries, the profiling techniques, the actual mechanics of finding and fixing data issues. Think of that as the “how to use the tools” guide. This article is different. This is the “why these tools matter and how to convince your organization to actually use them” guide.

  • Data Quality
  • ROI
  • Business Case
  • Data Governance
  • Strategy
  • Frameworks
Monday, November 24, 2025
When Your Data Quality Fails at 9 PM on a Friday

When everything goes wrong at once

It’s 9 PM on a Friday. You’re halfway through your second beer, finally relaxing after a brutal week. Your phone buzzes. Then it buzzes again. And again. The support team’s in full panic mode, your manager’s calling, and somewhere in Melbourne, two very angry guests are standing outside the same Airbnb property, both holding confirmation emails that say the place is theirs for the weekend.

  • Data Quality
  • SQL
  • Database Design
  • Data Validation
  • Testing
  • Data Engineering
  • Production Issues
Saturday, November 22, 2025
Streamlining Data Pipeline Reliability: The Write-Audit-Publish Pattern

Introduction: Why Safe Data Pipelines Matter

In the world of data engineering, there’s a constant challenge we all face: how do we ensure our production data remains reliable and error-free when deploying updates? Anyone who’s experienced the cold sweat of a bad deployment affecting critical business data knows this pain all too well. Enter the Write-Audit-Publish pattern, a robust approach that can significantly reduce the risk of data pipeline failures. This pattern, which shares DNA with the well-known Blue-Green deployment strategy from software engineering, creates a safety net that can save your team countless hours of troubleshooting and emergency fixes.
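The skeleton is only a few statements. One common shape, sketched in Snowflake-flavoured SQL with illustrative table names (other engines offer equivalent swap or rename operations):

    -- Write: land the new load in an audit copy, never straight into prod.
    create or replace table orders_audit as
    select * from orders_new_load;        -- illustrative staging source

    -- Audit: run checks against the copy; abort the pipeline if any fail.
    select count(*) as bad_rows
    from orders_audit
    where order_total < 0 or customer_id is null;

    -- Publish: atomically swap the audited copy into place.
    alter table orders swap with orders_audit;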

  • Write-Audit-Publish
  • WAP Pattern
  • Airflow
  • Data Reliability
  • Blue-Green Deployment
  • Data Quality
  • Python
Sunday, May 18, 2025
  • ««
  • «
  • 1
  • 2
  • »
  • »»