Ghost in the data
Why Your Pipeline Finishes Later Every Month

Let me tell you about a graph that changed how I think about data engineering. A junior engineer on my team — let’s call her Priya — had been tracking something nobody asked her to track. Every morning for two months, she’d noted the timestamp when our main analytics pipeline completed. She wasn’t trying to make a point. She was just curious, because the finance team kept mentioning their dashboards weren’t ready when they arrived at 8 AM anymore.
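Priya's log can be sketched in a few lines of Python. This is a hypothetical illustration with made-up timestamps, not the actual data from the post: fit a least-squares slope to the completion times to express the drift in minutes per day.

```python
from datetime import datetime

# Hypothetical sample: pipeline completion timestamps noted each morning.
completions = [
    "2026-02-02 06:42", "2026-02-09 06:58",
    "2026-02-16 07:15", "2026-02-23 07:31",
]

def drift_minutes_per_day(timestamps):
    """Least-squares slope of completion time (minutes after midnight) vs. day index."""
    parsed = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t in timestamps]
    days = [(p - parsed[0]).days for p in parsed]
    mins = [p.hour * 60 + p.minute for p in parsed]
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(mins) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, mins))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den

print(round(drift_minutes_per_day(completions), 2))  # ~2.34 minutes later per day
```

A couple of minutes a day sounds harmless until you multiply it out: that slope pushes a 6 AM finish past 8 AM within two months, which is exactly when the finance team starts noticing.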

  • Snowflake
  • AWS
  • Airflow
  • Pipeline Optimization
  • dbt
  • Data Freshness
Saturday, April 18, 2026
Stop Building Salesforce Integrations From Scratch

Let me tell you about Marcus. Marcus was on a team I led a few years back. Sharp, motivated, the kind of engineer who actually read documentation before writing code. When the business asked us to get Salesforce data into our warehouse, Marcus volunteered. He’d done API work before. He figured a few weeks, tops. He scoped it carefully. Built a Python service that authenticated via OAuth, pulled Account, Contact, and Opportunity objects through the Bulk API, flattened the nested JSON into relational tables, handled pagination, managed rate limits. Wrote solid tests. Documented everything. The kind of work you’d point to in a code review and say this is how it’s done.
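One of the steps Marcus hand-rolled, flattening nested JSON into relational tables, can be sketched generically. The record below is a hypothetical Salesforce-style Contact with a nested Account lookup, not data or code from the post:

```python
def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level row,
    joining nested keys with a separator (e.g. Account.Name -> Account_Name)."""
    row = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten(value, new_key, sep))
        else:
            row[new_key] = value
    return row

# Hypothetical Contact record with a nested Account lookup.
contact = {
    "Id": "003xx0000012345",
    "LastName": "Nguyen",
    "Account": {"Id": "001xx0000054321", "Name": "Acme Corp"},
}

print(flatten(contact))
```

The flattening itself is twenty lines; the pagination, rate limiting, OAuth refresh, and schema drift around it are where the weeks go, which is the point of the post.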

  • Data Engineering
  • Snowflake
  • OpenFlow
  • Salesforce
  • API Integration
  • Schema Evolution
  • Fivetran
  • Data Pipelines
Saturday, April 4, 2026
Write-Audit-Publish with Iceberg Tables in Snowflake

It was a Tuesday afternoon when the analyst pinged me on Microsoft Teams: “Hey, the Total Portfolio numbers just jumped 40% overnight. Did we land a whale?” We hadn’t. What actually happened was more mundane and significantly more painful. A schema change in the source system introduced a currency conversion bug. Our pipeline dutifully loaded the corrupted data into production at 3 AM, the dashboards updated by 6 AM, and the Department Head opened her morning report to numbers that looked like champagne-worthy growth.
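The write-audit-publish idea in the title can be sketched generically, with hypothetical thresholds and function names rather than the post's actual Iceberg/Snowflake SQL: land data in a staging area, audit it (here, the staged total must sit within a tolerance of yesterday's), and only publish to production if every check passes.

```python
def audit_total(staged_total, previous_total, max_change=0.10):
    """Audit gate: reject day-over-day swings beyond a tolerance (hypothetical check)."""
    if previous_total == 0:
        return False
    return abs(staged_total - previous_total) / previous_total <= max_change

def write_audit_publish(staged_total, previous_total):
    """Publish staged data only if the audit passes; otherwise quarantine it."""
    if audit_total(staged_total, previous_total):
        return "published"      # swap staging into production
    return "quarantined"        # leave production untouched, alert on-call

# A 40% overnight jump like the one in the story fails the audit.
print(write_audit_publish(140.0, 100.0))  # quarantined
print(write_audit_publish(103.0, 100.0))  # published
```

With a gate like this in the 3 AM load, the corrupted currency conversion would have stopped in staging instead of landing on the Department Head's morning report.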

  • Apache Iceberg
  • Snowflake
  • WAP Pattern
  • Data Quality
  • SQL
  • Lakehouse
  • Data Pipelines
  • Best Practices
Friday, February 27, 2026