Data Vault Data Modeling with Python and dbt

Introduction

Data Vault is a data modeling technique specifically designed for use in Data Warehouses. It is a hybrid approach that combines the best elements of 3rd Normal Form (3NF) and Star Schema to provide a flexible and scalable data modeling solution.

Hubs, Links, Satellites

A Data Vault consists of three main components: Hubs, Links, and Satellites. Hubs are the backbone of the Data Vault architecture and represent the entities within the data model. They are the core data elements and contain the primary key information.
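To make the Hub idea concrete, here is a minimal sketch of how a Hub might be built as a dbt model. The model name, the staging source stg_customers, and the column customer_id are assumptions for illustration only, not taken from the post; the pattern shown is simply hashing the business key, keeping the key plus load metadata, and inserting only keys the Hub has not seen before.

```sql
-- models/raw_vault/hub_customer.sql
-- Hypothetical Hub model: one row per distinct business key.
{{ config(materialized='incremental', unique_key='hub_customer_key') }}

select
    md5(cast(customer_id as varchar)) as hub_customer_key,  -- surrogate hash key
    customer_id                       as customer_bk,       -- business key
    current_timestamp                 as load_date,          -- when the row was loaded
    'crm'                             as record_source       -- where the key came from
from {{ ref('stg_customers') }}

{% if is_incremental() %}
-- on incremental runs, only insert business keys not already in the Hub
where customer_id not in (select customer_bk from {{ this }})
{% endif %}
```

Links and Satellites follow the same pattern: a Link model hashes the combination of the Hub keys it relates, and a Satellite model attaches descriptive attributes and load dates to a Hub or Link key.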

  • Data Vault
  • Python
  • DBT
  • ETL
  • Data Warehouse Architecture
Sunday, February 26, 2023