stack-profile:etl-reverse-etl
ETL / Reverse ETL (Python, Airbyte, dbt, PostgreSQL, Airflow) overview
A bidirectional data integration platform combining Airbyte for extract-load (EL) from hundreds of SaaS sources, dbt for SQL transformations (T), and a reverse-ETL pipeline that pushes enriched data back into operational tools such as CRMs and marketing platforms. Airflow orchestrates the full cycle: Airbyte syncs land raw data in PostgreSQL, dbt models refine it into business entities, and Python scripts push curated segments back to destination APIs. pandas handles data manipulation for custom reverse-ETL connectors, and Docker Compose runs Airbyte, Airflow, and PostgreSQL locally. The tradeoff is managing schema drift across dozens of source connectors and the complexity of bidirectional sync conflict resolution; in exchange, the platform eliminates data silos between analytics and operations.
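The reverse-ETL step described above can be sketched in plain Python: take a curated segment produced by a dbt model, chunk it into batches, and shape each batch into an upsert payload for a destination API. This is a minimal sketch, not a real connector; the segment data, the `email` key field, and the payload shape are all hypothetical assumptions, and an actual script would pull rows via SQLAlchemy/pandas and POST each payload with retries.

```python
import json
from itertools import islice


def batch_records(rows, size=100):
    """Yield successive batches of `size` records for a destination API."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch


def build_payload(batch, id_field="email"):
    """Shape a batch of warehouse rows into a hypothetical CRM upsert payload.

    `id_field` is the key the destination would deduplicate on; the
    "operation"/"key"/"records" envelope is an assumed API contract.
    """
    return json.dumps({
        "operation": "upsert",
        "key": id_field,
        "records": batch,
    })


# Simulated curated segment (in practice: pandas.read_sql against a dbt model).
segment = [{"email": f"user{i}@example.com", "ltv": i * 10} for i in range(250)]
payloads = [build_payload(b) for b in batch_records(segment, size=100)]
print(len(payloads))  # 250 rows in batches of 100 -> 3 payloads
```

Batching like this keeps individual API calls small and makes partial-failure retries cheap, which matters when the destination (a CRM or marketing platform) rate-limits bulk writes.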
Attributes
Outgoing edges
- domain:data-engineering · Domain · Data Engineering
- domain:business-intelligence · Domain · Business Intelligence
- language:python · Language · Python
- tool:airbyte · Tool · Airbyte
- tool:airflow · Tool · Apache Airflow
- library:sqlalchemy · Library · SQLAlchemy
- library:pandas · Library · pandas
- library:boto3 · Library · Boto3
- tool:docker · Tool · Docker
- tool:docker-compose · Tool · Docker Compose
- language:sql · Language · SQL
- workflow:data-pipeline-deployment · Workflow · Data Pipeline Deployment
- workflow:etl-pipeline-cost-optimization · Workflow · ETL Pipeline Cost Optimization
- skill-area:etl-pipelines · SkillArea · ETL Pipelines
- skill-area:python-data-pipelines · SkillArea · Python Data Pipelines
- skill-area:data-quality · SkillArea · Data Quality
- skill-area:schema-evolution · SkillArea · Schema Evolution
- skill-area:data-governance · SkillArea · Data Governance
- role:data-engineer · Role · Data Engineer
- role:analytics-engineer · Role · Analytics Engineer
- role:data-analyst · Role · Data Analyst