Data Engineer - GCP & Looker (m/f/d)

Level VII

  • Berlin
  • Full-time
  • Permanent
[Image: four people of different backgrounds sit smiling on colorful stacks of fabric, beside the statement "We believe in a world where everyone belongs, grows and shines."]

Our Mission

We are looking for an enthusiastic and communicative Data Engineer (m/f/d) to join our BI team in a hybrid work model at our Berlin headquarters.

Our team of 7 Data Engineers ensures that data-driven decisions power our operations. Collaborating closely with data scientists, pricing analysts, and business stakeholders, we build scalable solutions on Google Cloud to drive performance optimization. Our mission is to provide reliable, efficient, and cost-effective data infrastructure that fuels innovation across the company.

What you'll master

  • Looker Development: Design, build, and maintain scalable data models using LookML. Ensure performance, reliability, and intuitive usability across self-serve analytics and curated insights
  • API & Webhook Integration: Develop and manage data ingestion pipelines that integrate REST APIs and webhooks into BigQuery, ensuring real-time or near-real-time delivery (see the first sketch after this list)
  • Airflow Orchestration: Maintain and optimize our Airflow-based data pipelines deployed via Kubernetes, supporting scalable and automated workflows (sketched below)
  • Event-Driven Data Flows: Build event-driven processes using Pub/Sub, Eventarc, and Cloud Run functions to react to data changes and integrate across services (sketched below)
  • BigQuery Optimization: Ensure fast, efficient, and cost-effective query execution through partitioning, clustering, and autoscaling slot usage (sketched below)
  • Security & Observability: Implement data governance practices, enforce access controls, and configure robust monitoring/logging for data reliability and compliance
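
To make the webhook item concrete, here is a minimal sketch of an ingestion endpoint that streams incoming payloads into BigQuery. The route, project, dataset, and table names are illustrative assumptions, not a description of our actual pipelines.

```python
# Hypothetical sketch: a small Cloud Run-style service that receives
# webhook payloads and streams them into BigQuery. All names below
# (route, project, dataset, table) are placeholders.
from flask import Flask, request
from google.cloud import bigquery

app = Flask(__name__)
client = bigquery.Client()

TABLE_ID = "my-project.raw_events.webhook_events"  # assumed target table

@app.route("/webhook", methods=["POST"])
def ingest():
    payload = request.get_json(silent=True)
    if payload is None:
        return "invalid JSON", 400
    # Streaming inserts keep end-to-end latency low (near real time);
    # batch loads are the cheaper option when latency matters less.
    errors = client.insert_rows_json(TABLE_ID, [payload])
    if errors:
        return f"insert failed: {errors}", 500
    return "", 204
```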
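
The Airflow work, sketched under assumed dataset and table names: a small daily DAG that rebuilds a reporting table inside BigQuery, representative of the pipelines we run on Kubernetes.

```python
# Hypothetical sketch: a daily Airflow DAG (Airflow 2.x TaskFlow API).
# Dataset and table names are placeholders.
from datetime import datetime

from airflow.decorators import dag, task
from google.cloud import bigquery

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["bi"])
def refresh_reporting_tables():
    @task
    def rebuild_orders_summary():
        client = bigquery.Client()
        # Illustrative ELT step: materialize a summary table in BigQuery.
        client.query(
            """
            CREATE OR REPLACE TABLE reporting.orders_summary AS
            SELECT order_date, COUNT(*) AS orders
            FROM raw.orders
            GROUP BY order_date
            """
        ).result()

    rebuild_orders_summary()

refresh_reporting_tables()
```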
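
For the event-driven flows, a minimal sketch of a Cloud Run handler receiving Pub/Sub push messages (for example, routed there by Eventarc). The event shape is assumed for illustration.

```python
# Hypothetical sketch: a Cloud Run handler for Pub/Sub push delivery.
# Pub/Sub wraps the payload base64-encoded in message.data.
import base64
import json

from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_event():
    envelope = request.get_json(silent=True)
    if not envelope or "message" not in envelope:
        return "bad request: no Pub/Sub message", 400
    data = base64.b64decode(envelope["message"].get("data", "")).decode("utf-8")
    event = json.loads(data) if data else {}
    # React to the data change here, e.g. kick off a downstream refresh.
    print(f"received event: {event}")
    return "", 204
```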
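
And for the optimization item, a sketch of creating a partitioned, clustered table with the BigQuery Python client; schema and names are placeholders.

```python
# Hypothetical sketch: a partitioned and clustered table created via the
# BigQuery Python client. Project, dataset, and field names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.sales_events", schema=schema)
# Daily partitioning lets date-bounded queries prune whole partitions...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
# ...and clustering co-locates rows that are commonly filtered together.
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

Pruning and clustering cut scanned bytes, the main cost lever under on-demand pricing; slot reservations and autoscaling are the complementary lever for reserved capacity.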

Our Tech Stack

  • Visualization/BI layer: Looker (for dashboards, semantic modeling)
  • Data & Processing: BigQuery, Airflow, Pub/Sub
  • Infrastructure: Kubernetes, Cloud Run
  • Programming: Python (ETL/ELT, APIs, automation)
  • Security & Cost Optimization: IAM, encryption, BigQuery slot reservation management, Policy Tags (see the sketch below)
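
As one concrete form the governance work can take, here is a sketch of attaching a Data Catalog policy tag to a sensitive column at table creation; the taxonomy resource name and schema are placeholders.

```python
# Hypothetical sketch: column-level access control via a policy tag.
# The taxonomy/policy-tag resource name below is a placeholder.
from google.cloud import bigquery

client = bigquery.Client()

pii_tag = bigquery.PolicyTagList(
    names=["projects/my-project/locations/eu/taxonomies/123/policyTags/456"]
)

schema = [
    bigquery.SchemaField("customer_id", "STRING"),
    # Access to this column is then governed by the tag's IAM bindings.
    bigquery.SchemaField("email", "STRING", policy_tags=pii_tag),
]

client.create_table(bigquery.Table("my-project.analytics.customers", schema=schema))
```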

What you bring

  • Visualization & Modeling: Proven experience with Looker; strong understanding of LookML, Explores, persistent derived tables (PDTs), and Looker permissions
  • Event-Based Architecture: Experience working with Google Cloud Pub/Sub, Eventarc, Cloud Run, and webhook-triggered processes
  • Pipeline Management: Expertise with Apache Airflow, including designing, refactoring, and debugging DAGs. Experience monitoring data pipeline SLAs and implementing automated alerting for data quality issues
  • Data Warehousing: Advanced proficiency in BigQuery, including cost-efficient query design and slot reservation strategies
  • Programming: Strong Python skills for building connectors, transformations, and automations
  • Cloud Native: Hands-on experience with GCP and relevant tools like IAM, Policy Tags, and dbt
  • Communication & Mindset: Excellent communication and collaboration skills within cross-functional teams, paired with an ownership mindset that drives proactive identification and resolution of technical issues
  • Please note: We can only consider applications from candidates with a valid work permit

We look forward to your application!

Please send your application to recruitment@momox.biz

Online since: 04.08.2025