Senior Data Engineer - Python & GCP Expertise (M/F)

ITDS

20.04.2026 | Reference: 2419388



Company:

ITDS


Job Description

Senior Data Engineer - Python & GCP Expertise


Ignite the future of data - architect scalable platforms and drive transformative insights!

Lisbon-based opportunity with hybrid work model (3 days remote per week).

As a Senior Data Engineer, you will work for our client, a leading financial services group within a prominent French banking institution. You will play a pivotal role in designing, building, and maintaining cutting-edge data pipelines and platforms on Google Cloud Platform (GCP), enabling data-driven decision making across the organization. Join a team committed to innovation and excellence, shaping the future of data management in the financial industry.

Your main responsibilities:

  • Design, develop, and optimize end-to-end data pipelines (ELT/ETL) to efficiently process large volumes of data from diverse sources.
  • Build robust data models and schemas on GCP utilizing BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and related services.
  • Write Python code to implement data transformations, orchestrate workflows, automate processes, and ensure data quality.
  • Collaborate with Data Scientists, Analysts, and Stakeholders to understand data requirements and deliver scalable solutions.
  • Integrate testing and validation processes, including UAT and QA, to ensure the reliability and accuracy of data pipelines.
  • Monitor data pipeline performance and implement logging, alerting, and governance best practices for data security and compliance.
  • Maintain detailed technical documentation, data dictionaries, runbooks, and test plans.
  • Stay current with GCP offerings, best practices, and innovative data engineering patterns.


You're ideal for this role if you have:

  • 5+ years of professional experience in data engineering or related fields.
  • Strong Python development skills, including design patterns, testing, and performance optimization.
  • Practical experience with Google Cloud Platform (BigQuery, Cloud Storage, Dataflow/Beam, Dataproc, Pub/Sub, Cloud Composer/Airflow).
  • Hands-on experience building and maintaining scalable data pipelines, data lakes, or data warehouses.
  • Proficiency with ETL/ELT concepts, data modeling, and schema design (star/snowflake schemas, normalization).
  • Familiarity with orchestration tools like Airflow or Cloud Composer.
  • Experience with version control systems such as Git.
  • Excellent communication skills in English and French (minimum B2 level).
  • Ability to work independently and collaboratively in agile environments.


It is a strong plus if you have:

  • Certifications or strong working knowledge of GCP (e.g., Professional Data Engineer).
  • Experience with streaming data tools like Kafka or Pub/Sub and real-time analytics.
  • Knowledge of data visualization tools such as Looker, Tableau, or Power BI.
  • Exposure to regulated industries and data privacy regulations (GDPR/CCPA).
  • Familiarity with Control-M or other automation/orchestration tools.
  • Experience with Spark/Databricks, BI tools, or enterprise scheduling solutions.


Language required for the role:

  • Fluent in English and French (speaking, reading, and writing)


Eligibility to work in Europe:

  • Only candidates with an existing legal right to work in the European Union will be considered for this role.


Notes

Porto (Portugal)
