Live Job Openings

Discover and apply for jobs

Data Engineer - GCP (m/f/d)

Permanent
Riyadh, Saudi Arabia
11.03.2025
This role offers a dynamic opportunity to contribute to cutting-edge data engineering projects, working closely with senior team members to develop hands-on expertise. You will play a crucial role in enhancing our data architecture and pipelines, leveraging Google Cloud Platform (GCP).

Key Responsibilities:
  • Collaborate with senior management and solution architects to contribute to strategic data-oriented decisions.
  • Develop and maintain scalable data pipelines, from data ingestion to analytics and machine learning, with a strong focus on GCP services.
  • Design and implement custom data pipelines and extractors utilizing Python, Spark, and GCP tools.
  • Provide insights into data technologies and architecture to support informed business decisions.
  • Participate in the full lifecycle of our data platforms, from architectural design to production deployment, within a GCP environment.
  • Contribute to the evolution of data models, emphasizing data discovery, mapping, and cleansing.
  • Implement and adhere to best practices for data extraction, with guidance from senior team members.
  • Collaborate closely with Data Scientists, Machine Learning Engineers, and other stakeholders to operationalize machine learning models within GCP.
  • Build and maintain infrastructure utilizing Google Kubernetes Engine (GKE) and other containerization technologies.
Skills and Qualifications:
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Minimum of 2 years of professional experience as a Data Engineer.
  • Minimum of 1 year of hands-on experience with Google Cloud Platform (GCP).
  • GCP Certification: Google Cloud Certified Associate Cloud Engineer or Google Cloud Certified Professional Data Engineer is required.
  • Proficiency in GCP services, including BigQuery, Looker, and Cloud Storage.
  • Solid understanding of ETL design (batch and real-time) and data modeling.
  • Proficiency in Python, Spark, and SQL.
  • Strong experience with containerized applications, including Docker and Kubernetes, specifically Google Kubernetes Engine (GKE).
  • Familiarity with real-time data processing and event streaming technologies like Kafka.
  • Experience working with REST APIs.
  • Understanding of DevOps practices and CI/CD methodologies.
  • Familiarity with Agile methodologies.



Are you ready for tomorrow?

Apply online - it only takes 10 minutes.