Job Title: Data Engineer - Al Khobar
Location: Al Khobar, SA
Employment Type: Permanent
Date: 09.08.2024

Data Engineer - Al Khobar
Job Summary
We are looking for a talented Data Engineer. This role offers the opportunity to work closely with senior team members and develop hands-on expertise in data engineering. The focus will be on contributing to our data architecture and pipelines, particularly within Cognite Data Fusion (CDF), an open Industrial DataOps platform, and modern cloud technologies. Your work will play a crucial role in improving data reliability, efficiency, and integration across sectors such as Oil & Gas, Power & Utilities, and Manufacturing.
Key Responsibilities
  • Collaborate with senior management and solution architects to contribute to data-oriented decisions across the organization.
  • Assist in the development and maintenance of scalable data pipelines, from data ingestion to analytics and machine learning.
  • Contribute to custom data pipelines and extractors, leveraging Python, Spark, and other technologies.
  • Help shape business decisions by offering insights into data technologies and architecture.
  • Be involved in the lifecycle of our data platforms, from architecture to production.
  • Assist in the evolution of data models, focusing on data discovery, mapping, and cleansing.
  • Adopt best practices for data extraction, under the guidance of senior team members.
  • Work closely with Data Scientists, Machine Learning Engineers, and other stakeholders to assist in operationalizing machine learning models.
Skills and Qualifications
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 3-5 years of experience in data engineering, with a focus on big data and cloud solutions.
  • Solid understanding of ETL design, both batch and real-time, and data modelling.
  • Proficiency in Python, Spark, SQL, and modern data processing algorithms.
  • Experience with Containerization & Cloud-native tools (e.g., Docker, Kubernetes) and Public/Hybrid Cloud technologies (e.g., Google Cloud Platform, Azure, AWS) is a plus.
  • Familiarity with real-time data processing and event streaming technologies like Kafka.
  • Experience working with REST APIs.
  • Understanding of DevOps practices and CI/CD methodologies.
  • Familiarity with Agile methodologies.