Responsibilities:
- Assist in the design and development of a data warehouse solution
- Implement and maintain data pipelines for the extraction, transformation, and loading (ETL) of data from various sources
- Develop and manage data warehouse schemas and data models
- Perform version control and testing of data pipelines to ensure proper function and reliability
- Monitor and ensure data quality across all pipelines and systems
- Translate business logic and data definitions from source systems into well-documented and accurate warehouse structures
- Collaborate with data analysts, internal tech teams, and external system owners to define requirements and optimize performance
- Support data orchestration initiatives in line with modern data engineering practices
 
Requirements:
- An MQF Level 6 qualification in Computer Science, Data Science, Statistics, Analytics, or a related field with at least 3 years of relevant experience, OR
- An MQF Level 5 qualification with a minimum of 4 years of experience in a similar role
- Preference will be given to candidates with:
  - Experience in data warehousing, including data modelling and ETL processes
  - Proficiency in coding (preferably C# or similar) and data querying (preferably T-SQL)
  - Familiarity with DevOps practices (e.g., Git, CI/CD), cloud platforms (preferably Azure), and orchestration tools such as Data Factory or Azure Synapse
 
Benefits:
- Annual increments
- Additional allowances
- Performance bonus
- Health and life insurance
- Remote work available
- Flexible working hours