Responsibilities:
- Assist in the development and maintenance of a data warehouse solution
- Build data pipelines for efficient extraction, transformation, and loading (ETL) of data
- Develop and manage data models and warehouse schemas
- Test and monitor data pipelines to ensure performance and quality
- Maintain version control and keep documentation aligned with data definitions and business logic
- Collaborate with analysts, system owners, and internal teams to define data requirements and ensure accuracy
- Work in alignment with DevOps and cloud computing best practices where applicable
Requirements:
- An MQF Level 5 qualification in Computer Science, Data Science, Statistics, Analytics, or a related field with at least 2 years of relevant experience, OR
- An MQF Level 4 qualification in a relevant field with at least 3 years of work experience
- Preference will be given to candidates with:
  - Experience in data warehousing, data modelling, and ETL processes
  - Coding and data querying skills (ideally C# and T-SQL)
  - Familiarity with DevOps practices, Git, CI/CD, and cloud platforms (preferably Azure)
  - Exposure to data orchestration tools such as Azure Data Factory or Synapse Analytics
Benefits:
- Annual increments
- Additional allowances
- Performance bonus
- Health and life insurance
- Remote work available
- Flexible working hours