Develop reusable workflows for data ingestion, quality, transformation, and optimization.
Build scalable data pipelines utilizing various tools for extraction, transformation, and loading.
Deploy data solutions into production environments with a focus on efficiency and reliability.
Migrate data from legacy data warehouses while adhering to cloud architecture principles.
Automate data flow processes to enhance consumption and accessibility.
HERE'S WHAT YOU WILL NEED:
Advanced proficiency in AWS Redshift.
Expert proficiency in Platform Engineering.
A minimum of 3 years of experience in related skills.
Bachelor's degree in a relevant field of study.
BONUS POINTS IF YOU HAVE:
Intermediate proficiency in Data Architecture Principles.
Intermediate proficiency in Database Architecture.
Intermediate proficiency in PySpark.