Work Experience: 3 to 7 years in data engineering or related fields.
GCP Services: Proficient in at least four GCP services (Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Cloud Composer, GCS).
Programming Skills: Hands-on experience with Spark, using Scala, Python, or Java.
ETL/ELT Pipelines: Proven experience in building ETL/ELT pipelines.
Data Engineering Knowledge: Understanding of data lakes, data warehouses, data integration, and data migration.
Communication: Excellent written and verbal communication skills.
Version Control: Experience with version control tools (Git, Bitbucket, or AWS CodeCommit).