
About
Senior DevOps Engineer
Responsibilities
- Design and develop infrastructure supporting scalable, reliable data pipelines on cloud platforms (AWS/GCP/Azure) using Kubernetes, Docker, and auto-scaling clusters.
- Develop and maintain CI/CD pipelines for both applications and data pipelines (ETL/ELT) with high stability and automation.
- Manage and optimize data flows across systems, including databases (PostgreSQL, MySQL), data warehouses (BigQuery, Redshift, Snowflake), and object storage (S3, GCS).
- Troubleshoot and resolve performance issues in both infrastructure and data pipelines to keep them running efficiently.
- Set up monitoring and alerting for infrastructure, pipelines, and cost management.
- Implement Infrastructure as Code (IaC) with tools such as Terraform, Helm, and Ansible to manage and provision infrastructure.
- Collaborate with Data Engineers, Data Scientists, and Developers to support data-related projects.
Qualifications
- Bachelor’s degree in IT, Computer Science, Engineering, or a related field.
- 3+ years of experience as a DevOps or DataOps Engineer.
- Strong experience with containerization technologies (Docker, Kubernetes).
- Hands-on experience with cloud services (AWS/GCP/Azure), particularly data-related services such as AWS (ECS, EKS, S3, Glue, Redshift, Step Functions), GCP (BigQuery, Pub/Sub, Dataflow, Composer), and Azure (Data Factory, Synapse).
- Experience with CI/CD tools (Bitbucket Pipelines, GitHub Actions, GitLab CI/CD, Jenkins).
- Proficiency with monitoring tools such as Prometheus, Grafana, the ELK Stack, and Cloud Monitoring.
- Knowledge of data pipeline and workflow orchestration frameworks (Airflow, Dagster, Prefect).
- Experience with messaging/queue systems (Kafka, RabbitMQ, Pub/Sub).
- Ability to explain complex concepts clearly and simply.
- Demonstrated leadership and project management skills.
- Strong collaboration skills in cross-functional teams.
- Good command of English, both verbal and written.