GCP DevOps Engineer managing Google Cloud infrastructure and CI/CD pipelines. Collaborating with teams to automate processes and optimize resources within a growing organization.
Responsibilities
Design, implement, and manage GCP resources, including Virtual Machines (VMs), storage (Cloud Storage, Filestore), networking (VPC, Load Balancers, DNS), databases (Cloud SQL, Firestore), and other platform services.
Configure and maintain scalable, secure, and highly available cloud infrastructure on GCP.
Monitor GCP resource usage, performance, and costs, implementing optimization strategies to ensure efficiency and budgetary compliance.
Implement and enforce GCP security best practices, including Identity and Access Management (IAM), network security controls (firewalls), and data encryption.
Manage and allocate resources efficiently across GCP projects.
Develop, maintain, and manage infrastructure using Terraform to ensure consistent, repeatable, and version-controlled provisioning of GCP resources.
Automate infrastructure provisioning and configuration tasks through reusable Terraform modules.
Design, implement, and maintain robust CI/CD pipelines using Tekton for automated build, test, and deployment of applications and infrastructure on Kubernetes (a minimal pipeline sketch follows this list).
Implement and advocate for DevOps principles, fostering collaboration between development and operations teams.
Maintain comprehensive documentation of configurations, processes, and procedures.
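For illustration, a minimal sketch of the kind of pipeline automation this role covers: triggering a Tekton PipelineRun from Python through the Kubernetes custom-resources API. The pipeline name, namespace, and parameters are hypothetical placeholders, and the sketch assumes Tekton Pipelines (tekton.dev/v1) is installed on the cluster.

    # Trigger a Tekton PipelineRun via the Kubernetes custom-resources API.
    # Minimal sketch: pipeline name, namespace, and params are hypothetical.
    from kubernetes import client, config

    def trigger_pipeline_run(pipeline: str, namespace: str, params: dict) -> dict:
        config.load_kube_config()  # use load_incluster_config() when running in-cluster
        api = client.CustomObjectsApi()
        body = {
            "apiVersion": "tekton.dev/v1",
            "kind": "PipelineRun",
            "metadata": {"generateName": f"{pipeline}-run-"},
            "spec": {
                "pipelineRef": {"name": pipeline},
                "params": [{"name": k, "value": v} for k, v in params.items()],
            },
        }
        # Tekton resources are CRDs, so they are created as namespaced custom objects.
        return api.create_namespaced_custom_object(
            group="tekton.dev",
            version="v1",
            namespace=namespace,
            plural="pipelineruns",
            body=body,
        )

    if __name__ == "__main__":
        run = trigger_pipeline_run("build-and-deploy", "ci", {"git-revision": "main"})
        print("created:", run["metadata"]["name"])

Using generateName gives each run a unique name, so repeated CI triggers do not collide.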
Requirements
3+ years of experience in a GCP Administrator, DevOps Engineer, or similar role.
Proven experience with Google Cloud Platform (GCP) services, including Compute Engine, GKE, Cloud Storage, VPC, Cloud SQL, IAM, and Cloud Monitoring/Logging (formerly Stackdriver).
Extensive hands-on experience with Infrastructure as Code (IaC) using Terraform for provisioning and managing cloud resources.
Strong experience in designing, implementing, and managing CI/CD pipelines, specifically with Tekton.
Proficiency in scripting languages such as Python, Bash, or Go (a short scripting example follows this list).
Solid understanding of containerization technologies (Docker) and orchestration (Kubernetes, GKE).
Experience with version control systems, especially Git.
Strong understanding of networking concepts (TCP/IP, DNS, Load Balancing) and security best practices in a cloud environment.
Excellent problem-solving, analytical, and communication skills.
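As a flavor of the day-to-day scripting mentioned above, here is a small hedged example in Python: auditing Compute Engine instances for a cost-allocation label with the google-cloud-compute client. The project ID ("my-project-id") and label key ("cost-center") are placeholder assumptions, not values from this posting.

    # Flag Compute Engine instances that are missing a cost-allocation label.
    # Sketch only: project ID and label key are placeholders.
    from google.cloud import compute_v1

    def find_unlabeled_instances(project_id: str, required_label: str) -> list[str]:
        instances = compute_v1.InstancesClient()
        missing = []
        # aggregated_list yields (zone, InstancesScopedList) pairs for every zone.
        for zone, scoped in instances.aggregated_list(project=project_id):
            for instance in scoped.instances:
                if required_label not in (instance.labels or {}):
                    missing.append(f"{zone}/{instance.name}")
        return missing

    if __name__ == "__main__":
        for entry in find_unlabeled_instances("my-project-id", "cost-center"):
            print("missing label:", entry)

A report like this feeds directly into the cost-optimization and budgetary-compliance work described in the responsibilities.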