DevOps Engineer – GCP


Betsol

Job title: DevOps Engineer – GCP

Company: Betsol

Job description:

Company Description

BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, is recognized with industry awards, and maintains a net promoter score 2x the industry average. BETSOL's open-source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance. BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers. BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India. We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, 401K, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities.

Job Description

We are seeking a skilled Mid-Level DevOps Engineer to join our dynamic team. Your focus will be on GCP data and cloud computing architecture, ensuring reliable operation and scaling of data pipelines, architecture, and tools. You will also play a crucial role in supporting the back-end integration for apps and tools we build internally.

Key Responsibilities:

Data Architecture:
- Build, maintain, and evolve a secure and reliable data architecture in Google Cloud Platform (GCP)
- Establish and maintain data schemas and definitions to support a common understanding and a single source of truth
- Implement robust QA processes to ensure rock-solid data fidelity
- Maximize the use of Python over SQL to enhance flexibility and maintainability

Cloud Infrastructure Management:
- Design and maintain cloud-based infrastructure and resources
- Use container orchestration tools such as Kubernetes and Docker to deploy, scale, and operate containerized applications (e.g., n8n)

ETL & Data Pipeline Management:
- Build and maintain ETL workflows to handle data ingestion and processing using tools like Cloud Run
- Implement automated pipeline QA and escalation protocols to ensure resilience and minimize downtime

Version Control and CI/CD:
- Manage versioning and collaborative development of code using GitHub
- Implement Continuous Integration/Continuous Deployment (CI/CD) pipelines using Google Cloud Build
- Serve as the team lead and steward of best practices for version control and CI/CD

Data Security and Compliance:
- Implement best practices for data security, compliance, and access control
- Collaborate with external consultants to ensure the highest standards of data security are met

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of experience in DevOps, cloud architecture, or related roles, ideally in a startup context where many hats needed to be worn
- Proficiency in Google Cloud Platform (GCP) services, containerization tools such as Kubernetes and Docker, version control systems (GitHub), and CI/CD practices
- Proficiency in Python and SQL
- Understanding of data security best practices

Skills and Attributes:
- A thirst to take on ambitious projects; someone who takes pride in building incredible things and delivering real impact
- Feels a true sense of satisfaction working in a mission-driven organization that changes lives for the better
- Strong problem-solving skills and adaptability to new challenges
- Enjoys working collaboratively in a team environment
- Self-motivated and eager to learn new technologies and methodologies
- Detail-oriented with a focus on data accuracy and reliability

Location: Puntarenas

Job date: Sun, 17 Nov 2024 23:49:41 GMT




To apply for this job, please visit jobviewtrack.com.
