HL Solutions

DataOps Engineer

Hyderabad, Telangana

Skills/Tools: Airflow, dbt, Jenkins, Kubernetes, Terraform, GitOps

Job Description:

We are seeking a DataOps Engineer to join our rapidly growing company in the energy transition sector. You will play a crucial role in designing all aspects of our data operations, from scalable data pipelines to automation and testing, in support of our collaborative, cross-functional analytics. You will work with cross-functional teams, engaging with a variety of experts and perspectives. Our people are passionate and believe strongly in our mission; our work impacts individuals and communities while moving toward a net-zero future. If you're hungry for opportunity, growth, and meaningful work in a dynamic, fun, and challenging environment, we'd love to hear from you.

What You’ll Do:

  • Design and build robust, scalable data infrastructure (both batch and real-time) to support the needs of product and internal stakeholders.
  • Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery.
  • Utilize your knowledge of DevSecOps to implement a secure and robust cloud-based platform.
  • Support and contribute to development guidelines and standards for data ingestion.
  • Apply emerging DataOps concepts (ELT vs. ETL, GitOps, data clouds vs. data lakes/warehouses, orchestration/automation) to empower a high-performing data engineering team; see the Airflow sketch after this list.
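
To make the last bullet concrete, here is a minimal sketch of a daily batch ELT DAG, assuming Airflow 2.4+ (for the schedule parameter). It lands raw data first and then runs dbt to transform it in-warehouse. The DAG id, paths, and dbt command are illustrative placeholders, not our actual pipeline.

    # Minimal ELT orchestration sketch (Airflow 2.4+). The DAG id, paths,
    # and dbt command are illustrative placeholders, not a real pipeline.
    from datetime import datetime

    from airflow.decorators import dag, task
    from airflow.operators.bash import BashOperator


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
    def daily_elt():
        @task
        def extract_load() -> int:
            # ELT: land raw data in the warehouse as-is; transformation
            # happens in-warehouse via dbt in the next task.
            return 0  # placeholder row count for the raw load

        # Run dbt models once the raw load has completed.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
        )

        extract_load() >> dbt_run


    daily_elt()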

Who You Are:

  • You have strong experience with Snowflake for data warehousing and analytics (see the Snowflake sketch after this list).
  • You have proficiency in building data pipelines using Airflow.
  • You have experience with dbt and SQLMesh for data transformation and modeling.
  • You are proficient in the JavaScript and Python programming languages.
  • You have hands-on experience with Kubernetes and Helm for container orchestration.
  • You have experience with FluxCD and ArgoCD for continuous deployment using GitOps practices.
  • You have a strong understanding of GitOps methodologies.
  • You have extensive experience with AWS and Azure cloud environments.
  • You have experience with Terraform for Infrastructure as Code (IaC).
  • You are familiar with Jenkins and GitHub Actions for continuous integration and delivery.
  • You have strong analytical and problem-solving skills, deep intellectual curiosity, and a results-focused pursuit of answers.
  • You are excited about working in a fast-paced, high-growth environment, embracing change and ambiguity.
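
As a minimal illustration of the hands-on Snowflake experience we look for, the sketch below runs a simple row-count check with the snowflake-connector-python package. The environment variable names and the table are illustrative assumptions; in practice such checks would run as Airflow tasks or dbt tests rather than a standalone script.

    # Minimal Snowflake volume-check sketch using snowflake-connector-python.
    # Credential environment variable names and the table are illustrative.
    import os

    import snowflake.connector


    def row_count(table: str) -> int:
        # Basic volume check: count rows in a trusted, illustrative table.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
        )
        try:
            cur = conn.cursor()
            try:
                cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is trusted here
                return cur.fetchone()[0]
            finally:
                cur.close()
        finally:
            conn.close()


    if __name__ == "__main__":
        print(row_count("ANALYTICS.PUBLIC.ORDERS"))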

