Lead Engineer
- Job ID: 2507039342W
Kenvue is currently recruiting for a:
What we do
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we're the house of iconic brands, including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON'S® and BAND-AID®, that you already know and love. Science is our passion; care is our talent.
Who We Are
Our global team is ~22,000 brilliant people with a workplace culture where every voice matters, and every contribution is appreciated. We are passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage, and have brilliant opportunities waiting for you! Join us in shaping our future, and yours.
Role reports to: Enterprise Integration Lead
Location: Asia Pacific, India, Karnataka, Bangalore
Work Location: Hybrid
What you will do
Role Overview
We are hiring a talented Platform Engineer to design, build, and maintain our cloud-based data platform. This role centers on hands-on development with Snowflake and DBT to create reliable, scalable data infrastructure that supports analytics, machine learning, and business operations. The ideal candidate will work collaboratively with data teams to implement efficient pipelines, automate processes, and ensure platform resilience, while contributing to innovative data solutions.
Key Responsibilities
- Develop and optimize data pipelines using DBT for transformation, modeling, and testing within Snowflake environments.
- Architect scalable data storage and processing solutions in Snowflake, focusing on performance, security, and cost efficiency.
- Implement automation for deployment, monitoring, and maintenance of the data platform using CI/CD practices.
- Troubleshoot and resolve platform issues, including data ingestion, querying, and integration challenges.
- Collaborate with analysts, engineers, and stakeholders to refine requirements and deliver high-quality data infrastructure.
- Enforce data governance standards, including access controls, auditing, and compliance within Snowflake.
- Integrate complementary tools for orchestration, ETL, and observability to enhance platform capabilities.
- Contribute to platform evolution, incorporating feedback and new features for AI/ML support.
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field; relevant certifications a plus.
- 6-8 years of experience in platform or data engineering, with 2+ years hands-on with Snowflake and 1+ year with DBT.
- Proven ability to work in agile teams on complex, production-grade data systems.
- Understanding of cloud infrastructure, distributed systems, and data engineering principles.
Required Skills
- Proficiency in Snowflake for warehousing, SQL optimization, and resource management.
- Strong DBT expertise for building modular, testable data models and workflows.
- Advanced SQL and programming skills (e.g., Python, Java) for scripting and automation.
- Experience with orchestration tools like Airflow, version control (Git), and cloud services (AWS, Azure).
- Knowledge of ETL/ELT processes, data security, and performance tuning.
- Problem-solving abilities with a focus on scalable, maintainable code.
- Effective collaboration and documentation skills.
Preferred Skills
- Familiarity with big data technologies like Spark or Kafka.
- Experience with infrastructure-as-code tools (e.g., Terraform) or containerization (Docker, Kubernetes).
- Certifications in Snowflake, DBT, or cloud platforms.
If you are an individual with a disability, please check our Disability Assistance page for information on how to request an accommodation.