Kenvue is currently recruiting for a:
What We Do
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we are the house of iconic brands, including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON'S® and BAND-AID®, which you already know and love. Science is our passion; care is our talent.
Who We Are
Our global team is made up of ~22,000 talented people with a workplace culture where every voice matters and every contribution is appreciated. We are passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage, and brilliant opportunities are waiting for you! Join us in shaping our future, and yours. For more information, click here.
Role reports to: Enterprise Integration Lead
Location: Asia Pacific, India, Karnataka, Bangalore
Work location: Hybrid
What You Will Do
Role Overview
We are hiring a talented Lead Engineer to design, build, and maintain our cloud-based data integration platforms. This role centres on hands-on development with Snowflake and DBT to create reliable, scalable data infrastructure that supports analytics, machine learning, and business operations. The ideal candidate will work collaboratively with data teams to implement efficient pipelines, automate processes, and ensure platform resilience while contributing to innovative data solutions.
Key Responsibilities
Develop and optimize data pipelines using DBT for transformation, modelling, and testing within Snowflake environments.
Architect scalable data storage and processing solutions in Snowflake, focusing on performance, security, and cost efficiency.
Implement automation for deployment, monitoring, and maintenance of the data platform using CI/CD practices.
Troubleshoot and resolve platform issues, including data ingestion, querying, and integration challenges.
Collaborate with analysts, engineers, and stakeholders to refine requirements and deliver high-quality data infrastructure.
Enforce data governance standards, including access controls, auditing, and compliance within Snowflake.
Integrate complementary tools for orchestration, ETL, and observability to enhance platform capabilities.
Contribute to platform evolution, incorporating feedback and new features for AI/ML support.
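To give a flavour of the DBT-centric responsibilities above, the sketch below is a plain-Python analogue of the kind of not_null and unique data tests that DBT automates against Snowflake models. All names here (check_not_null, check_unique, the sample rows) are illustrative assumptions, not part of any Kenvue system.

```python
# Plain-Python analogue of DBT's not_null / unique generic tests.
# Hypothetical names and data, for illustration only.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None (test failures)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer": "A"},
    {"order_id": 2, "customer": None},
    {"order_id": 2, "customer": "B"},
]

null_failures = check_not_null(orders, "customer")   # one failing row
dupe_failures = check_unique(orders, "order_id")     # order_id 2 repeats
```

In DBT itself these checks would be declared in a model's YAML schema file and run as part of the pipeline, failing the build when any failure rows are returned.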
Qualifications
Bachelor’s degree in Computer Science, Engineering, or a related field; relevant certifications are a plus.
7+ years of total experience, including 3+ years in platform or data engineering, 3+ years hands-on with Snowflake, and 3+ years with DBT.
Proven ability to work in agile teams on complex, production-grade data systems.
Understanding of cloud infrastructure, distributed systems, and data engineering principles.
Required Skills
Proficiency in Snowflake for warehousing, SQL optimization, and resource management.
Strong DBT expertise for building modular, testable data models and workflows.
Advanced SQL and programming skills (e.g., Python, Java) for scripting and automation.
Experience with orchestration tools like Airflow, version control (Git), and cloud services (AWS, Azure).
Knowledge of ETL/ELT processes, data security, and performance tuning.
Problem-solving abilities with a focus on scalable, maintainable code.
Effective collaboration and documentation skills.
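As a small illustration of the ETL/ELT and scripting skills listed above, the sketch below models an idempotent upsert, the role Snowflake's MERGE statement plays in SQL pipelines, in plain Python. The function and data names are hypothetical and for illustration only.

```python
# Illustrative idempotent upsert, analogous to a SQL MERGE step in an
# ELT pipeline. Hypothetical names and data.

def upsert(target, incoming, key):
    """Merge incoming rows into target keyed on `key`; incoming rows win."""
    index = {row[key]: row for row in target}
    for row in incoming:
        index[row[key]] = {**index.get(row[key], {}), **row}
    return sorted(index.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
incoming = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
merged = upsert(target, incoming, "id")
# Re-running with the same incoming batch leaves the result unchanged,
# which is what makes the step safe to retry in an automated pipeline.
```

Idempotency matters here because orchestrators such as Airflow may retry a failed task; a MERGE-style step can be re-run without duplicating rows.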
Preferred Skills
Familiarity with big data technologies like Spark or Kafka.
Experience with infrastructure-as-code tools (e.g., Terraform) or containerization (Docker, Kubernetes).
Certifications in Snowflake, DBT, or cloud platforms.
If you are an individual with a disability, please check our Disability Assistance page for information on how to request an accommodation.