
Lead Analyst

Job category:
Date posted:
Closing date:
ID:
2607043880W


Kenvue is currently recruiting a:

Lead Analyst

What we do

At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we are the house of iconic brands, including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON'S® and BAND-AID®, which you already know and love. Science is our passion; care is our talent.

Who we are

Our global team is made up of ~22,000 talented people, with a workplace culture where every voice matters and every contribution is appreciated. We are passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage, and brilliant opportunities are waiting for you! Join us in shaping our future and yours. For more information, click here.

Role reports to:

Enterprise Integration Lead

Location:

Asia Pacific, India, Karnataka, Bangalore

Work location:

Hybrid

What you will do



Role Overview

We are hiring a talented Lead Engineer to design, build, and maintain our cloud-based data integration platforms. This role centres on hands-on development with Snowflake and DBT to create reliable, scalable data infrastructure that supports analytics, machine learning, and business operations. The ideal candidate will work collaboratively with data teams to implement efficient pipelines, automate processes, and ensure platform resilience while contributing to innovative data solutions.

Key Responsibilities

  • Develop and optimize data pipelines using DBT for transformation, modelling, and testing within Snowflake environments.

  • Architect scalable data storage and processing solutions in Snowflake, focusing on performance, security, and cost efficiency.

  • Implement automation for deployment, monitoring, and maintenance of the data platform using CI/CD practices.

  • Troubleshoot and resolve platform issues, including data ingestion, querying, and integration challenges.

  • Collaborate with analysts, engineers, and stakeholders to refine requirements and deliver high-quality data infrastructure.

  • Enforce data governance standards, including access controls, auditing, and compliance within Snowflake.

  • Integrate complementary tools for orchestration, ETL, and observability to enhance platform capabilities.

  • Contribute to platform evolution, incorporating feedback and new features for AI/ML support.
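To illustrate the kind of DBT-on-Snowflake pipeline work described in the responsibilities above, a minimal incremental dbt model might look like the following sketch (the model, source, and column names here are hypothetical, for illustration only):

```sql
-- models/staging/stg_orders.sql (hypothetical model and source names)
-- Incremental dbt model materialized in Snowflake: on incremental runs,
-- only rows newer than the latest already-loaded timestamp are processed,
-- which keeps warehouse compute (and cost) proportional to new data.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
-- On incremental runs, {{ this }} refers to the already-built target table.
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

In a dbt project this model would be paired with schema tests (e.g. `unique` and `not_null` on `order_id` in a `schema.yml`) and deployed through the CI/CD automation the role calls for.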

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related field; relevant certifications are a plus.

  • 7+ years of total experience, including 3+ years in platform or data engineering, 3+ years hands-on with Snowflake, and 3+ years with DBT.

  • Proven ability to work in agile teams on complex, production-grade data systems.

  • Understanding of cloud infrastructure, distributed systems, and data engineering principles.

Required Skills

  • Proficiency in Snowflake for warehousing, SQL optimization, and resource management.

  • Strong DBT expertise for building modular, testable data models and workflows.

  • Advanced SQL and programming skills (e.g., Python, Java) for scripting and automation.

  • Experience with orchestration tools like Airflow, version control (Git), and cloud services (AWS, Azure).

  • Knowledge of ETL/ELT processes, data security, and performance tuning.

  • Problem-solving abilities with a focus on scalable, maintainable code.

  • Effective collaboration and documentation skills.

Preferred Skills

  • Familiarity with big data technologies like Spark or Kafka.

  • Experience with infrastructure-as-code tools (e.g., Terraform) or containerization (Docker, Kubernetes).

  • Certifications in Snowflake, DBT, or cloud platforms.

If you are an individual with a disability, please check our Disability Assistance page for information on how to request an accommodation.