Kenvue is currently recruiting a: Lead, Data Engineering
What we do
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we're the house of iconic brands - including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON'S® and BAND-AID® that you already know and love. Science is our passion; care is our talent.
Who we are
Our global team is made up of ~22,000 talented people working in a culture where every voice matters and every contribution is appreciated. We are passionate about insights and innovation, and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage - and have brilliant opportunities waiting for you! Join us in shaping our future - and yours. For more information, click here.
Role reports to: Sr. Director, Enterprise Data Products
Location: Asia Pacific, India, Karnataka, Bangalore
Work location: Fully onsite
This position reports to the Manager, Data Engineering and is based at Bangalore, India.
Travel %: up to 10%
What you will do
As the Lead, Data Engineering, you will play a pivotal role in designing and prototyping cutting-edge solutions that empower our operations with timely and relevant data. You'll explore new tools and build cost-effective solutions using the powerful Snowflake and AWS Data Platform. In this position, you will also lead and inspire our cross-vendor Data Engineering team, driving key initiatives that shape the future of our organization.
Key Responsibilities
· Develop, customize, and manage integration tools, databases, warehouses, and analytical systems
· Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products
· Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance
· Implement best practices around systems integration, security, performance and data management
· Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions
· Partner with the business, IT, and technical subject matter experts to ensure execution of enterprise-wide data engineering products & platform development
· Implement DevOps, DataOps and Agile methodologies to improve KPIs like cycle times, consistency, and quality
· Deploy data models into production environments by enriching the model with data stored in a data lake or coming directly from data sources, configuring data attributes, managing computing resources, setting up monitoring tools, etc.
· Monitor the overall performance and stability of the system; adjust and adapt automated pipelines as data, models, and/or requirements change
· Work with the Data Asset and Capability teams to identify the right data sources and finalize the data architectures for optimal data extraction and transformation
· Test the reliability and performance of data engineering pipelines and support testing team with data validation activities
· Partner with various Snowflake Engineering subject matter experts, including project managers and business teams, to scope and build customer-facing content, modules, tools and proofs of concept
· Research and cultivate state-of-the-art data engineering methodologies, drive product innovation, and act as a Snowflake subject matter expert for other engineers
· Mentor and train junior team members
What We Are Looking For
Required Qualifications
· Bachelor's degree in engineering, computer science, or related field.
· 6 – 8 years of total work experience, with at least 4 years of experience in the Snowflake Data Engineering tool stack
· At least one key data engineering professional certification (e.g., SnowPro Core or SnowPro Associate)
· Experience with data management tools such as Airflow, Airbyte and DBT
· Good scripting, data modeling and programming skills
· Understanding of cloud architecture principles and best practices
· Experience in the design and build of end-to-end solutions that meet business requirements and adhere to scalability, reliability, and security standards
· Familiarity with version control systems such as Git/GitHub, and with GitHub Actions and DevOps practices for CI/CD pipelines
Desired Qualifications
· Subject matter expertise in Snowflake Data, Analytics & AI, with experience in Snowflake architecture design, is preferred
· Good understanding of migration and modernization strategies and approaches is preferred
· Experience with Python, EKS implementations, Spark, Kafka, and/or Flask is preferred
What’s in it for you
· Competitive Total Rewards Package*
· Paid Company Holidays, Paid Vacation, Volunteer Time & More!
· Learning & Development Opportunities
· Employee Resource Groups
This list could vary based on location/region
*Note: Total Rewards at Kenvue include salary, bonus (if applicable) and benefits. Your Talent Access Partner will be able to share more about our total rewards offerings and the specific salary range for the relevant location(s) during the recruitment & hiring process.
Kenvue is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, protected veteran status, or any other legally protected characteristic, and will not be discriminated against on the basis of disability.
If you are an individual with a disability, please check our Disability Assistance page to learn how to request an accommodation.