Product Engineering Manager Jobs at Snowflake with Visa Sponsorship
Snowflake hires Product Engineering Managers to lead cross-functional teams building its cloud data platform, and it has a consistent track record of sponsoring work visas for this function. If you're targeting a technical leadership role here, the company treats sponsorship as a standard part of hiring engineering talent.
Overview
Showing 5 of 31+ Product Engineering Manager jobs at Snowflake


INTRODUCTION
At Snowflake, we are powering the era of the agentic enterprise. To usher in this new era, we seek AI-native thinkers across every function who are energized by the opportunity to reinvent how they work. You don’t just use tools; you possess an innate curiosity, treating AI as a high-trust collaborator that is core to how you solve problems and accelerate your impact. We look for low-ego individuals who thrive in dynamic and fast-moving environments and move with an experimental mindset — who rapidly test emerging capabilities to discover simpler, more powerful ways to deliver results. At Snowflake, your role isn't just to execute a function, but to help redefine the future of how work gets done.
ABOUT THE JOB
METADATA: Our team owns the interoperable metadata and transaction foundation for Snowflake’s Lakehouse, building the core platforms behind Snowflake Managed and Externally Managed Iceberg tables. We enable Iceberg tables to tap into the best of Snowflake—replication, sharing, change tracking/row lineage, managed storage, and cross-cloud reliability—while bringing the best of Iceberg, including branching, tagging, and rich versioning, natively into the Snowflake platform. We work at the intersection of distributed systems, open table formats, and core Snowflake services to make open-format workloads first-class citizens in the AI Data Cloud.
DATA LAKE: Catalog Services and Integrations Platform Team
We are looking for visionary leaders to join the Catalog Services & Integrations Platform team. Our mission is to make Snowflake the most interoperable data platform on the planet. We build the foundational infrastructure that enables seamless connectivity between Snowflake and the broader data ecosystem including platforms like Google BigLake, Microsoft OneLake, and Databricks Unity Catalog. If you are passionate about distributed systems, multi-cloud architectures, and shaping the future of open data standards, this is where you can have outsized impact.
What We Do
- Catalog Integrations Platform: We design and build a high-performance, scalable integration layer that allows Snowflake to interoperate with external catalogs. Our platform enables frictionless metadata discovery and access across the modern data stack, breaking down silos and empowering true data mobility.
- Horizon Catalog & Managed Polaris: We power the next generation of open cataloging within Snowflake through Horizon Catalog. Our team delivers managed Polaris capabilities and implements Iceberg REST Catalog (IRC) support, bringing open standards, governance, and interoperability together in a unified experience.
- Open Ecosystem Enablement: We are at the forefront of enabling open table formats and cross-platform compatibility, ensuring Snowflake remains a central player in an increasingly interconnected data landscape.
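To make the catalog interoperability described above concrete, external Iceberg REST catalogs are typically wired into Snowflake through a catalog integration. The sketch below is illustrative only: the catalog URI, namespace, and OAuth credential values are hypothetical placeholders, and exact parameter names should be verified against current Snowflake documentation before use.

```sql
-- Hedged sketch: registering an external Iceberg REST Catalog (IRC)
-- endpoint with Snowflake. The URI, namespace, and OAuth values are
-- hypothetical placeholders; consult Snowflake docs for exact syntax.
CREATE CATALOG INTEGRATION external_irc
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  CATALOG_NAMESPACE = 'analytics'
  REST_CONFIG = (
    CATALOG_URI = 'https://catalog.example.com/api/catalog'
  )
  REST_AUTHENTICATION = (
    TYPE = OAUTH
    OAUTH_CLIENT_ID = '<client-id>'
    OAUTH_CLIENT_SECRET = '<client-secret>'
  )
  ENABLED = TRUE;
```

Once an integration like this exists, Iceberg tables managed by the external catalog can be queried from Snowflake alongside natively managed tables, which is the interoperability this team is responsible for.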
DYNAMIC TABLES
Dynamic Tables (DT) are Snowflake’s declarative engine for building automated, incremental data pipelines in simple SQL: Snowflake handles scheduling, incremental refresh, and reliable DAG execution so customers can focus on modeling their business logic. The Dynamic Tables Orchestration team builds toward the next generation of “Unified Pipelines,” tying DTs together with Tasks, Streams, Snowpark, and dbt. We also build AI-enabled pipeline authoring and streamlining experiences, including prompt-based tuning and data refinement, turning complex platform signals into simple, intuitive workflows for data engineers.
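For readers unfamiliar with the product, the declarative pipeline model above translates into SQL roughly as follows. This is a minimal sketch with made-up table and warehouse names; TARGET_LAG and WAREHOUSE are the parameters that control refresh cadence and compute.

```sql
-- Minimal Dynamic Table sketch: daily_revenue, raw_orders, and
-- transform_wh are illustrative names. Snowflake tracks changes in
-- raw_orders and refreshes daily_revenue incrementally, keeping the
-- result within the declared target lag; no manual scheduling needed.
CREATE OR REPLACE DYNAMIC TABLE daily_revenue
  TARGET_LAG = '1 hour'
  WAREHOUSE = transform_wh
AS
  SELECT order_date, SUM(amount) AS total_revenue
  FROM raw_orders
  GROUP BY order_date;
```

Chaining several such tables, each selecting from the previous one, yields the automatically orchestrated DAG the paragraph above describes.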
Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.
Snowflake is growing fast, and we’re scaling our team to enable and accelerate that growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.
COMPENSATION
The following represents the expected range of compensation for this role:
- The estimated base salary range for this role is $236,000 - $339,250.
- Additionally, this role is eligible to participate in Snowflake’s bonus and equity plan.
The successful candidate’s starting salary will be determined based on permissible, non-discriminatory factors such as skills, experience, and geographic location. This role is also eligible for a competitive benefits package that includes: medical, dental, vision, life, and disability insurance; 401(k) retirement plan; flexible spending & health savings account; at least 12 paid holidays; paid time off; parental leave; employee assistance program; and other company benefits.
To comply with pay transparency requirements and other statutes, you can notify us if you believe that a job posting is not compliant by completing this form.

Tips for Finding Product Engineering Manager Jobs at Snowflake
Align your credentials to Snowflake's engineering leadership profile
Snowflake's Product Engineering Manager roles require demonstrated experience leading distributed engineering teams in cloud or data infrastructure. Before applying, document projects where you owned end-to-end product delivery, as this directly supports the specialty occupation basis for H-1B petitions.
Target teams building Snowflake's core platform products
Snowflake's highest-volume engineering hiring concentrates around its data cloud, Cortex AI, and developer platform functions. Focusing your application on these product areas signals direct relevance and puts you in front of hiring managers with active headcount rather than exploratory pipelines.
Clarify sponsorship scope before your first recruiter screen
Snowflake sponsors H-1B transfers and new cap filings, but confirm early whether your timeline aligns with the April lottery or a cap-exempt transfer. Asking directly in the recruiter screen prevents late-stage surprises when an offer is already on the table.
Use Migrate Mate to filter Product Engineering Manager openings by sponsorship history
Snowflake posts engineering manager roles across multiple business units, and not every opening carries equal sponsorship likelihood. Search the Migrate Mate job board to identify active Product Engineering Manager listings where Snowflake has a documented sponsorship pattern for the role.
Understand how Snowflake's PERM timeline affects Green Card planning
For EB-2 or EB-3 Green Card sponsorship, Snowflake must complete a DOL PERM labor certification before filing your I-140. This process typically takes 12 to 18 months from initiation, so raise long-term immigration goals with your recruiter or HR contact once you have a written offer.
Frequently Asked Questions
Does Snowflake sponsor H-1B visas for Product Engineering Managers?
Yes. Snowflake sponsors H-1B visas for Product Engineering Manager roles, filing both new cap-subject petitions through the annual USCIS lottery and H-1B transfers for candidates who already hold H-1B status with another employer. Under H-1B portability, a transfer candidate can start work at Snowflake as soon as USCIS receives the petition, without waiting for approval.
Which visa types does Snowflake commonly use for Product Engineering Manager roles?
Snowflake sponsors H-1B, TN (for Canadian and Mexican nationals), F-1 OPT, and F-1 CPT for Product Engineering Manager positions. For candidates pursuing permanent residence, the company supports EB-2 and EB-3 Green Card pathways through DOL PERM labor certification followed by an I-140 petition. J-1 visa holders in research or training roles may also be eligible depending on the specific position.
What qualifications does Snowflake expect for a Product Engineering Manager role?
Snowflake typically expects a bachelor's degree or higher in computer science, engineering, or a closely related field, combined with several years of hands-on software engineering experience followed by team leadership. For H-1B eligibility, the role must qualify as a specialty occupation, which Snowflake's engineering management positions generally satisfy given the degree requirement embedded in the job description.
How do I apply for Product Engineering Manager jobs at Snowflake?
Applications go through Snowflake's careers site, where Product Engineering Manager roles are listed by product area. You can also find active openings filtered by sponsorship history on Migrate Mate, which makes it easier to identify roles where Snowflake has actively sponsored candidates in this function. Tailor your resume to highlight cross-functional team ownership and platform-scale product delivery to clear Snowflake's initial screening.
How do I plan my timeline if I need H-1B sponsorship to join Snowflake as a Product Engineering Manager?
H-1B cap registration opens each March for an October 1 start date, so a cap-subject candidate needs to be registered by the late-March deadline of the year they intend to start. If you're currently on F-1 OPT or STEM OPT, Snowflake can employ you on that work authorization while your petition is pending. Once a petition is filed, USCIS premium processing reduces the approval wait to roughly 15 business days, which most technology employers use to reduce uncertainty.