Data Platform Engineer Jobs in USA with Visa Sponsorship
Data Platform Engineers build and maintain large-scale data infrastructure, making them strong candidates for H-1B visa sponsorship. The role typically requires a computer science or engineering degree and involves the kind of specialized technical knowledge that satisfies USCIS specialty occupation requirements. For detailed occupation requirements, see the O*NET profile.
See all 528+ Data Platform Engineer jobs
Sign up for free to unlock all listings, filter by visa type, and get alerts for new Data Platform Engineer roles.
Get Access To All Jobs
Company Overview
Grounded by a history that is deeply rooted in innovation, Hexion is a global employer committed to building and protecting the future by producing innovative performance materials. Our materials are the building blocks for critical industries, including construction, agriculture, energy, automotive, and infrastructure protection. Everywhere you look, you will find our materials and people at work to help customers make products that are stronger, safer, and cleaner. When you work for Hexion, you'll join a team that is committed to operating safely and with integrity to build a more sustainable future for all: our associates, our customers, and the communities where we live and work.
Position Overview
Hexion is seeking a senior, hands-on Azure data engineer to lead the design, implementation, and production support of enterprise data integrations. The role is centered on Azure Data Factory, Azure Logic Apps, and Databricks, as well as adjacent Azure data services, to deliver governed, reliable, and maintainable pipelines. You will work closely with architecture, security, and infrastructure teams to apply the right guardrails to enable Hexion's business objectives.
Job Responsibilities
Azure Data Engineering & Integration
- Lead the design, development, and implementation of Azure Data Factory pipelines, Azure Logic Apps workflows, and related Azure data solutions to meet business requirements.
- Collaborate with business, architecture, and technical teams to translate data requirements into Azure integration deliverables.
- Develop and enforce best practices for data pipeline development, ETL processes, data quality, governance, and documentation.
- Optimize, monitor, and troubleshoot existing pipelines to improve performance, reliability, and maintainability.
- Support integrations across Azure services such as Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Power BI.
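The "data quality" expectations in the bullets above can be pictured as a validation gate that runs inside a pipeline stage. The following is an illustrative sketch in plain Python; the record shape, column names (`order_id`, `amount`), and rules are hypothetical examples, not Hexion's actual pipeline code:

```python
# Illustrative data-quality gate for a pipeline stage.
# Record shape and rules ("order_id" unique, "amount" numeric) are hypothetical.

def validate_rows(rows):
    """Return (valid_rows, errors) for a batch of dict records."""
    valid, errors = [], []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("order_id") is None:
            errors.append((i, "missing order_id"))
        elif row["order_id"] in seen_ids:
            errors.append((i, "duplicate order_id"))
        elif not isinstance(row.get("amount"), (int, float)):
            errors.append((i, "amount is not numeric"))
        else:
            seen_ids.add(row["order_id"])
            valid.append(row)
    return valid, errors

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 12.5},   # duplicate id, should be rejected
    {"order_id": 2, "amount": "bad"},  # non-numeric amount, should be rejected
    {"order_id": 3, "amount": 7.25},
]
valid, errors = validate_rows(batch)
```

In a real ADF or Databricks pipeline the same idea would typically be expressed as a PySpark or SQL quality check, with the error list routed to monitoring rather than returned in memory.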
Data Platform Enablement (Azure)
- Coordinate with platform teams to onboard data workloads into approved Azure environments (subscriptions/resource groups), ensuring required standards (naming, tagging, logging) are met.
- Define repeatable environment and deployment patterns for data products (dev/test/prod), including configuration, secrets, and release boundaries for internal teams and vendors.
- Ensure prerequisite platform capabilities are available for data delivery (e.g., access to ADLS, Azure SQL, Databricks workspaces), partnering with owners to provision when needed.
- Apply governance guardrails for data workloads (policies/controls, logging, and cost visibility) and validate that deployments remain compliant over time.
- Monitor and optimize the cost/performance of data services (e.g., ADF, Databricks, storage) using tagging/chargeback practices, budget alerts, and right-sizing recommendations.
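The naming/tagging standards and cost-visibility practices above imply some form of automated compliance check. A minimal sketch in Python, assuming hypothetical tag keys (`env`, `costCenter`, `owner`) and allowed environment values:

```python
# Illustrative tag-standards check for onboarded Azure resources.
# Tag keys and allowed values are hypothetical, not Hexion's actual standards.

REQUIRED_TAGS = {"env", "costCenter", "owner"}
ALLOWED_ENVS = {"dev", "test", "prod"}

def tag_violations(resource):
    """Return a list of tag-standard violations for one resource dict."""
    tags = resource.get("tags") or {}
    violations = [f"missing tag: {k}" for k in sorted(REQUIRED_TAGS - tags.keys())]
    if "env" in tags and tags["env"] not in ALLOWED_ENVS:
        violations.append(f"invalid env: {tags['env']}")
    return violations

resources = [
    {"name": "adf-data-prod",
     "tags": {"env": "prod", "costCenter": "dp01", "owner": "data-eng"}},
    {"name": "sa-scratch",
     "tags": {"env": "sandbox", "owner": "data-eng"}},  # non-compliant
]
report = {r["name"]: tag_violations(r) for r in resources}
```

In practice this kind of rule is usually enforced declaratively with Azure Policy rather than ad hoc scripts; the sketch only shows the shape of the check.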
Secure Data Access, Identity & Governance
- Define least-privilege access patterns for data services (ADF, ADLS, Azure SQL, Databricks) using Entra ID, managed identities, service principals, and RBAC, then work with administering teams to implement required changes.
- Implement secure secret handling (Key Vault), encryption, and credential rotation approaches for pipelines and integrations.
- Partner with network teams to meet private connectivity requirements (private endpoints, routing, firewall rules) for data sources and targets.
- Ensure data integrations are production-ready and auditable (logging, lineage/documentation where applicable) and aligned to enterprise security and governance requirements.
Infrastructure Automation & Operations
- Implement Infrastructure-as-Code using Terraform and/or Bicep.
- Create and support CI/CD integration for both infrastructure and application or data deployments.
- Set up monitoring, logging, alerting, and basic break-fix support using Azure-native tools.
- Support disaster recovery planning and operational readiness for Azure resources and data services.
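The Infrastructure-as-Code bullet above might look like the following minimal Bicep fragment for a tagged, TLS-enforced ADLS Gen2 storage account. The resource name, tag keys, and API version are illustrative assumptions, not Hexion's actual templates:

```bicep
// Illustrative only: name, tags, and API version are assumptions.
param location string = resourceGroup().location

resource dataLake 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: 'stdataplatformdev001'   // hypothetical naming-standard example
  location: location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
  properties: {
    isHnsEnabled: true           // hierarchical namespace => ADLS Gen2
    minimumTlsVersion: 'TLS1_2'
    allowBlobPublicAccess: false
  }
  tags: { env: 'dev', costCenter: 'data-platform', owner: 'data-eng' }
}
```

The same resource could equally be expressed in Terraform with the `azurerm` provider; the point is that environment standards (tags, TLS, public-access settings) live in version-controlled templates rather than portal clicks.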
Vendor Enablement & Delivery Leadership
- Support vendor onboarding into Azure environments, including access, permissions, deployment boundaries, and operational guardrails.
- Ensure external and internal teams can deploy and operate safely without impacting core enterprise workloads.
- Work closely with Hexion leads, architects, and vendors to unblock delivery and accelerate execution.
- Mentor junior team members and help drive engineering maturity, documentation quality, and continuous improvement across the environment.
Minimum Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent practical experience.
- 7+ years of experience in data engineering, data integration, and production support (cloud and/or hybrid environments).
- Strong hands-on expertise with Azure data services and integration patterns (ADF, ADLS, Azure SQL, Databricks, Logic Apps).
- Proven experience designing and implementing Azure Data Factory pipelines and broader Azure data integration solutions.
- Strong understanding of ETL, data integration, data warehousing, and production support practices.
- Ability to partner with cloud platform, security, and networking teams to meet requirements for connectivity, identity, and controls needed by data workloads.
- Working knowledge of Entra ID and Azure IAM concepts (RBAC, managed identities, service principals) as they apply to securing data pipelines and services.
- Proficiency in SQL, Python, and PySpark.
- Strong troubleshooting, communication, and collaboration skills, with the ability to operate effectively in fast-moving environments.
Required Technologies
- Azure Data & Integration: Azure Data Factory (ADF), Logic Apps, Databricks, ADLS Gen2, Azure SQL, Power BI
- Languages: SQL, Python, PySpark
- Security/Governance: Entra ID, RBAC, Managed Identity, Key Vault
- DevOps/Operations (as applicable): CI/CD, Azure DevOps/GitHub Actions, monitoring/logging, Terraform/Bicep
Preferred Qualifications
- Experience with Azure SQL Database, ADLS, Databricks, and Power BI.
- Experience with Infrastructure-as-Code (Terraform and/or Bicep) and automating deployments for data platforms.
- Exposure to AI/ML, agentic AI, or automation-heavy Azure workloads.
- Experience supporting secure multi-vendor Azure environments.
- Familiarity with Azure monitoring, logging, and disaster recovery patterns.
- Experience implementing or operationalizing controls aligned to ISO/IEC 27018 (protection of personally identifiable information in public clouds) is a plus.
- Familiarity with SAP platforms (ECC R/3, S/4HANA, BW, and Datasphere), especially in the context of integrating SAP data into Azure.
- Azure certifications such as Azure Solutions Architect Expert.
- Databricks certifications such as Data Engineer Associate or Professional.
- Databricks accreditations such as Databricks Fundamentals, Azure Platform Architect, or Platform Administrator.
- AWS experience is a plus.
Success Measures
- Azure data pipelines are designed and operating reliably in production.
- Delivery teams and vendors can release infrastructure builds safely using clear patterns for configuration, access, and deployment.
- Stakeholders receive timely, trustworthy data through integrations that meet agreed SLAs and operational expectations.
- Data quality, monitoring, and alerting are in place so issues are detected early and resolved quickly with clear ownership and runbooks.
- Documentation, repeatability, and delivery maturity improve over time, reducing friction between architecture, platform teams, and delivery teams.
Other
We are an Equal Opportunity, Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to gender, pregnancy, race, national origin, religion, age, sexual orientation, gender identity, veteran or military status, status as a qualified individual with a disability or any other characteristic protected by law.
To be considered for this position, candidates are required to submit an application for employment through our career site and be at least 18 years of age. Any offer of employment will be conditioned upon successful completion of a drug test and background investigation, as well as authorization for the Company to conduct additional periodic background checks as required by the Chemical Facility Anti-Terrorism Standards (CFATS) or regulations adopted by the Department of Homeland Security or other regulatory agencies. A prior criminal record is not an automatic bar to employment, and the Company will conduct an individualized assessment and reassessment, consistent with applicable law, prior to making any final employment decision.

Tips for Finding Visa Sponsorship as a Data Platform Engineer
Highlight distributed systems expertise
Emphasize experience with Kafka, Spark, Hadoop, or cloud data platforms. These specialized skills demonstrate the technical complexity that supports H-1B specialty occupation requirements.
Document your data architecture projects
Prepare detailed examples of data pipelines, ETL processes, or infrastructure you've designed. Concrete technical achievements help employers justify the specialized knowledge requirement.
Target companies with existing data teams
Look for employers already running large-scale data operations. They understand the specialized skills required and are more likely to sponsor visas for platform roles.
Emphasize your degree relevance
Connect your computer science, engineering, or mathematics degree directly to data platform work. USCIS looks for clear alignment between education and job requirements.
Research the company's data stack
Learn about their specific technologies before applying. Demonstrating knowledge of their infrastructure shows genuine interest and technical preparation for the specialized role.
Prepare for technical visa interviews
Be ready to explain your data engineering work in detail. Consular officers may ask technical questions to verify the specialized nature of your role.
Frequently Asked Questions
Do Data Platform Engineers qualify for H-1B visas?
Yes, Data Platform Engineers typically qualify for H-1B visas as the role requires specialized technical knowledge in distributed systems, data architecture, and engineering. The position usually demands a relevant bachelor's degree and demonstrates the complexity USCIS looks for in specialty occupations.
What degree do I need for Data Platform Engineer visa sponsorship?
A bachelor's degree in computer science, software engineering, data science, mathematics, or a closely related technical field is typically required. Some employers may accept equivalent combinations of education and experience, but a relevant degree strengthens your H-1B application significantly.
Which visa types work best for Data Platform Engineers?
H-1B is the most common path, with strong approval rates for technical roles. E-3 visas work for Australians, and TN visas for Canadians and Mexicans under the Computer Systems Analyst classification. O-1 visas are possible for engineers with exceptional achievements in data infrastructure.
How to find Data Platform Engineer jobs with visa sponsorship?
To find Data Platform Engineer jobs with visa sponsorship, use Migrate Mate, which specializes in connecting international tech professionals with sponsoring employers. Focus your search on tech companies, financial services firms, and healthcare organizations that frequently sponsor H-1B, TN, and O-1 visas for data engineering roles. These employers actively seek candidates with cloud platforms, ETL pipeline, and big data expertise.
Do tech companies sponsor Data Platform Engineers?
Yes, major tech companies, data-driven startups, and enterprises with large-scale data operations frequently sponsor Data Platform Engineers. Companies like Amazon, Google, Netflix, and Uber regularly hire and sponsor these roles due to high demand for specialized data infrastructure skills.
How do I prove my Data Platform Engineer role is specialized?
Document your work with complex distributed systems, real-time data processing, or large-scale infrastructure projects. Highlight specific technologies like Kubernetes, Apache Airflow, or cloud platforms. Prepare technical examples that demonstrate the advanced engineering knowledge your role requires beyond basic programming.
What is the prevailing wage requirement for sponsored Data Platform Engineer jobs?
U.S. employers sponsoring a visa must pay at least the prevailing wage, which is what workers in the same role, area, and experience level typically earn. The Department of Labor sets this rate to make sure companies aren't hiring foreign workers simply because they'd accept lower pay than a U.S. worker. It varies by job title, location, and experience. You can look up current prevailing wage rates for any occupation and location using the OFLC Wage Search page.
See which Data Platform Engineer employers are hiring and sponsoring visas right now.
Search Data Platform Engineer Jobs