Conversational AI Engineer Jobs in USA with Visa Sponsorship
Conversational AI Engineers are in high demand from U.S. employers willing to sponsor H-1B, O-1, and EB-1 visas. The role qualifies as a specialty occupation, and LCA filings for AI engineering positions have grown sharply over the past two years. For detailed occupation requirements, see the O*NET profile.
Overview
See all 22+ Conversational AI Engineer jobs
Sign up for free to unlock all listings, filter by visa type, and get alerts for new Conversational AI Engineer roles.
Get Access To All Jobs
Known - Conversational AI Engineer, System Prompt and Orchestration
- Location: San Francisco, CA (In-Person)
- Compensation Range: $170k-$220k Cash + Equity
Known is a matchmaker that talks to users and supports them like a friend. Our mission is to empower humanity by applying general intelligence to human connection.
Users join Known by telling us their life story. On average, our new users talk to our AI voice agent for 27 minutes, giving us a uniquely intimate multi-modal dataset.
We are a team of engineers who’ve created some of the most widely used AI-driven consumer products including Uber Eats, Uber, Faire, and Afterpay.
We love to work hard, with a high degree of autonomy and ownership. We work together in Cow Hollow, San Francisco.
About the Role
We’re looking for founding Conversational AI Engineers to build the prompt systems powering our voice-led onboarding and user experiences.
This is a unique opportunity to work with a hyper-personalized dataset, combining voice transcripts, images, and structured user data to power real-time, personalized, voice-led AI conversations at scale. You’ll work directly with Chen Peng, former head of ML at Uber Eats and Faire.
What You’ll Do
- Prompt Orchestration & Context Optimization: Architecting the core system prompts and managing context windows to ensure highly responsive, contextually relevant, and logically sound AI reasoning without bloating token counts or causing latency spikes.
- EQ & Semantic Memory: Building prompt systems that allow Known to maintain a consistent, empathetic, and uniquely "Known" personality. You'll design mechanisms to seamlessly weave long-term user memories and preferences into real-time dialogue, while helping the user drive the conversation.
- Conversational Intelligence: Designing advanced prompt chains (and fallback logic) to gracefully handle conversational tangents, user interruptions, semantic end-of-turn detection, and complex emotional states so Known feels empathetic and responsive.
- Agentic Workflow Design: Implementing and maintaining the prompt-driven logic for multi-agent frameworks, where your system instructions act as the routing engine between the user, external APIs, and our internal matchmaking engine.
- Evals for Conversational Quality: Developing custom evaluation frameworks to measure "conversational success." You'll go beyond basic fact-checking to rigorously assess conversational dynamism, warmth, engagement, and hallucination reduction.
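To make the last responsibility concrete: a rubric-based scorer is one common way to measure conversational quality beyond fact-checking. The sketch below is a hypothetical illustration, not Known's actual stack; the rubric dimensions and the hand-assigned scores are assumptions (in practice, scores would come from a judge model or human raters).

```python
# Minimal sketch of a rubric-based conversational eval: aggregate per-turn
# scores on qualitative dimensions rather than factual accuracy alone.
from dataclasses import dataclass

RUBRIC = ("warmth", "engagement", "groundedness")  # assumed dimensions

@dataclass
class TurnScore:
    dimension: str
    score: float  # 0.0-1.0, e.g. from a judge model or human rater

def aggregate(scores: list[TurnScore]) -> dict[str, float]:
    """Average each rubric dimension across all scored turns."""
    totals: dict[str, list[float]] = {d: [] for d in RUBRIC}
    for s in scores:
        totals[s.dimension].append(s.score)
    # Dimensions with no scored turns are dropped rather than reported as 0.
    return {d: sum(v) / len(v) for d, v in totals.items() if v}

scores = [TurnScore("warmth", 0.5), TurnScore("warmth", 1.0),
          TurnScore("engagement", 0.8)]
print(aggregate(scores))  # {'warmth': 0.75, 'engagement': 0.8}
```

Tracking these aggregates per prompt version is what turns "warmth" and "engagement" from vibes into regression-testable metrics.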
Requirements
We’re looking for someone who can make automated systems feel undeniably natural:
- 2-3 Years in Conversational AI/NLP: Proven experience designing, testing, and deploying complex LLM applications and system prompts in high-traffic production environments.
- The Prompt Stack: Deep familiarity with state-of-the-art prompt engineering techniques (e.g., Few-Shot, Chain-of-Thought, ReAct).
- Agentic & RAG Architectures: Experience building the "brain" logic for LLMs using frameworks like LangGraph, LlamaIndex, or Haystack to manage complex, non-linear dialogue and dynamic knowledge retrieval.
- Production Hardened: You treat prompts as an engineering problem. You’ve optimized prompt systems for scale, API cost, and speed. You're comfortable with prompt version control, programmatic prompt optimization (e.g., DSPy), and building continuous integration pipelines for AI evals.
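The "Production Hardened" bullet treats prompts as tested, versioned artifacts. A minimal CI-style eval gate might look like the following sketch; the threshold, the test cases, and the `run_prompt` stub are illustrative assumptions (a real pipeline would call a model and use fuzzier matching than string equality).

```python
# Hypothetical CI gate: run canned eval cases against a prompt version and
# fail the build if the pass rate drops below a threshold.
PASS_THRESHOLD = 0.9  # assumed quality bar

def run_prompt(case: dict) -> str:
    # Stand-in for a real model call; echoes the expected reply so this
    # sketch is runnable offline.
    return case["expected"]

def eval_gate(cases: list[dict]) -> bool:
    """Return True if enough cases pass to ship this prompt version."""
    passed = sum(run_prompt(c).strip() == c["expected"].strip() for c in cases)
    rate = passed / len(cases)
    print(f"pass rate: {rate:.2f}")
    return rate >= PASS_THRESHOLD

cases = [{"input": "hi", "expected": "Hello! How can I help?"},
         {"input": "bye", "expected": "Take care!"}]
assert eval_gate(cases)  # prints "pass rate: 1.00"
```

Wiring a gate like this into CI is what "continuous integration pipelines for AI evals" usually means in practice: prompt changes ship only when the eval suite stays above the bar.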
Our Investors
We’re backed by Eurie Kim and Kirsten Green at Forerunner Ventures (the investors behind Decagon, Faire, and Oura), NFX, and PearVC.

How to Get Visa Sponsorship as a Conversational AI Engineer
Emphasize NLP and LLM depth, not breadth
Employers sponsoring visas want specialists, not generalists. Highlight concrete work with large language models, fine-tuning pipelines, or dialogue systems. Specific technical depth makes the specialty occupation case easier for immigration counsel to build.
Target companies with active LCA filings
Companies that have filed Labor Condition Applications for AI or machine learning roles in the past 12 months are far more likely to sponsor again. Search DOL disclosure data to confirm a company's sponsorship history before investing time in an application.
Frame your degree field precisely
Computer Science, Computational Linguistics, and AI or Machine Learning degrees map cleanly to this role. If your degree is in a related field like Mathematics or Cognitive Science, document how your coursework and experience directly support conversational AI work.
Get your portfolio in GitHub-shareable format
Hiring managers for AI roles assess deployed projects, not just resumes. A public repo with a working chatbot, voice assistant, or dialogue model signals real capability and accelerates hiring decisions, which matters when H-1B cap deadlines are involved.
Understand the H-1B lottery timeline before applying
H-1B registration opens in March for an October 1 start date. If you need sponsorship by a specific date, work backward from that deadline. Missing the registration window means waiting a full year, so timing your job search accordingly is critical.
Ask directly about sponsorship in early conversations
Many AI teams want to hire but have not yet confirmed sponsorship budget or legal counsel availability. Raising it early in the recruiting process surfaces real blockers before you invest weeks in interviews and technical assessments with no viable path forward.
Employers are hiring Conversational AI Engineers across the U.S. Find yours.
Frequently Asked Questions
Does a Conversational AI Engineer role qualify as a specialty occupation for H-1B purposes?
Yes. Conversational AI Engineer roles consistently qualify as specialty occupations because they require a bachelor's degree or higher in a specific technical field such as Computer Science, Computational Linguistics, or Artificial Intelligence. USCIS has approved H-1B petitions for AI and NLP engineering roles at a high rate, particularly when the job description clearly requires specialized theoretical and applied knowledge in language model development or dialogue system architecture.
What visa types are realistic for Conversational AI Engineers seeking U.S. work authorization?
H-1B is the most common path, but the annual lottery creates uncertainty. O-1A is a strong alternative for engineers with published research, patents, or recognition from the AI community. Australian citizens can use the E-3 visa, which has no lottery and a fast approval timeline. Canadian and Mexican nationals can explore the TN visa under the Computer Systems Analyst or Engineer category, depending on role specifics.
How can I find U.S. employers willing to sponsor a visa for this role?
Migrate Mate is the most direct way to find Conversational AI Engineer roles from employers actively open to visa sponsorship. Beyond that, DOL LCA disclosure data shows which companies have filed for similar roles in recent years, giving you a verifiable list of sponsorship-willing employers to prioritize in your search.
Does my degree field need to match exactly, or can related fields qualify?
An exact match is not required, but the connection needs to be defensible. Computer Science, Machine Learning, AI, and Computational Linguistics map directly to the role. Degrees in Mathematics, Electrical Engineering, or Cognitive Science can work if your coursework or graduate specialization involved natural language processing or machine learning. The weaker the field match, the more your experience and job duties need to close the gap in the H-1B petition.
Are startups realistic sponsorship options for Conversational AI Engineers, or should I focus on large companies?
Well-funded AI startups are realistic sponsors, particularly Series B and later companies building products around LLMs or voice interfaces. The key variable is whether the company has immigration counsel on retainer and a track record of completing H-1B petitions. Early-stage startups may be willing but lack the infrastructure to execute sponsorship reliably. Ask about their legal counsel and any previous sponsored employees during the interview process.
What is the prevailing wage requirement for sponsored Conversational AI Engineer jobs?
U.S. employers sponsoring a visa must pay at least the prevailing wage, which is what workers in the same role, area, and experience level typically earn. The Department of Labor sets this rate to make sure companies aren't hiring foreign workers simply because they'd accept lower pay than a U.S. worker. It varies by job title, location, and experience. You can look up current prevailing wage rates for any occupation and location using the OFLC Wage Search page.
See which Conversational AI Engineer employers are hiring and sponsoring visas right now.
Search Conversational AI Engineer Jobs