E-3 Visa Conversational AI Engineer Jobs
Conversational AI Engineer roles qualify for E-3 visa sponsorship as specialty occupations requiring a relevant bachelor's degree or higher. Australian citizens can apply year-round with no lottery, securing two-year work authorisation that renews indefinitely while you build NLP, dialogue systems, and LLM-based products for U.S. employers.
Overview
Sign up for free to unlock all listings, filter by visa type, and get alerts for new Conversational AI Engineer roles.
Get Access To All Jobs
Known - Conversational AI Engineer, System Prompt and Orchestration
- Location: San Francisco, CA (In-Person)
- Compensation Range: $170k-$220k Cash + Equity
Known is a matchmaker that talks to users and supports them like a friend. Our mission is to empower humanity by applying general intelligence to human connection.
Users join Known by telling us their life story. On average, our new users talk to our AI voice agent for 27 minutes, giving us a uniquely intimate multimodal dataset.
We are a team of engineers who’ve created some of the most widely used AI-driven consumer products including Uber Eats, Uber, Faire, and Afterpay.
We love to work hard, with a high degree of autonomy and ownership. We work together in Cow Hollow, San Francisco.
About the Role
We’re looking for founding Conversational AI Engineers to build the prompt systems powering our voice-led onboarding and user experiences.
This is a unique opportunity to work with a hyper-personalized dataset, combining voice transcripts, images, and structured user data to power real-time, personalized, AI voice-led conversations at scale. You’ll work directly with Chen Peng, former head of ML at Uber Eats and Faire.
What You’ll Do
- Prompt Orchestration & Context Optimization: Architecting the core system prompts and managing context windows to ensure highly responsive, contextually relevant, and logically sound AI reasoning without bloating token counts or causing latency spikes.
- EQ & Semantic Memory: Building prompt systems that allow Known to maintain a consistent, empathetic, and uniquely "Known" personality. You'll design mechanisms to seamlessly weave long-term user memories and preferences into real-time dialogue, while helping the user drive the conversation.
- Conversational Intelligence: Designing advanced prompt chains (and fallback logic) to gracefully handle conversational tangents, user interruptions, semantic end-of-turn detection, and complex emotional states so Known feels empathetic and responsive.
- Agentic Workflow Design: Implementing and maintaining the prompt-driven logic for multi-agent frameworks, where your system instructions act as the routing engine between the user, external APIs, and our internal matchmaking engine.
- Evals for Conversational Quality: Developing custom evaluation frameworks to measure "conversational success." You'll go beyond basic fact-checking to rigorously assess conversational dynamism, warmth, engagement, and hallucination reduction.
Requirements
We’re looking for someone who can make automated systems feel undeniably natural:
- 2-3 Years in Conversational AI/NLP: Proven experience designing, testing, and deploying complex LLM applications and system prompts in high-traffic production environments.
- The Prompt Stack: Deep familiarity with state-of-the-art prompt engineering techniques (e.g., Few-Shot, Chain-of-Thought, ReAct).
- Agentic & RAG Architectures: Experience building the "brain" logic for LLMs using frameworks like LangGraph, LlamaIndex, or Haystack to manage complex, non-linear dialogue and dynamic knowledge retrieval.
- Production Hardened: You treat prompts as an engineering problem. You’ve optimized prompt systems for scale, API cost, and speed. You're comfortable with prompt version control, programmatic prompt optimization (e.g., DSPy), and building continuous integration pipelines for AI evals.
Our Investors
We’re backed by Eurie Kim and Kirsten Green at Forerunner Ventures (the investors behind Decagon, Faire, and Oura), NFX, and PearVC.

Tips for Finding E-3 Visa Sponsorship as a Conversational AI Engineer
Frame your degree for specialty occupation
Consular officers assess whether your degree field directly supports the Conversational AI Engineer role. A computer science, linguistics, or AI-specific degree aligns cleanly. A general IT degree needs supporting coursework evidence to satisfy the specialty occupation standard.
Target employers with LCA filing history
Search the DOL's Office of Foreign Labor Certification disclosure data for employers who have previously filed LCAs for AI, NLP, or machine learning titles. Prior LCA activity confirms an employer understands the E-3 sponsorship process and won't stall at the paperwork stage.
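The disclosure files are published as spreadsheets, so this search is easy to script. The sketch below, using pandas, shows the idea: keep only E-3 filings whose job title matches an AI/NLP keyword, then list the employers. The column names (`EMPLOYER_NAME`, `JOB_TITLE`, `VISA_CLASS`) follow recent OFLC LCA disclosure files but should be verified against the file you actually download; the sample rows here are hypothetical stand-ins for `pd.read_excel(...)` on a real disclosure file.

```python
import pandas as pd

# Stand-in for rows loaded from an OFLC disclosure spreadsheet, e.g.
# df = pd.read_excel("LCA_Disclosure_Data_FY2024.xlsx")
df = pd.DataFrame({
    "EMPLOYER_NAME": ["Acme AI Inc", "Acme AI Inc", "Widget Corp", "Known"],
    "JOB_TITLE": ["Machine Learning Engineer", "Accountant",
                  "Software Engineer", "Conversational AI Engineer"],
    "VISA_CLASS": ["E-3 Australian", "H-1B", "H-1B", "E-3 Australian"],
})

KEYWORDS = ["conversational ai", "nlp", "machine learning", "dialogue"]

def e3_ai_sponsors(frame: pd.DataFrame) -> list[str]:
    """Employers with at least one E-3 LCA whose title matches an AI keyword."""
    titles = frame["JOB_TITLE"].str.lower()
    mask = frame["VISA_CLASS"].str.startswith("E-3") & titles.apply(
        lambda t: any(k in t for k in KEYWORDS)
    )
    return sorted(frame.loc[mask, "EMPLOYER_NAME"].unique())

print(e3_ai_sponsors(df))  # -> ['Acme AI Inc', 'Known']
```

Widening `KEYWORDS` (or matching on SOC codes instead of titles) trades precision for recall; employers that filed only H-1B LCAs for AI titles are still worth a look, since they already know the LCA process the E-3 reuses.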
Use Migrate Mate to run your job search
Migrate Mate filters roles by E-3 sponsorship eligibility, so you're not cold-applying to employers unfamiliar with the visa. Search specifically for Conversational AI Engineer and related NLP or dialogue systems titles to surface verified sponsoring employers.
Negotiate the offer before LCA certification
Your employer files the LCA with the DOL before your visa interview, and the certified wage locks in your salary floor. Finalise your compensation terms before that filing, not after, since amending a certified LCA requires starting the process again.
Clarify employment structure at the offer stage
Conversational AI Engineers placed at third-party client sites through a staffing arrangement face additional LCA documentation requirements. Confirm directly whether you'll be employed by the end client or a vendor, since the answer changes who files your E-3 paperwork.
Streamline filing with end-to-end support
Once you have a signed offer, use Migrate Mate's E-3 filing service to handle your LCA and visa paperwork. This is where sponsored candidates most often lose time: missed DOL wage levels, incomplete consulate packets, or misclassified SOC codes on AI engineering roles.
Conversational AI Engineer jobs are hiring across the US. Find yours.
Find Conversational AI Engineer Jobs
Conversational AI Engineer E-3 Visa: Frequently Asked Questions
How do I find Conversational AI Engineer jobs that offer E-3 visa sponsorship?
Migrate Mate is built specifically for this search. It surfaces Conversational AI Engineer roles from employers with active E-3 and LCA filing history, so you can focus on companies already familiar with the process. Standard job platforms don't filter by visa sponsorship type, which means most results include employers who can't or won't sponsor an E-3.
How much does it cost to get an E-3 visa?
Migrate Mate's E-3 filing service covers the entire process for $499, including the Labor Condition Application, visa document preparation, and consulate appointment guidance. Traditional immigration lawyers charge $2,000–$5,000+ for the same work. The E-3 has less paperwork than most work visas, so paying thousands for legal help is usually unnecessary.
Does a Conversational AI Engineer role qualify as a specialty occupation for the E-3?
Yes. Conversational AI Engineering meets the specialty occupation standard because it normally requires a bachelor's degree or higher in a specific field such as computer science, computational linguistics, or artificial intelligence. Roles centred on NLP model development, dialogue system architecture, or LLM fine-tuning have a clear and direct degree requirement, which is the core test USCIS and consular officers apply.
How does the E-3 compare to the H-1B for Conversational AI Engineer roles?
Both require specialty occupation status, but the practical difference is significant for Australians. The H-1B is subject to an annual lottery with roughly a one-in-five selection rate, while the E-3 has a 10,500-per-year allocation that has never been exhausted. You can apply for an E-3 any time you have a qualifying offer, without waiting for a lottery window or an October 1 start date.
Can I switch employers after entering the U.S. on an E-3 as a Conversational AI Engineer?
Yes, but the new employer must file a fresh LCA and you'll need a new E-3 visa or an amended status before you start work. Unlike H-1B portability rules, the E-3 doesn't allow you to begin working for a new employer on a pending petition. Build enough runway into your transition timeline to complete the new LCA certification, which DOL targets within seven business days.
See which Conversational AI Engineer employers are hiring and sponsoring visas right now.
Search Conversational AI Engineer Jobs