Senior Data/AI Engineer
Calling All Upstarters!
SENIOR DATA/AI ENGINEER WANTED!
We are Upstart 13: humble, hungry, and competent people who are radically changing the expectations and experience of outsourcing for everyone involved. We challenge barriers that create inequality and bring down borders in technology for people everywhere. We're all about delivering value and doing big things, and we have become a game changer for teams around the world who look to Upstart 13's services as a differentiator.
Job Description:
We are seeking a highly skilled Senior Engineer in Latin America with strong experience in data engineering and AI‑driven applications. The ideal candidate has a proven track record of building scalable backend systems, delivering production data pipelines, and implementing AI‑powered workflows, with a deep understanding of how systems operate end‑to‑end.
This role is deeply technical and hands‑on, focused on designing, developing, and operating services that power intelligent, data‑driven features—including agentic workflows, retrieval‑augmented generation (RAG), and automated assistants—while ensuring that the underlying data foundations remain reliable, efficient, secure, and well‑architected.
Beyond execution, this role requires strong technical judgment: the ability to reason about trade‑offs, guide implementation decisions, and help the team navigate uncertainty around architecture, tooling, and emerging AI patterns. You will collaborate closely with the Solution Architect and cross‑functional teams to deliver scalable, secure, and production‑ready systems across diverse technology stacks.
Responsibilities
AI‑Driven Backend Development
Design and implement AI‑powered workflows such as agents, assistants, and tool‑enabled services.
Build and maintain backend services using modern web frameworks and server‑side languages.
Implement RAG‑style retrieval systems and contribute to prompt engineering and model interaction patterns.
Integrate LLMs into production environments and develop scalable service layers around model inference and orchestration.
Work with vector search technologies and embedding pipelines where needed.
Provide technical guidance on AI workflow design, patterns, and trade‑offs, helping the team make sound implementation decisions.
Data Engineering & Analytics Enablement
Design and operate ETL/ELT pipelines for batch, streaming, or hybrid workloads.
Model data for analytics and application needs using best practices in schema design and storage optimization.
Implement data quality, validation, auditing, and observability mechanisms.
Ensure secure, reliable connectivity across cloud, on‑premise, and hybrid environments.
Collaborate to define semantic business logic, reusable metrics, and shared data assets.
Help define and evolve data architecture standards to ensure long‑term scalability and clarity.
API & Integration Development
Design and implement scalable APIs (REST, queues, events).
Build integrations that connect AI systems to external applications, services, or platforms.
Implement secure patterns for authentication, authorization, and secrets management.
Ensure the overall backend architecture remains modular, maintainable, and extensible.
Identify and resolve architectural inconsistencies or technical debt that could impact system evolution.
Database & Performance Optimization
Write and optimize complex queries across relational and NoSQL databases.
Design efficient schemas, indexing strategies, and caching solutions to support high‑performance AI and data workloads.
Conduct load and performance testing and implement safeguards to meet reliability targets.
Manage and evolve data lake structures to support analytics and AI use cases.
Contribute to data governance and data security best practices.
Proactively surface performance, scalability, or reliability risks and propose mitigation strategies.
Scalability, Security & Reliability
Architect and implement systems for scalability, reliability, and security across backend and data layers.
Add and maintain observability (logs, metrics, traces), alerting, and runbooks.
Contribute to CI/CD, automation, and environment‑based deployments.
Ensure compliance with relevant privacy and security standards (PII handling, HIPAA, SOC 2 Type II, or similar).
Support incident analysis and post‑mortems by driving systems‑level understanding and improvements.
Cross‑Functional Collaboration
Work with AI researchers to productionize models and evaluate their performance.
Collaborate closely with the Solution Architect to align designs with long‑term architectural direction.
Partner with Product to translate AI and data capabilities into meaningful user features.
Help unblock technical ambiguity by clarifying options, constraints, and trade‑offs.
Stay informed on emerging AI, backend, and data engineering technologies and share insights with the team.
Qualifications
Technical skills:
7+ years of experience in data engineering with exposure to backend engineering and AI‑powered systems.
Proven experience delivering production‑grade services, APIs, and data pipelines end‑to‑end.
Strong knowledge of at least one programming language used for data pipelines or backend systems (Python, R, Java, Go, C#).
Hands‑on experience integrating or building systems powered by large language models (LLMs) or similar AI technologies.
Experience implementing RAG pipelines, vector search, or agentic automation frameworks (any stack).
Familiarity with AI agent frameworks (e.g., CrewAI, OpenAI, LangChain).
Strong experience using Jupyter Notebooks for experimentation, validation, and analysis.
Strong understanding of API design, service architecture, and secure integration patterns.
Solid SQL skills and strong grounding in data modeling, schema optimization, and performance tuning.
Familiarity with CI/CD pipelines, infrastructure automation, and cloud environments (any provider).
Ability to reason about systems holistically, from data ingestion through AI inference to production operations.
Soft skills:
Strong ownership and accountability; able to operate independently in fast‑moving environments.
Pragmatic, iterative mindset aligned with product delivery and continuous improvement.
Excellent collaboration skills across engineering, product, and AI/data teams.
Ability to guide technical decisions and support teammates when navigating uncertainty.
Continuous learning mentality, especially with rapidly evolving AI tooling.
Clear, concise communication of technical decisions, trade‑offs, and risks.
Bonus skills:
Experience with vector databases, orchestration tools, or background workers/queues.
Familiarity with cloud providers (AWS, Azure, GCP) and containerized environments.
Experience with streaming technologies, event‑driven patterns, or data lake architectures.
Exposure to performance testing, observability frameworks, or model evaluation/monitoring tools.
Familiarity with Medallion Architecture.
Understanding of emerging ecosystem tools such as Model Context Protocol (MCP).
Why Upstart13?
We put people first at Upstart 13! We believe the world is filled with amazing people and we are willing to go to great lengths to seek out others who share our values to join our cause of bringing down borders in technology for people everywhere.
We develop leaders at Upstart 13. We focus on what matters so we can do meaningful work, we own our shit, we stay curious, and we understand that responsibility leads to giving. We do big things together!
Perks:
Long-term, full-time position.
Fully remote.
Competitive salary in USD.
20+ paid time off days.

Are you ready to join our cause? Be sure to ask, “why 13?”
- Department: Data
- Remote status: Fully remote
- Employment type: Full-time
About Upstart 13
We strategize, solve, and build solutions to business problems with AI, data, and software—grounded in strategic clarity.
From boardroom to build, we connect strategy to execution using all available intelligence—human and otherwise—to help companies achieve efficiency, growth, and competitive advantage.