Who We Are
Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series D funding round brought our total investment to over $320 million, fueling our ambitious vision.
Through our subsidiaries, Alpaca is a licensed financial services company, serving hundreds of financial institutions across 40 countries with our institutional-grade APIs. These include broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totaling over 9 million brokerage accounts.
Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.
Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.
Our Team Members
We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond!
We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply.
Your Role
We are looking for a Head of Data to lead Alpaca’s ~15-person data department across Platform Engineering, Analytics Engineering, and Data Science. You will own the data strategy, the team’s execution, and the department’s roadmap.
The Data department serves every function at Alpaca: it powers partner invoicing and revenue attribution, provides the analytical foundation for sales, product, and compliance, and operates the data platform that processes hundreds of millions of events daily. You will manage three team leads, balancing operational delivery (invoicing, embedded analytics, regulatory reporting) with strategic bets (self-service warehouse, AI-powered analytics, enterprise search).
This is a player-coach role. You will set direction for the team while staying close enough to the technical details to make architecture decisions, unblock your leads, and represent data’s capabilities to the executive team.
Things You Get To Do:
- Lead and develop three sub-teams: Platform Engineering & ETL, Analytics Engineering, and Data Science & Analytics. Manage leads, set priorities, and ensure delivery.
- Own the Data Lakehouse architecture: Trino, Iceberg/GCS, Airflow, Airbyte, Redpanda CDC, dbt. Make build-vs-buy decisions on tooling.
- Drive partner invoicing accuracy and evolution: ensure invoicing logic is versioned, reproducible, and scales with new pricing mechanisms and product launches.
- Deliver embedded analytics: expose warehouse data to partners via BrokerDash, SSR pipelines, and API-based reporting. Own row-level security and entitlements.
- Support product launches with data change management: coordinate data impact analysis for new products (fixed income, global stocks, perps, 24/5 trading) across downstream datasets, dashboards, and reverse ETL.
- Accelerate self-service: move the organization toward self-serve analytics via semantic layers, data catalogs, and conversational BI so the data team can shift from ad-hoc queries to strategic projects.
- Guide AI/ML enablement: oversee enterprise AI search, agent-based workflow automation, and LLM-powered analytics. Help balance vendor solutions with in-house development.
- Collaborate with Finance, Sales, Product, Compliance, and Customer Success to translate business needs into data products.
- Manage infrastructure costs: keep the ratio of data and cloud costs under target as assets under custody (AUC) grow.
- Operate production systems: own on-call processes, incident response, and SLOs for data freshness, accuracy, and availability.
Who You Are (Must-Haves)
- 8+ years in data engineering or analytics, including 3+ years managing data teams (leads + ICs).
- Deep experience with modern data stack: dbt, Trino/Presto or equivalent query engines, Apache Iceberg or similar table formats, cloud object storage.
- Hands-on experience with ETL/ELT patterns at scale: CDC (Debezium/Kafka), batch (Airflow/dbt), streaming, and reverse ETL.
- Track record of building self-service analytics capabilities for non-technical stakeholders.
- Experience with financial data: trading, invoicing, revenue attribution, or regulatory reporting in fintech or financial services.
- Proficiency in Python and SQL. Comfortable reading code, reviewing PRs, and making architecture decisions.
- Experience managing distributed/remote teams across multiple time zones.
- Strong stakeholder management: you can translate between executive priorities and engineering execution.
- Experience with GCP (GKE, GCS, BigQuery migration), Kubernetes, Helm, Terraform.
Who You Might Be (Nice-to-Haves)
- Experience with brokerage or broker-dealer operations (clearing, settlement, market making, reconciliation).
- Familiarity with LLM/AI tooling: MCP, vector databases, enterprise search, conversational BI (WrenAI, Cube).
- Background in compliance analytics (AML/Actimize, KYC, margin calls).
- Exposure to open-source data catalogs (OpenMetadata, Collate) and data quality frameworks.
How We Take Care of You:
- Competitive Salary & Stock Options
- Health Benefits
- New Hire Home-Office Setup: one-time USD 500
- Monthly Stipend: USD 150 per month via a Brex Card
Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.
Recruitment Privacy Policy