This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Staff ML Engineer - BEV in Canada.
In this role, you will take a leading technical position in the development of advanced Bird's-Eye View (BEV) perception systems for autonomous driving applications. You will design and scale multi-modal machine learning models that integrate camera, LiDAR, radar, and map data into unified spatial representations of complex driving environments. Your work will directly shape the perception capabilities of next-generation autonomous trucking systems, with a strong focus on innovation, robustness, and real-world performance. You will operate at the intersection of deep learning research and large-scale production systems, guiding architectural decisions and maturing models from research prototypes into production. This is a high-impact role in which your contributions influence safety-critical autonomous systems deployed in real-world conditions. You will collaborate closely with cross-functional robotics and perception teams in a highly technical, research-driven environment.
Accountabilities
You will lead the design and development of BEV-based perception models that support core autonomous driving tasks such as detection, segmentation, and scene understanding. You will define the technical roadmap for multi-modal fusion architectures and drive innovation in 3D perception systems. You will also oversee large-scale training workflows, ensuring model performance, scalability, and robustness across diverse driving conditions.
- Lead the development of BEV perception models for multi-task autonomous driving applications
- Design and optimize multi-modal architectures combining camera, LiDAR, radar, and HD map data
- Develop and improve large-scale training pipelines, including data sampling, augmentation, and distributed training
- Establish evaluation frameworks for model accuracy, stability, and generalization across edge cases
- Improve robustness of perception systems under long-tail and adverse driving conditions
- Collaborate with sensor, mapping, and fusion teams to ensure system-level consistency
- Mentor ML engineers and promote best practices in experimentation, validation, and code quality
- Stay current with cutting-edge research in 3D vision, BEV modeling, and self-supervised learning
Requirements
This role requires deep expertise in machine learning for perception systems, with strong experience in BEV modeling and multi-modal sensor fusion. You should have a proven track record of developing and scaling deep learning models for 3D scene understanding, ideally in autonomous systems or robotics. Strong programming, leadership, and research capabilities are essential, along with the ability to work on complex distributed training systems and production-grade ML pipelines.
- 10+ years of experience in deep learning, computer vision, robotics perception, or autonomous systems
- M.S. or Ph.D. in Computer Science, Robotics, Electrical Engineering, or equivalent experience
- Strong expertise in BEV modeling, 3D vision, and multi-view geometry
- Hands-on experience with multi-modal fusion of camera, LiDAR, and radar data
- Proficiency in Python and deep learning frameworks such as PyTorch or TensorFlow
- Experience with large-scale training systems and distributed ML infrastructure
- Proven leadership in ML model development and mentoring technical teams
- Strong understanding of model performance trade-offs, including latency, memory, and accuracy considerations
- Excellent communication skills for technical collaboration across teams
- Experience with MLOps tools (e.g., Ray) or autonomous driving systems is a plus
Benefits
- Competitive compensation package including bonus and stock options
- Comprehensive medical, dental, and vision insurance for employees
- RRSP retirement plan with employer matching
- Flexible hybrid or remote work options across Canada and the United States
- Generous paid vacation and company-wide holiday closures
- Life insurance coverage
- Public transit subsidy (Montreal area, where applicable)
- Collaborative, team-oriented, and innovation-driven work culture
- Strong emphasis on work-life balance and professional growth