Careers

ANavS® is hiring!

We are a team of experts for precise positioning solutions and precise mapping systems. Be part of our company, with flat hierarchies and freedom for innovative ideas. ANavS® is located in the attractive city of Munich, Germany. The nearest underground station is only 50 m from our office, and the closest highway entrance is only 2 minutes away by car, so the city center as well as the Bavarian lakes, castles and Alps are easy to reach.

If our job offerings don’t quite meet your needs, we may be able to adapt the position for you. Just ask!

Nothing for you in the vacancy list below?
No problem: we are always looking for new team members for master's theses, working student positions, and full-time roles.
We look forward to your unsolicited application.


You can apply directly here.

Embedded Systems Software Engineer (m/f/d)

Job advertisement Id FT001 online since 12/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods.

The products have a large range of applications including automotive, robotics, automation, maritime, railway, aerospace, agriculture, and mining industries.

You will be part of the agile embedded team and contribute to high-level software components of advanced positioning systems. For our new product generation and software architecture, you will develop reliable, flexible, and scalable code for a wide range of operational scenarios. Your developments will be used in research for autonomous driving, maritime automation, autonomous robots, top-class sports, and many more areas.

Your Core Tasks:

  • Development of management and configuration software for embedded positioning systems as ROS2 nodes (see the sketch after this list)
  • Development of IoT and cloud solutions
  • Development of Continuous Integration and code quality tools
  • Hardening of implementations against security and safety threats
  • Adaptation of our systems to specific customer projects
  • Development of remote maintenance mechanisms
  • Close cooperation with the hardware and algorithm development teams
  • Patenting and publishing of developed approaches is encouraged
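
To give a rough feel for the first task in this list: a ROS2 node typically declares its configuration as parameters and reacts via timers, topics or services. The following rclpy sketch is purely illustrative; the node name, parameter name and timer are assumptions and not part of the ANavS software.

```python
# Hypothetical sketch only: node name, parameter name and timer period are
# illustrative assumptions, not part of the ANavS software.
import rclpy
from rclpy.node import Node


class ConfigNode(Node):
    """Minimal management/configuration node for an embedded positioning system."""

    def __init__(self):
        super().__init__('anavs_config_node')
        # Expose a runtime-configurable parameter with a default value.
        self.declare_parameter('update_rate_hz', 1.0)
        rate = self.get_parameter('update_rate_hz').value
        # Periodically report status (a real node might publish diagnostics instead).
        self.create_timer(1.0 / rate, self.report_status)

    def report_status(self):
        self.get_logger().info('configuration node alive')


def main(args=None):
    rclpy.init(args=args)
    node = ConfigNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```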

Core Qualifications:

These are required to get the job.

  • High motivation to become part of a growing team and to gain new skills
  • Comprehensive experience in Modern C++ and Python software development
  • Knowledge of ROS2 (or similar software frameworks) concepts
  • Good communication skills in English and German
  • Good knowledge of Linux-based operating systems, tools, and underlying functions

Additional Welcome Qualifications:

These will make you stand out among the other job candidates.

  • Experience with other common programming languages
  • Experience with Real-time operating system programming
  • Experience with safety-critical programming
  • Deep understanding of network technologies
  • Deep understanding of Linux based operating systems
  • Experience with web technologies
  • Experience with embedded systems
  • Contributions to open source projects (You’re welcome to share links to your contributions / GitHub account)

 

Embedded Software Engineer (m/f/d)

Job advertisement Id FT010 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

You will be part of the agile hardware team and contribute to low-level software components of the systems. We employ C/C++ in most of our systems for firmware, drivers, and other real-time-constrained components. We also use C/C++ for our sensor fusion code and some internal utilities. Your job will be to develop reliable, flexible and scalable code that keeps our products operational and fault-tolerant in a wide range of operational scenarios.

You will work in a small, flexible and growing team with flat hierarchies and expertise in computer vision, deep learning, software development, sensor fusion and embedded hardware development. You will contribute to a variety of exciting projects, for example in the automotive industry, and work together with partners such as BMW, Continental, Intel, Schaeffler and KIT.

Your Tasks:

  • Development of C/C++ code that interacts closely with the sensor fusion
  • Development of firmware and drivers for next generation positioning systems
  • Hardening of implementations against security and safety threats
  • Adaptation of our systems to specific customer projects
  • Close cooperation with the hardware, software and sensor fusion team
  • Patenting and publishing of developed approaches is encouraged

Your Qualifications:

  • Comprehensive experience in C/C++ software development
  • Computer science degree (or comparable)
  • Good knowledge of C/C++ APIs, recent C++ features (C++14 and later), smart pointers, make, CMake, multi-threaded software, networking, object orientation, and software testing
  • Experience with good coding practices and design patterns
  • High motivation to contribute to the technical development, ability to work independently and willing to adapt to flexible tasks
  • Team player with quick perception, reliability and accuracy
  • Good communication skills in English and German

Sensor Fusion Engineer (m/f/d)

Job advertisement Id FT011 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

You will be part of our agile software development team and contribute to our tightly coupled sensor fusion framework. The core of the algorithm is written in Matlab and C/C++. Your job will be to bring the complementary advantages of all our sensors together in an extended Kalman filter and to improve this fusion. In particular, robust position and attitude determination with meaningful integrity-level estimation will play a major role in your tasks.
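
To make the algorithmic core a little more concrete: an extended Kalman filter alternates a prediction step, which propagates the state and covariance through a motion model, with an update step for each sensor measurement. The NumPy snippet below is only a generic textbook-style sketch of these two steps, not the ANavS framework (which is tightly coupled and written in Matlab and C/C++).

```python
# Generic EKF predict/update sketch with NumPy; state, models and noise
# matrices are placeholders, not the ANavS sensor fusion implementation.
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Propagate state x and covariance P through the motion model f
    with Jacobian F and process noise Q."""
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, h, H, R):
    """Correct the prediction with measurement z, measurement model h,
    Jacobian H and measurement noise R."""
    y = z - h(x_pred)                      # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x_pred + K @ y
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```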

You will work in a growing team with flat hierarchies and expertise in computer vision, deep learning, software development, sensor fusion and embedded hardware development. You will contribute to a variety of exciting projects, for example in the automotive industry, and work together with partners such as BMW, Continental, Intel, Schaeffler and KIT.

Your Tasks:

  • Improvement of our tightly coupled sensor fusion framework
  • Development of innovative approaches for position integrity
  • Extending the sensor fusion framework based on new sensors/chips
  • Development of Continuous Integration (CI) and code quality tools
  • Adaptation of our sensor fusion framework to specific customer projects
  • Close cooperation with our embedded hardware team
  • Patenting and publishing of developed approaches is encouraged

 

Your Qualifications:

  • Comprehensive experience in Matlab and C/C++
  • Degree in electrical engineering, geodesy or aerospace engineering (or comparable) with a focus on GNSS
  • Experience in statistics and filter techniques (e.g. Kalman filter)
  • Experience with good coding practices and design patterns
  • High motivation to contribute to the technical development, ability to work independently and willing to adapt to flexible tasks
  • Team player with quick perception, reliability and accuracy
  • Good communication skills in English and German

Mobile/Web App Developer (m/f/d)

Job advertisement Id FT005 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

Your tasks:
  • Develop a web-based application for our sensor fusion platform
  • Define UI/UX designs for the application
  • Enhance the platform API together with the hardware department
  • Establish a deployment process for the new application
  • Provide a test concept for the application
Your profile:
  • Proficient understanding of web development using established JavaScript frameworks such as React, Angular or Vue
  • Good understanding of native mobile app development for iOS and Android
  • Experience with frameworks like React Native or Ionic would be beneficial
  • Experience with wireless technologies (TCP/IP communication)
  • A good feel for design and user experience
  • Own projects that demonstrate your experience
  • Good communication skills in English and German
  • Team player with quick perception, reliability and accuracy

Embedded Systems Engineer (m/f/d)

Job advertisement Id FT006 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

You will be part of an agile hardware team and develop highly capable next-generation sensor hardware. We develop our own PCBs with integrated power supply, battery management, high-bandwidth wired and wireless communication, and a multitude of sensors in ever-evolving novel combinations. Your job will be to develop circuits and layouts, create and maintain supply chains, and test and assemble prototypes.

You will work in a small, flexible and growing team with flat hierarchies and expertise in computer vision, deep learning, software development, sensor fusion and embedded hardware development. You will contribute to a variety of exciting projects, for example in the automotive industry, and work together with partners such as BMW, Continental, Intel, Schaeffler and KIT.

Your Tasks:

  • PCB Design
  • SPICE simulation
  • Prototype integration and evaluation
  • EM-compatibility engineering
  • CE compliance engineering
  • Close cooperation with the hardware, software and sensor fusion team
  • Patenting and publishing of developed approaches is encouraged

Your Qualifications:

  • Comprehensive experience with PCB design tools (Altium Designer or EAGLE)
  • Degree in embedded systems (or comparable)
  • Experience with Linux and wireless communication systems
  • Craftsmanship
  • High motivation to contribute to the technical development, ability to work independently and willing to adapt to flexible tasks
  • Team player with quick perception, reliability and accuracy
  • Good communication skills in English and German

Computer Vision Engineer - Focus on Camera Systems (m/f/d)

Job advertisement Id FT007 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

You will contribute to the integration of camera sensors into the existing Multi-GNSS/INS-based sensor fusion positioning system, which already achieves centimeter-level accuracy in many scenarios. To bridge challenging GNSS scenarios such as tunnels, urban canyons or trees, the fusion of camera-derived pose measurements from visual odometry or SLAM is desired. Besides the primary focus on precise positioning, visual sensors are used for mapping and environment detection, which includes 3D point cloud and 2D road mapping, object detection and semantic segmentation.

Aside from computer vision, another increasingly relevant topic is integrity monitoring, which may be part of your tasks. This includes monitoring of sensor data, intermediate system states and final positioning outputs to provide system status information and feedback to previous modules. GNSS spoofing detection is part of integrity monitoring, since manipulation of GNSS signals violates the system integrity. Computer vision can contribute to this task by providing alternative positioning solutions.

Your responsibility will cover the whole pipeline, from sensor integration through algorithm development to the interface implementation for the sensor fusion. The goal is the enhancement of our products, in particular our Integrated Sensor Platform (ISP), with real-time-capable algorithms on embedded NVIDIA platforms. Accompanying this technical focus, you will take care of funded as well as customer projects that intersect with your technical expertise.

You will work in a small, flexible and growing team with flat hierarchies and expertise in computer vision, deep learning, software development, sensor fusion and embedded hardware development. You will contribute to a variety of exciting projects, for example in the automotive industry, and work together with partners such as BMW, Continental, Intel, Schaeffler and KIT. ANavS provides free drinks, fruit and snacks, and a kicker table, and is located in Munich Laim with direct connections to U-Bahn, bus and S-Bahn.

 
Your tasks:
  • Development of algorithms for:
    • Visual odometry and visual SLAM, to improve current positioning performance
    • 2D/3D mapping, to generate precise maps for localization
    • Semantic segmentation and object detection, for environment detection
  • Application of state-of-the-art computer vision, machine learning and deep learning techniques
  • Potentially: Development of AI-based integrity monitoring algorithms, including machine learning based spoofing detection to provide system status and feedback
  • Implementation of interfaces between the computer vision and sensor fusion framework
  • Selection, evaluation and integration of camera sensors
  • Camera calibration and time synchronization (e.g. hardware triggering); see the calibration sketch after this list
  • Integration of additional sensors, such as IMU, wheel odometry or GNSS-based pose measurements into computer vision approaches
  • Development of real-time solutions and Docker containers for our embedded platforms
  • Responsibility for funded and customer projects
  • Bringing developed solutions towards product stage
  • Close cooperation with the hardware, software and sensor fusion team
  • Patenting and publishing of developed approaches is encouraged
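
As an illustration of the calibration task above: intrinsic camera calibration from chessboard images is a common approach. The OpenCV sketch below is an assumed example (image folder and board size are hypothetical), not the ANavS tooling.

```python
# Hedged sketch: intrinsic camera calibration from chessboard images using
# OpenCV; file paths and board size are illustrative assumptions.
import glob
import cv2
import numpy as np

board = (9, 6)                                   # inner corners per row/column
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob('calib_images/*.png'):     # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate camera matrix and distortion coefficients.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print('RMS reprojection error:', ret)
```
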
Your profile:
  • Preferably 2 years of experience in computer vision and machine learning/deep learning
  • Computer science degree (or comparable)
  • Solid knowledge of SLAM techniques for localization and mapping
  • Solid knowledge of C++, Python and deep learning frameworks such as PyTorch and TensorFlow
  • Practical experience in training, developing and evaluating deep neural networks for computer vision
  • Preferably experience with camera sensors, calibration, integration and time synchronization
  • Preferably experience with ROS/ROS2 and the CARLA or Gazebo simulators
  • High motivation to contribute to the technical development, ability to work independently and willing to adapt to flexible tasks
  • Team player with quick perception, reliability and accuracy
  • Good communication skills in English and German

Computer Vision Engineer - Focus on Lidar and Radar Systems (m/f/d)

Job advertisement Id FT008 online since 04/2023

ANavS – Advanced Navigation Solutions has three lines of business: precise positioning systems, precise mapping systems and snow monitoring systems. The core of the ANavS positioning systems is a modular and flexibly configurable sensor fusion of GNSS, inertial, odometry, UWB, camera and Lidar measurements. The innovative positioning algorithms were developed and patented by ANavS and include the latest RTK/PPP and AI methods. The main products of ANavS are the Multi-Sensor RTK module, the RTCM base station, and the Integrated Sensor Platform (ISP) with 3 integrated GNSS receivers, an IMU, a wheel odometry interface, 2 cameras, a 3D Lidar, an LTE module for the reception of RTK corrections, and a processor for the sensor fusion. The ANavS products have a large range of applications including the automotive, robotics, automation, maritime, railway, aerospace, agriculture and mining industries.

You will contribute to the integration of Lidar and Radar sensors into the existing Multi-GNSS/INS-based sensor fusion positioning system, which already achieves centimeter-level accuracy in many scenarios. To bridge challenging GNSS scenarios such as tunnels, urban canyons or trees, the fusion of Lidar/Radar-derived pose measurements from Lidar/Radar odometry or SLAM is desired. Besides the primary focus on precise positioning, these sensors are used for mapping and environment detection, which includes 3D point cloud and 2D road mapping, object detection and semantic segmentation.

Aside from computer vision, another increasingly relevant topic is integrity monitoring, which may be part of your tasks. This includes monitoring of sensor data, intermediate system states and final positioning outputs to provide system status information and feedback to previous modules. GNSS spoofing detection is part of integrity monitoring, since manipulation of GNSS signals violates the system integrity. Computer vision can contribute to this task by providing alternative positioning solutions.

Your responsibility will cover the whole pipeline, from sensor integration through algorithm development to the interface implementation for the sensor fusion. The goal is the enhancement of our products, in particular our Integrated Sensor Platform (ISP), with real-time-capable algorithms on embedded NVIDIA platforms. Accompanying this technical focus, you will take care of funded as well as customer projects that intersect with your technical expertise.

You will work in a small, flexible and growing team with flat hierarchies and expertise in computer vision, deep learning, software development, sensor fusion and embedded hardware development. You will contribute to a variety of exciting projects, for example in the automotive industry, and work together with partners such as BMW, Continental, Intel, Schaeffler and KIT. ANavS provides free drinks, fruit and snacks, and a kicker table, and is located in Munich Laim with direct connections to U-Bahn, bus and S-Bahn.

 

Your tasks:
  • Development of algorithms for:
    • Lidar/Radar odometry and SLAM, to improve current positioning performance
    • 2D/3D mapping, to generate precise maps for localization
    • Semantic segmentation and object detection, for environment detection
  • Application of state-of-the-art computer vision, machine learning and deep learning techniques
  • Potentially: Development of AI-based integrity monitoring algorithms, including machine learning based spoofing detection to provide system status and feedback
  • Implementation of interfaces between the computer vision and sensor fusion framework
  • Selection, evaluation and integration of Lidar and Radar sensors
  • Lidar/Radar sensor calibration and time synchronization (e.g. hardware triggering)
  • Integration of additional sensors, such as IMU, wheel odometry or GNSS-based pose measurements into computer vision approaches
  • Development of real-time solutions and Docker containers for our embedded platforms
  • Responsibility for funded and customer projects
  • Bringing developed solutions towards product stage
  • Close cooperation with the hardware, software and sensor fusion team
  • Patenting and publishing of developed approaches is encouraged
Your profile:
  • Preferably 2 years of experience in computer vision and machine learning/deep learning
  • Computer science degree (or comparable)
  • Solid knowledge of SLAM techniques for localization and mapping
  • Solid knowledge of C++, Python and deep learning frameworks such as PyTorch and TensorFlow
  • Practical experience in training, developing and evaluating deep neural networks for computer vision
  • Preferably experience with Lidar/Radar sensors, calibration, integration and time synchronization
  • Preferably experience with ROS/ROS2 and the CARLA or Gazebo simulators
  • High motivation to contribute to the technical development, ability to work independently and willing to adapt to flexible tasks
  • Team player with quick perception, reliability and accuracy
  • Good communication skills in English and German

Firmware Developer for GPS/Galileo-based Measurement of Snow Parameters (m/f/d)

Job advertisement Id FT013 online since 06/2023

Full-time firmware developer for the GPS/Galileo-based measurement of snow parameters – the ideal job for mountain enthusiasts

ANavS GmbH has developed a completely new method for the GPS/Galileo-based measurement of snow parameters (snow water equivalent, liquid water content, snow depth) and a corresponding measurement device. The system consists of 2 GPS/Galileo receivers and a processing unit that derives the attenuations and propagation delays caused by the snow from the GPS/Galileo raw data and determines the snow parameters from them. The range of applications is very diverse and includes, among others, improved water runoff forecasting, improved flood forecasting, optimized operation of hydroelectric power plants, a better understanding of climate change, and scientific research. To support our snow measurement team, we are looking for a firmware developer for one of our locations, either in Munich or in Wattens near Innsbruck.

Your tasks:
Your core task is the further development of the firmware of our snow measurement station in order to increase the reliability of the raw data recording and thereby improve the completeness of the raw data. Very high reliability of the raw data recording is essential, since the accuracy of the snow parameter estimation depends on the availability and completeness of the GPS/Galileo raw data, and some snow measurement sites are not accessible for maintenance and software updates during winter. Your tasks include improving the state machine of the power management and extending the firmware so that remote firmware updates become possible. The cellular signal strength, the provider search and selection, and the network registration shall be monitored, and a diagnostic tool for cellular connectivity, memory usage, power supply, GPS receiver availability, etc. shall be developed. Furthermore, a protocol for interfaces to additional sensor data, such as voltage and temperature, shall be implemented. Finally, systematic testing of the firmware and its integration into the snow measurement stations is also part of your responsibilities.
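
As a rough sketch of the power-management idea: a small state machine advances through sleep, measurement and upload states and falls back on a watchdog-style timeout. The states, inputs and timeout below are assumptions for illustration; the actual station firmware is not written in Python.

```python
# Illustrative sketch only: states, inputs and the timeout value are assumed
# for this example; the actual snow station firmware is not Python based.
import time
from enum import Enum, auto


class PowerState(Enum):
    SLEEP = auto()
    MEASURING = auto()
    UPLOADING = auto()
    ERROR = auto()


def next_state(state, battery_ok, upload_done, upload_started=None, timeout_s=300):
    """Return the next power-management state for one tick of the main loop.
    Real firmware would additionally kick a hardware watchdog in every state."""
    if not battery_ok:
        return PowerState.SLEEP
    if state == PowerState.SLEEP:
        return PowerState.MEASURING
    if state == PowerState.MEASURING:
        return PowerState.UPLOADING
    if state == PowerState.UPLOADING:
        if upload_done:
            return PowerState.SLEEP
        if upload_started is not None and time.monotonic() - upload_started > timeout_s:
            return PowerState.ERROR          # watchdog-style timeout
        return PowerState.UPLOADING
    return PowerState.SLEEP                  # recover from ERROR by sleeping
```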

Our requirements:
You have completed a degree in electrical and information engineering or computer science, have sound experience in developing C/C++, Python and Linux shell scripts, and work in a very careful and structured manner. You are familiar with hardware-related development; you know state machines and watchdogs and can integrate and extend them. As for every developer, bug fixing is part of your responsibilities; you enjoy finding bugs and contributing improvements. A certain willingness to travel – ideally with your own driver's license – is expected for station installations and maintenance (max. 15 days per year). Finally, a certain enjoyment of mountains and snow would be helpful for this, in our opinion, very interesting role.

Please send your application for the Wattens/Innsbruck location by email to:
patrick.henkel@anavs-sensor-tech.at

Please send your application for the Munich location by email to:
patrick.henkel@anavs.de

 

Enhancing the ANavS Positioning System with Camera, Lidar or Radar sensors (m/f/d)

Job advertisement Id MA003 online since 04/2023

The ANavS real-time positioning system provides centimeter-accurate position information and is based on a sensor fusion of satellite signals (GNSS signals) and inertial measurements from an inertial measurement unit (IMU). In addition, vehicle wheel data, barometer and range measurements can be integrated to improve the performance. Camera, Lidar and radar sensors can, on the one hand, be used to further improve and robustify the localization performance, for example in GNSS-denied areas such as tunnels, and, on the other hand, enable environment detection, which includes map creation and object detection.

Another important component to improve the safety of the positioning system is integrity monitoring, which makes sure that all sensor fusion data, such as sensor raw data or navigation solutions, are valid and consistent. This component may profit from camera, Lidar and radar sensor information as well, for example through cross-checks with other sensor or solution data. Scenarios range from simple sensor malfunctions to complex GNSS spoofing attacks – attacks in which satellite signals are manipulated to yield wrong position information.
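
As a toy illustration of such a cross-check: compare the GNSS-based position with an independent camera- or Lidar-derived position and flag the epoch if the discrepancy is too large. The threshold and inputs below are assumptions, not the ANavS integrity monitor.

```python
# Sketch of the cross-check idea: compare the GNSS-based position with an
# independent camera/Lidar-derived position and flag an inconsistency if the
# discrepancy exceeds a threshold. Threshold and inputs are illustrative.
import numpy as np

def position_cross_check(p_gnss, p_vision, threshold_m=1.0):
    """Return True if both position solutions agree within threshold_m metres."""
    residual = np.linalg.norm(np.asarray(p_gnss) - np.asarray(p_vision))
    return residual <= threshold_m

# Example: a discrepancy of several metres would be flagged as a potential
# spoofing event or sensor malfunction.
consistent = position_cross_check([10.0, 4.0, 1.0], [14.8, 6.1, 1.0])
```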

 
Your tasks:
  • Your topic will focus on one of the sensors (camera, Lidar or radar), with a focus on either object detection or localization and mapping, depending on your background, interests and skills as well as our current priorities. Integrity monitoring is another possible thesis direction that does not primarily focus on object detection or localization and mapping.
Your profile:
  • Master student in computer science, robotics, or related areas
  • Experience in computer vision, visual navigation, and preferably satellite-based navigation
  • Very good programming skills in C++ and Python and experience in software development
  • Very good analytical and debugging skills
  • Experience with deep learning and machine learning techniques and frameworks, such as PyTorch or Tensorflow
  • Theoretical knowledge and practical experience in Simultaneous Localization and Mapping (SLAM), or object detection methods would be ideal.
  • Experience with camera, Lidar or radar sensors would be ideal.
  • Strong motivation and ability to work independently
  • Team player
  • Good communication skills in English and preferably in German too

Support of Embedded Systems Team (m/f/d)

Job advertisement Id WS002 online since 04/2023

Support of the development of our embedded sensor systems for automotive and robotic applications, with a focus on architectural and functional software aspects.

 

Your Core Tasks:
  • Development of self-test and failsafe mechanisms
  • Hardening of implementation against security and safety threats
  • Commissioning of Linux-based operating systems for novel embedded systems
  • Work on boot process and tight Linux integration in custom hardware
Optional Tasks:
  • Driver, Firmware and further software development
  • Adaptation of our systems to specific customer projects
  • Development of remote maintenance mechanisms
Core Qualifications:
  • Solid programming skills (C/C++)
  • Solid knowledge of operating systems (focus on Linux)
  • High motivation to gain new skills
Additional Welcome Qualifications:
  • Experience as software developer (C/C++ or Java)
  • Training as Computer Scientist or Electrical Engineer
  • Experience with kernel module development
  • Experience with good coding practices and design patterns
  • Familiarity with web / networking technology
  • Theoretical understanding and practical knowledge of cryptography

Support of our Computer Vision Team (m/f/d)

Job advertisement Id WS004 online since 04/2023

You will support our Computer Vision team in current tasks involving localization, mapping and object detection with camera, Lidar or radar data.

The general focus is on improving and enhancing the current GNSS/inertial-based positioning system, for example in the case of degraded or missing satellite signals, or on adding environment detection capabilities such as map creation or object detection using camera, Lidar or radar sensors.

Your tasks may cover all parts of the pipeline: sensor evaluation, integration and calibration; collection and post-processing of datasets; development support for the data acquisition system, including, for example, the embedded processing platform (NVIDIA Jetson NX Dev. Kit) and Linux-based Docker containers; evaluation of publicly available algorithms (e.g. Visual or Lidar SLAM); algorithm implementation; and the implementation of interfaces or tools in Python or C++. Other potential topics are related to integrity monitoring for the sensor fusion or to sensor-fusion-related implementation tasks. AI, machine learning and deep learning approaches are the basis for several tasks; implementations are usually based on PyTorch or Tensorflow. Your task will be related to our product development or funded projects.
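
To illustrate the kind of deep-learning tooling involved: running a publicly available semantic segmentation model with PyTorch/torchvision on a single camera frame could look like the sketch below; the model choice and file name are assumed examples, not a prescribed ANavS pipeline.

```python
# Example of evaluating a publicly available segmentation network with
# PyTorch/torchvision; model choice and image file are illustrative only.
import torch
from torchvision.io import read_image
from torchvision.models.segmentation import (
    deeplabv3_resnet50, DeepLabV3_ResNet50_Weights)

weights = DeepLabV3_ResNet50_Weights.DEFAULT
model = deeplabv3_resnet50(weights=weights).eval()
preprocess = weights.transforms()

img = read_image('frame_000123.png')        # hypothetical camera frame
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    out = model(batch)['out']               # [1, num_classes, H, W]
labels = out.argmax(dim=1)                  # per-pixel class prediction
```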

 
Your tasks:
  • Will be defined together with you depending on your background, interests and skills and our current priorities.
Your profile:
  • Student in computer science, robotics, or related areas
  • Experience in computer vision, visual navigation, and preferably satellite-based navigation
  • Very good programming skills in C++ and Python and experience in software development
  • Very good analytical and debugging skills
  • Experience with deep learning and machine learning techniques and frameworks, such as PyTorch or Tensorflow
  • Theoretical knowledge and practical experience in Simultaneous Localization and Mapping (SLAM), or object detection methods would be ideal.
  • Experience with camera, Lidar or radar sensors would be ideal.
  • Strong motivation and ability to work independently
  • Team player
  • Good communication skills in English and preferably in German too

Does something sound interesting to you?
We look forward to your application.
Go to application form