This project develops an autonomous robot capable of performing deliveries in urban environments. To drive reliably even in complex surroundings, the robot exploits a node-link representation of terrain information so that it can reach its destination safely and quickly. We are developing a navigation algorithm that remains robust in urban environments by combining multiple sensors for environment perception.
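The node-link navigation above boils down to shortest-path search over a graph of terrain waypoints. A minimal sketch (node names and link costs are hypothetical, not from the project) using Dijkstra's algorithm:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path over a node-link graph: nodes are waypoints,
    links carry traversal costs (e.g. distance weighted by terrain)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical urban node-link map: intersections A..D with link costs.
graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("C", 2.0), ("D", 5.0)],
    "C": [("D", 1.0)],
}
cost, path = dijkstra(graph, "A", "D")
```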
Hyperparameters significantly affect the performance of an intelligent robotic system, but the explicit relationship between them and the performance is unknown (a black box). The system can be evaluated only by sampling, which is usually expensive. The objective is to find a systematic way to make this kind of system optimization tractable.
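As a baseline for such sampling-based tuning, the simplest systematic approach is random search over the hyperparameter space; Bayesian optimization is the usual sample-efficient refinement. A sketch with a toy quadratic standing in for the expensive black-box evaluation:

```python
import random

def evaluate(params):
    # Stand-in for an expensive black-box evaluation of the robotic
    # system; here a toy quadratic with its optimum at (2.0, -1.0).
    x, y = params
    return -((x - 2.0) ** 2 + (y + 1.0) ** 2)

def random_search(evaluate, bounds, budget, seed=0):
    """Pick the best of `budget` uniformly sampled configurations.
    Each call to `evaluate` models one expensive system trial."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(budget):
        params = [rng.uniform(lo, hi) for lo, hi in bounds]
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = random_search(evaluate, [(-5, 5), (-5, 5)], budget=200)
```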
LiDAR-based Detection, Tracking, and Prediction
- Object detection and tracking using deep learning (tracking-by-detection)
- Trajectory prediction of surrounding vehicles using a Convolutional GRU
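Tracking-by-detection associates each new frame's detections with existing tracks; a minimal sketch of greedy IoU-based association (the boxes and threshold are illustrative, not the project's actual values):

```python
def iou(box_a, box_b):
    """Intersection-over-union of axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def associate(tracks, detections, threshold=0.3):
    """Greedily match existing tracks to new detections by IoU,
    highest-overlap pairs first."""
    pairs = sorted(((iou(t, d), ti, di)
                    for ti, t in enumerate(tracks)
                    for di, d in enumerate(detections)), reverse=True)
    matches, used_t, used_d = [], set(), set()
    for score, ti, di in pairs:
        if score < threshold:
            break
        if ti not in used_t and di not in used_d:
            matches.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    return matches

tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]
detections = [(21, 21, 31, 31), (1, 1, 11, 11)]
matches = associate(tracks, detections)
```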
Development and deployment of autonomous mobile robot algorithms applicable to production sites
Development of an inspection system for power equipment in road environments
- Power system design for the vehicle system
- Sensor system development
- HMI development for system monitoring
Organized as a cluster together with the AI laboratory in the Department of Electrical Engineering, our lab is responsible for the autonomous driving implementation: developing AI-based algorithms and the autonomous driving system for urban self-driving.
In anticipation of the impending explosive demand, safely integrating unmanned aerial vehicles into civil airspace requires that we:
- Prepare the critical infrastructure for basic operation of civil RPAS,
- Collect flight data of RPAS under various experimental conditions using the aforementioned infrastructure,
- Analyze the collected data to draft flight safety regulations, a certification process, and operation procedures.
By executing the procedures listed above, our nation can be prepared for the impending era of civil RPAS and participate in the ICAO RPAS rule-making process as a leading country, based on the research results generated from domestic RPAS operation.
This project has been funded by the Ministry of Land, Infrastructure and Transport since 2016.
The objective of this project is to develop a smart drone.
Smart drones can serve a variety of safety- and convenience-related services by leveraging ICT technology.
In particular, our research lab develops vision-based automatic landing and fault-tolerant control algorithms for rotary-wing UAVs.
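Vision-based automatic landing of the kind described can be sketched as a proportional controller that centers the detected landing marker in the image while descending; the gains and the touchdown threshold below are hypothetical, not the project's actual values:

```python
def landing_command(marker_px, image_size, altitude,
                    k_xy=0.005, descent_rate=0.5):
    """Map the detected landing-marker pixel position to velocity
    commands that center the marker while descending.
    marker_px: (u, v) pixel of the marker center; image_size: (w, h)."""
    u, v = marker_px
    w, h = image_size
    # Pixel error from the image center, scaled to lateral velocity.
    vx = -k_xy * (v - h / 2)   # forward/backward
    vy = -k_xy * (u - w / 2)   # left/right
    vz = -descent_rate if altitude > 0.2 else 0.0  # stop near touchdown
    return vx, vy, vz

cmd = landing_command((400, 300), (640, 480), altitude=5.0)
```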
This project has been funded by the Ministry of Trade, Industry and Energy since 2016.
The objective of this project is twofold: to establish a certification system and regulations for operating USVs (Unmanned Surface Vehicles) in Korean waters, and to develop cooperative technology for multiple heterogeneous unmanned vehicles (USV and UAV) along with technology for their maritime application services.
This project has been funded by the Ministry of Oceans and Fisheries since 2015.
- Vision-based aircraft detection using deep learning
This project aims to develop an airborne collision avoidance system and related technology, on par with manned-aircraft performance, for operation in integrated civil aviation airspace. In this study, a remote aircraft is detected using an image sensor. Deep learning enables detection of flying objects against cluttered backgrounds at processing speeds suitable for embedded computers.
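Detectors of this kind typically post-process overlapping candidate boxes with non-maximum suppression (NMS); a minimal sketch with illustrative boxes and scores:

```python
def iou(a, b):
    """Intersection-over-union of boxes given as (x1, y1, x2, y2)."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, threshold=0.5):
    """Keep each highest-scoring box and drop overlapping duplicates."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= threshold for j in keep):
            keep.append(i)
    return keep

# Two overlapping candidates on one aircraft plus a distant one.
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores)
```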
This research has been funded by Uconsystem and the Ministry of Trade, Industry & Energy since 2015.
Autonomous driving and artificial intelligence technologies are the leading fields of the 4th industrial revolution.
In this project, our consortium aims to develop an EV-based open autonomous vehicle platform that provides access to the essential technologies for autonomous driving, such as perception, planning, and control.
Among these, our research team is working on autonomous driving control based on end-to-end deep learning, which predicts optimal control commands from sensing data with a single neural network.
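Reduced to its simplest form, the end-to-end idea is a single network mapping sensing features directly to a control command. A toy sketch; the weights and the 3-feature input are illustrative placeholders, while the real system learns a much larger convolutional network from driving data:

```python
import math

def dense(x, weights, biases):
    """One fully connected layer: y = W x + b."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, biases)]

# Hypothetical hand-set weights for a 3-2-1 network (placeholders only).
W1 = [[0.5, -0.2, 0.1], [-0.3, 0.4, 0.2]]
b1 = [0.0, 0.1]
W2 = [[0.8, -0.6]]
b2 = [0.0]

def steering_net(features):
    """Toy end-to-end policy: sensing features -> steering command."""
    hidden = [math.tanh(h) for h in dense(features, W1, b1)]
    out = dense(hidden, W2, b2)
    return math.tanh(out[0])  # steering in [-1, 1]

# Hypothetical input: lane offset, heading error, road curvature.
steer = steering_net([0.2, -0.1, 0.05])
```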
This project has been funded by the Ministry of Trade, Industry and Energy since 2017.
Recently, the drone market has grown rapidly along with the growing popularity of drones. However, as drone usage has increased, so has its abuse, ranging from minor violations such as filming private life with an onboard camera to serious crimes such as delivering drugs.
This research aims to develop a vision-based UAV that tracks and captures a target drone attacking our camp. To achieve this objective, our research team integrates an autonomous flight system and an on-board image-processing system on a single UAV platform.
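Capturing a moving drone requires leading the target rather than chasing its current position; a minimal sketch of intercept-point prediction under a constant-velocity target model (all positions, speeds, and time steps are hypothetical):

```python
import math

def intercept_point(target_pos, target_vel, chaser_pos, chaser_speed,
                    dt=0.1, horizon=10.0):
    """Find the earliest time t at which a chaser flying straight at
    chaser_speed can reach the target's predicted position, assuming
    the target moves with constant velocity."""
    t = dt
    while t <= horizon:
        # Predicted target position at time t.
        px = target_pos[0] + target_vel[0] * t
        py = target_pos[1] + target_vel[1] * t
        dist = math.hypot(px - chaser_pos[0], py - chaser_pos[1])
        if dist <= chaser_speed * t:   # reachable by then?
            return (px, py), t
        t += dt
    return None, None

# Target crossing 100 m ahead at 10 m/s; chaser is twice as fast.
point, t = intercept_point((100.0, 0.0), (0.0, 10.0), (0.0, 0.0), 20.0)
```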
The purpose of this research is to develop a humanoid robot that acts as the pilot, converting an existing aircraft or car into an unmanned vehicle with minimal modification. In contrast to current approaches such as OPV technology and autopilots, the proposed method gives full authority to the pilot robot while reducing both the number of altered parts and the required time. The pilot-robot hardware consists of four 6-DOF manipulators (two for the arms and two for the legs), two adapters for the hands, a body frame, and a vision sensor. The software architecture is designed to automate flight operation from take-off to landing. Its core is feedback control of the aircraft by manipulating cockpit components based on flight states received from the flight-simulator computer, together with waypoint planning.
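The feedback-control core described above can be sketched as a PID loop that turns a flight-state error into a cockpit-control deflection; the gains and the toy pitch dynamics below are illustrative, not the project's actual model:

```python
class PID:
    """Simple PID controller turning a flight-state error into a
    cockpit-control deflection (gains are illustrative placeholders)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Hold a 5-degree pitch attitude by commanding yoke deflection;
# the plant is a crude first-order stand-in for the simulator.
pitch_pid = PID(kp=0.8, ki=0.5, kd=0.05, dt=0.02)
pitch = 0.0
for _ in range(500):          # 10 seconds at 50 Hz
    yoke = pitch_pid.update(5.0, pitch)
    pitch += 0.02 * (2.0 * yoke - 0.5 * pitch)  # toy pitch dynamics
```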
The purpose of this research is to develop an autonomous robot that compensates for the weaknesses of both UGVs and UAVs by converging the two systems. Unmanned ground robots suffer from relatively short mission range and difficulty traversing obstacles, while UAVs suffer from noise and low energy efficiency. The developed robot can carry out mixed ground/aerial missions that existing unmanned robots cannot.
The UHV has ALFUS autonomy level 4: it decides its actions throughout a mission and notifies the user. It will primarily be used for surveillance and reconnaissance, or for measuring radioactivity, gas, temperature, and humidity with additional sensors. This project demands the full range of unmanned-system techniques, such as indoor/outdoor navigation, guidance, obstacle detection and avoidance, environment modeling, and path planning.
Recently, there have been significant advances in self-driving cars, which will play a key role in future intelligent transportation systems. In order for these cars to be successfully deployed on real roads, they must be able to drive by themselves along collision-free paths while obeying various traffic laws. In contrast to many existing approaches that use pre-built environment maps of roads and traffic signals, we propose a system using a unified map that contains not only information on nearby real obstacles but also traffic signals and pedestrians as virtual obstacles. Using this map, the path planner can efficiently find collision-free paths that also obey traffic laws.
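The unified-map idea can be sketched as a grid in which virtual obstacles (e.g. a red traffic light) block cells exactly like physical ones, so an ordinary planner obeys traffic laws with no special handling; the grid and planner below are a minimal illustration, not the project's actual map:

```python
from collections import deque

FREE, REAL_OBSTACLE, VIRTUAL_OBSTACLE = 0, 1, 2

def plan(grid, start, goal):
    """Breadth-first path search on a unified occupancy grid where
    real and virtual obstacles are treated identically."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == FREE and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

# A red traffic light is inserted as a virtual obstacle (2), forcing
# the planner to detour exactly as it would around a parked car (1).
grid = [
    [0, 0, 0, 0],
    [1, 1, 2, 0],
    [0, 0, 0, 0],
]
path = plan(grid, (0, 0), (2, 0))
```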
EureCar is a self-driving car that drives itself along a pre-planned path while avoiding obstacles and obeying various traffic laws. We have been developing two self-driving cars, EureCar and EureCar Turbo. EureCar, our first self-driving car, carries a high-precision positioning system, 7 laser scanners, and 4 cameras, and took 2 years to develop. EureCar Turbo reached the same performance as EureCar in 6 months with fewer and lower-cost sensors. We were able to shorten the development period from 2 years to 6 months by reusing the software previously developed for EureCar; in particular, the additional software for EureCar Turbo took only 2 months, excluding the period of hardware implementation and remodeling.
We are developing a 3D navigation algorithm that provides a MAV with estimated pose information in real time. The navigation algorithm is based on Monte Carlo Localization (MCL) using a particle filter. The particle filter computes the likelihood of each particle from laser measurements and ray-cast ranges at the particle poses, using an octree structure for fast computation, and thereby estimates the position and attitude of the vehicle. To validate the proposed indoor navigation algorithm, experiments are conducted with a quad-rotor platform, a laser scanner, an IMU, a low-level flight controller, and a ground-station computer. The onboard flight computer runs a low-level controller for position and attitude control: a multi-loop PID controller handles attitude and position, and a guidance algorithm follows waypoints or paths. In addition, obstacle avoidance handles sudden human interference for safety.
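The MCL cycle (predict each particle with the motion command, weight it by comparing a simulated range measurement with the real one, resample) can be sketched in one dimension; the corridor world below is a toy stand-in for ray casting against the octree map, and all numbers are illustrative:

```python
import math
import random

def mcl_step(particles, control, measurement, world, sigma=0.5, seed=None):
    """One Monte Carlo Localization cycle in 1D."""
    rng = random.Random(seed)
    # Predict: apply the motion command with motion noise.
    moved = [p + control + rng.gauss(0.0, 0.1) for p in particles]
    # Weight: compare each particle's expected range to the wall with
    # the measured range (stand-in for octree ray casting).
    weights = [math.exp(-((world - p) - measurement) ** 2 / (2 * sigma ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportionally to weight.
    return rng.choices(moved, weights=weights, k=len(moved))

# Robot in a 10 m corridor with a wall at x = 10; position unknown,
# so particles start uniformly spread along the corridor.
rng = random.Random(1)
particles = [rng.uniform(0.0, 10.0) for _ in range(200)]
true_x = 2.0
for step in range(5):
    true_x += 1.0          # robot moves 1 m per step
    particles = mcl_step(particles, 1.0, 10.0 - true_x, world=10.0, seed=step)
estimate = sum(particles) / len(particles)
```

After a few cycles the particle cloud concentrates around the true position, which is the behavior the full 3D filter exhibits over position and attitude.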