
autonomous driving

Image Frustum to Global 3D
# generate camera frustum
h, w = self.cfg['image']['h'], self.cfg['image']['w']
n_cam, dim, downsampled_h, downsampled_w = feat.size()
# Depth grid
depth_grid = torch.arange(1, 65, 1, dtype=torch.float)
depth_grid = depth_grid.view(-1, 1, 1).expand(-1, downsampled_h, downsampled_w)
n_depth_slices = depth_grid.shape[0]
# x and y grids
x_grid = torch.linspace(0, w - 1, downsampled_w, dtype=torch.f..
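The excerpt breaks off at the x grid. A minimal self-contained sketch of the same frustum construction, with hypothetical values standing in for self.cfg['image'] and the feature-map size, could look like the following:

import torch

# Hypothetical stand-ins for self.cfg['image'] and the feature-map resolution.
h, w = 224, 480                          # full image height and width (assumed)
downsampled_h, downsampled_w = 28, 60    # feature-map height and width (assumed)

# Depth grid: one slice per metre from 1 m to 64 m, as in the excerpt above.
depth_grid = torch.arange(1, 65, 1, dtype=torch.float)
depth_grid = depth_grid.view(-1, 1, 1).expand(-1, downsampled_h, downsampled_w)
n_depth_slices = depth_grid.shape[0]

# x and y grids in pixel coordinates of the full-resolution image.
x_grid = torch.linspace(0, w - 1, downsampled_w, dtype=torch.float)
x_grid = x_grid.view(1, 1, downsampled_w).expand(n_depth_slices, downsampled_h, downsampled_w)
y_grid = torch.linspace(0, h - 1, downsampled_h, dtype=torch.float)
y_grid = y_grid.view(1, downsampled_h, 1).expand(n_depth_slices, downsampled_h, downsampled_w)

# Frustum of (x, y, depth) points per camera, shape (D, H', W', 3); unprojecting
# these with the camera intrinsics and camera-to-global extrinsics yields the
# global 3D points the post title refers to.
frustum = torch.stack((x_grid, y_grid, depth_grid), dim=-1)
print(frustum.shape)  # torch.Size([64, 28, 60, 3])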
Understanding the nuScenes dataset
from nuscenes.nuscenes import NuScenes
nusc = NuScenes(version='v1.0-trainval', dataroot=dataroot, verbose=True)
1. Scene
nuScenes consists of 1000 scenes, each about 20 seconds long; only 850 of them are publicly released.
# load first scene
my_scene = nusc.scene[0]
2. Sample
A scene contains multiple samples. A sample is the set of data (camera, lidar, radar, etc.) collected at a particular point in time. Since nuScenes is annotated at 2 Hz, the gap between two consecutive samples is about 0.5 seconds, and a single scene holds about 40 samp..
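As a quick check of how scenes and samples tie together, the following sketch (the dataroot path is an assumption) walks every sample of the first scene using the devkit's get lookup and the 'next' token chain:

from nuscenes.nuscenes import NuScenes

# The dataroot below is an assumed local path to the extracted dataset.
nusc = NuScenes(version='v1.0-trainval', dataroot='/data/nuscenes', verbose=True)

my_scene = nusc.scene[0]                        # first of the released scenes
sample_token = my_scene['first_sample_token']   # samples form a linked list via 'next'

n_samples = 0
while sample_token:
    sample_record = nusc.get('sample', sample_token)
    n_samples += 1
    sample_token = sample_record['next']        # empty string after the last sample

print(n_samples)  # roughly 40 for a ~20 s scene annotated at 2 Hz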
A collection of useful nuScenes devkit commands
1. How to get scene records
for scene_record in self.nusc.scene:
    yield scene_record['name'], scene_record
2. How to get sample records from a scene record
sample_token = scene_record['first_sample_token']
while sample_token:
    sample_record = self.nusc.get('sample', sample_token)
    sample_token = sample_record['next']
3. How to get the ego lidar pose from a sample record
lidar_record = self.nusc.get('sample_..
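Item 3 is cut off mid-call. Under the public nuScenes schema, a plausible completion that continues from the sample_record of item 2 (the LIDAR_TOP channel name and the ego_pose / calibrated_sensor lookups come from the devkit schema, not from the truncated post) would be:

# 3. (sketch) ego pose via the top lidar's sample_data record
lidar_token = sample_record['data']['LIDAR_TOP']      # sample_data token for the top lidar
lidar_record = self.nusc.get('sample_data', lidar_token)

egopose = self.nusc.get('ego_pose', lidar_record['ego_pose_token'])
# egopose['translation'] is (x, y, z) in the global frame, egopose['rotation'] a quaternion

calib = self.nusc.get('calibrated_sensor', lidar_record['calibrated_sensor_token'])
# calib['translation'] / calib['rotation'] place the lidar in the ego-vehicle frame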
[GTA5/GTAV] End-to-end learning for autonomous driving
In the previous post (visit 2017/07/28 - [TORCS] - End-to-end learning for autonomous driving), I showed you a racing car that controls its steering wheel angle according to the output of a CNN. (The input to the CNN is the resized front-view camera image, and the output is the steering wheel angle that best keeps the car in its lane.) In this post, I will show you a car that runs on the road in GTA5!!!..
Self-driving car that learns from human drivers - part 2
In the previous post (see 2017/07/31 - [TORCS] - Self-driving car that learns from human drivers - part 1), I showed you racing cars that drive according to a simple driving policy. The driving policy described in that post consists of several algorithms, each of which utilizes information about the status of the car and the road. For example, the anti-collision braking system utilizes 'the speed o..
Self-driving car that learns from human drivers - part 1
** TORCS is an open-source freeware car racing simulator that is available for Windows, Linux, Mac OS X, Amiga, FreeBSD, MorphOS, and AROS. Visit http://torcs.sourceforge.net/ for more details. **
If you want an easy way to install TORCS on Ubuntu, visit 2017/07/18 - [TORCS] - How to install TORCS on ubuntu 16.04LTS.
I am working on a project, "teaching a car how to drive". As a preliminary work, I ..
End-to-end learning for autonomous driving
TORCS, the open racing car simulator, is an interesting tool for A.I. racing. (If you want to know how to install it, see 2017/07/18 - [TORCS] - How to install TORCS on ubuntu 16.04LTS.) One can build a racing robot that runs on TORCS tracks according to his/her own driving policy. (Very specific information such as speed, position, track info, etc. can be obtained in real time!! So you can make a..