First, I want to thank everyone who took part in our survey about the channel rebranding. Your feedback is very valuable, and I will take everything you wrote into account in the rebranding, which we will carry out in the spring. If you have not yet participated, go to the “Community” tab – I will be glad to read your opinion. Now to the topic of this release. Virtually all futurologists and analysts believe that the driving profession will disappear in the near future. Let’s see how close that future really is, how autonomous cars work now, and what they can actually do. Like the video and subscribe to the channel. Let’s go!

There are six levels of vehicle autonomy. If level zero means, at most, ABS or parking sensors, then from level three onward there is a limited autopilot capable of operating independently, but strictly within certain scenarios. The driver can go about their business without grabbing the wheel every half minute. Level four is autopilot in cities. All such a vehicle needs for complete autonomy is a 3D map of the area; if it reaches a place where it cannot pull a map from its database, it will go no further. Finally, level five is the ideal autonomous car, able to drive in any weather anywhere on the planet. It is the equal of an experienced human driver.

How do real autonomous cars perceive space? The first way is with cameras. The data they capture is processed by a dedicated processor, which evaluates and classifies the image and passes it to the “brain” of the machine, which decides on the optimal behavior. One such work cycle on, for example, a Visconti 4 processor takes up to 50 ms. A human driver’s reaction time is at least 500 ms. In that time, a car traveling at 80 km/h covers about 11 meters – a long distance when a dangerous situation arises on the road.
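The reaction-distance figures above are easy to verify with a few lines of Python (a simple sketch; the function name is mine):

```python
def distance_covered(speed_kmh: float, reaction_time_s: float) -> float:
    """Distance travelled (in metres) during a given reaction time."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * reaction_time_s

# Human driver: ~500 ms reaction time at 80 km/h
print(round(distance_covered(80, 0.5), 1))   # 11.1 m
# A ~50 ms processing cycle at the same speed
print(round(distance_covered(80, 0.05), 1))  # 1.1 m
```

So at 80 km/h the machine’s 50 ms cycle costs barely a metre of travel, roughly ten times less than a human’s reaction.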
But cameras cannot recognize distant objects or build detailed maps, and their performance depends on the weather. These shortcomings can be compensated by radars emitting radio signals at frequencies of tens of gigahertz. They do an excellent job of localizing objects, but without defining their shape and only within a narrow field of view. Lidars are considered the most effective “eyes” of a driverless car. They build a detailed 3D map of the world for tens of meters around, with an accuracy unattainable by other sensors, recognizing cars, people, and any obstacles. But lidars have flaws. First, they are helpless in heavy rain or snow: the laser beams reflect off drops and snowflakes. Second, a lidar needs a full 360-degree view, which means it has to sit right on the roof of the car, which is simply ugly. Imagine a Tesla Model S with such a hump on top! And third, lidars are not just expensive, they are very expensive.

So how do real driverless cars work? To implement autopilot, Tesla cars carry a system of eight cameras with different viewing angles and ranges, 12 ultrasonic sensors arranged in a circle, and a long-range front radar. The ultrasonic sensors are responsible for recognizing cars in adjacent lanes and obstacles when driving at low speeds. The cameras are responsible for finding pedestrians, cars, lane markings, and signs; the radar assists them. GPS is used for routing, while the sensors ensure that the car keeps strictly to its lane and avoids accidents. On the one hand, this allows Tesla’s autopilot to be used in any city. On the other hand, the autopilot still requires the driver’s attention to operate. Tesla deliberately does not use lidar: Elon Musk openly opposes lidars, citing their price and poor performance in bad weather.
It’s hard not to agree with him: an extra 7–10 thousand dollars on the price and a “hump” on the roof would not add to a Tesla’s attractiveness. Yet however good the bundle of cameras, radar, and ultrasonic sensors looks, it has its failures. In 2018, a Tesla Model S in autopilot mode crashed into a highway divider, killing the driver. As the investigation showed, the autopilot misread the faded lane markings, while the cameras and radar, in turn, saw no danger in the approaching steel barrier.

A good contrasting example is Waymo. The Waymo system uses a lidar, five radars, eight cameras, and GPS, with the Chrysler Pacifica Hybrid and Jaguar I-PACE chosen as its production platforms. While driving, the Waymo system relies on Google Street View data, checking it against its sensors. This achieves complete autonomy: unlike Tesla’s cars, Waymo’s really do not require driver intervention, they simply carry passengers. And unlike Tesla, Waymo sells not cars but a transportation service, that is, a robotaxi. Waymo’s panoramic videos help you understand how an autonomous car recognizes the space around it. Waymo’s main drawback is the extremely limited list of cities where its vehicles operate: for the autopilot to function correctly, the urban environment must be captured in 3D, and that is a long and complicated procedure.

Yandex presented its self-driving car project just a year and a half ago. A Toyota Prius was fitted with the typical set of autonomous-car components: lidar, cameras, radar, GPS, and an IMU. From Yandex, the car received a software platform that performed well both on the cramped streets of Moscow’s Khamovniki district and on a long journey from Moscow to Kazan. The Yandex car that reached Tatarstan stayed there, becoming the first driverless taxi in Russia; it now operates in Innopolis. Since October 2018, a similar taxi has been running in Skolkovo.
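The “limited list of cities” constraint boils down to a geofence check: before engaging, a level-four autopilot verifies that a pre-built 3D map covers its location. A minimal sketch (the city set and function are purely illustrative, not any company’s real coverage or API):

```python
# Areas with a pre-built 3D map (illustrative data only)
HD_MAPPED_AREAS = {"Phoenix", "Innopolis", "Skolkovo"}

def can_drive_autonomously(city: str) -> bool:
    """A level-4 vehicle operates only where an HD map is available."""
    return city in HD_MAPPED_AREAS

print(can_drive_autonomously("Innopolis"))  # True
print(can_drive_autonomously("Moscow"))     # False
```

This is also why expanding to a new city is slow: the set grows only after the long 3D-mapping procedure described above.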
In the future, the company intends to bring driverless taxis onto city streets on a commercial basis. In 2016, the NAMI state institute showed the SHUTTLE, a concept for a driverless minibus. Two years later, the KAMAZ-1221 ShATL was announced as a future production project, with launch scheduled for 2022. For now, the mini electric bus with its lidars, cameras, and ultrasonic sensors creeps along at 10 km/h, but as the software platform improves, the developers promise to raise the speed to 110 km/h.

To sum up: removing the human factor will make it possible to raise maximum speeds, narrow the lanes, and shorten the distance between cars in traffic, thanks to data exchange between vehicles with mutual warnings about upcoming maneuvers. As a result, road throughput will rise significantly, average speeds will increase, and the number of traffic jams will fall. But we are still far from that future. Production cars are only just preparing to step up to the second or third level of autonomy, and it is too early to dream of level-five vehicles. But their time will surely come. Perhaps much sooner than we expect.
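The throughput claim can be made concrete with a back-of-the-envelope calculation. A simple model: each car occupies its own length plus the gap dictated by its following headway (the headway values below are my own illustrative assumptions):

```python
def lane_capacity(speed_kmh: float, headway_s: float,
                  car_length_m: float = 5.0) -> int:
    """Vehicles per hour per lane at a given speed and time headway."""
    speed_ms = speed_kmh / 3.6
    spacing_m = speed_ms * headway_s + car_length_m  # gap + car length
    return int(speed_ms * 3600 / spacing_m)

# Human drivers keep roughly a 2 s headway; networked autonomous cars
# exchanging maneuver data might safely manage ~0.5 s.
print(lane_capacity(100, 2.0))  # 1651 vehicles/hour
print(lane_capacity(100, 0.5))  # 5294 vehicles/hour
```

Even this crude model shows a roughly threefold gain in lane throughput from shorter inter-car distances alone, before counting narrower lanes or higher speeds.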