Obstacle sensors, radars and, eventually, self-driving cars: The road to Advanced Driver Assistance Systems
Context- Autonomous driving tools are being progressively democratised, with car manufacturers now beginning to offer advanced driver assistance systems (ADAS) as standard features on their mid-segment vehicles.
The new Verna, Hyundai’s upgrade of its flagship sedan, now comes equipped with front and rear radars, sensors and a front camera to enable what is called ‘Level 2 ADAS’ functionality, meaning that the car will not just detect obstacles on the road or issue a warning when it strays from its designated lane, but also initiate corrective action.
The ADAS suite will also include features such as automatic emergency braking, forward collision warning, blind spot collision warning, blind spot collision-avoidance assist, lane-keeping assist, driver attention warning and adaptive cruise control, which can enhance the safety and convenience of driving.
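To give a rough sense of how one of these features works under the hood, here is a minimal, purely illustrative Python sketch of a time-to-collision (TTC) check of the kind a forward collision warning or automatic emergency braking system might run. The function names and thresholds are hypothetical assumptions for illustration, not details of Hyundai’s system or of the article.

```python
# Illustrative sketch only: a simplified time-to-collision (TTC) heuristic of
# the kind a forward collision warning / automatic emergency braking feature
# might use. Names and thresholds are hypothetical, not from any real ADAS.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:          # not closing in on the vehicle ahead
        return float("inf")
    return gap_m / closing_speed_mps

def adas_response(gap_m: float, ego_speed_mps: float, lead_speed_mps: float) -> str:
    """Escalate from no action, to a warning, to automatic braking."""
    ttc = time_to_collision(gap_m, ego_speed_mps - lead_speed_mps)
    if ttc < 1.5:                       # imminent collision: brake automatically
        return "AUTOMATIC_EMERGENCY_BRAKING"
    if ttc < 3.0:                       # getting close: warn the driver
        return "FORWARD_COLLISION_WARNING"
    return "NO_ACTION"

# Example: 24 m gap, own car at 20 m/s, car ahead at 10 m/s -> TTC = 2.4 s
print(adas_response(24.0, 20.0, 10.0))  # FORWARD_COLLISION_WARNING
```

A production ADAS stack is, of course, far more sophisticated, but the basic idea is the same: estimate how soon a collision would occur and escalate from a warning to automatic braking.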
What’s behind the increasing offerings of ADAS tech?
- While this movement of ADAS down the price bracket is being driven by several factors, including rising demand for safer vehicles among Indian consumers and the government’s push for road safety, the availability of more affordable ADAS technology is accelerating it further.
- With the adoption of ADAS technology, car manufacturers say they are helping improve the overall safety of Indian roads and reduce the number of accidents and fatalities. But Level 2 ADAS is where the self-driving ambitions of most carmakers seem to have maxed out, at least for now, despite lofty promises over the years.
So what exactly is autonomous driving?
- There are six levels (0 to 5) in the evolution of autonomous driving, each describing the extent to which the car takes over responsibilities from the driver and how the two interface; the higher the level, the greater the extent of automation (a compact restatement follows this list).
- Level 0, “No Automation”, is where the driver controls the car without any support from a driver assistance system – the case for most cars on the road currently.
- The driver assistance systems of Level 1, like adaptive lane assist or parking assist, are already offered in a number of top-end cars, while Level 2 – with features such as combined steering and lane-keeping assist and remote-controlled parking – was until now available only on some models from premium carmakers; examples include Tesla’s ‘Autopilot’ and BMW’s ‘Personal CoPilot’. This is what Hyundai is now offering with its new Verna and Honda with the new City.
- Level 3 is where it starts to get tough for carmakers, who have to offer an even greater array of “automated driving” tools where the driver can partly take their eyes off the road.
- Level 4 stands for “Fully Automated Driving”, where the driver can take their hands off the steering wheel for most of the drive.
- Level 5 is “Full Automation”, where the car can drive without any human input whatsoever.
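For quick revision, the six levels above can be condensed into a simple lookup table. The short Python snippet below is only a compact restatement of the article’s descriptions; the one-line labels are paraphrases, not the formal SAE J3016 wording.

```python
# Compact restatement of the automation ladder described above (levels 0-5).
# Labels paraphrase the article; formal SAE J3016 wording differs slightly.
AUTOMATION_LEVELS = {
    0: "No Automation - driver does everything, no assistance system",
    1: "Driver Assistance - e.g. adaptive lane assist or parking assist",
    2: "Partial Automation - steering/lane-keeping assist; driver supervises (new Verna, City)",
    3: "Conditional Automation - driver can partly take their eyes off the road",
    4: "'Fully Automated' Driving - hands off the wheel for most of the drive",
    5: "Full Automation - no human input needed at all",
}

for level, description in AUTOMATION_LEVELS.items():
    print(f"Level {level}: {description}")
```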
Why is the self-driving objective still elusive?
- When Musk first made his claim – that full self-driving was imminent – over a decade ago, Tesla had been working on its driver-assistance technology, branded Autopilot, for about 12 months. Success has been patchy ever since.
- Problems have ranged from jumping red lights and failing to recognise pedestrians to situational challenges, such as identifying a cyclist who briefly disappears behind a parked vehicle. Ultimately, Tesla began beta testing its full self-driving system only in 2020. And there is a catch with Tesla’s beta testing: it is being done by regular people driving their Tesla cars on public roads.
- Google’s Waymo and General Motors-owned Cruise are among the companies that predicted they would have full self-driving cars by 2020, with only limited success – and that too confined to ring-fenced, geofenced areas. None of them is anywhere close to reaching Level 5.
- Clearly, developing fully self-driving cars is far harder than carmakers initially thought. There is also a continuing debate over the technology of choice: cameras alone versus a combination of technologies that includes lidar, radar, other sensors and cameras.
- While Tesla relies primarily on cameras, most other carmakers depend on multiple sensors feeding information into onboard computers, which consume huge amounts of processing power to map what is going on and forecast what might happen next.
- While humans find this kind of prediction easy because of situational awareness, it presents a complex problem for computers – for instance, when someone at the side of the road steps off the pavement and disappears behind a parked van for a moment, or when a red sign painted on a wall has to be distinguished from an actual stop sign.
- For a computer, this predictive capacity is still a work in progress (a simplified illustration of the occlusion problem follows this list). For self-driving to work in a country like India, where traffic is extremely haphazard, the task will be even tougher.
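To make the occlusion example above concrete, here is a minimal, hypothetical Python sketch of “track persistence” – the idea that a perception system should keep predicting a pedestrian forward for a short while after they vanish behind a parked van, instead of forgetting them. The structure, names and thresholds are illustrative assumptions, not how any particular carmaker’s software actually works.

```python
# Illustrative only: a toy "track persistence" rule of the kind perception
# stacks use so that an object briefly hidden behind another vehicle is
# predicted forward instead of being forgotten. Real systems fuse camera,
# radar and often lidar data with far more sophisticated filtering.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Track:
    x: float            # position along the road edge (m)
    velocity: float     # estimated walking speed (m/s)
    missed_frames: int  # consecutive frames with no sensor detection

MAX_MISSED_FRAMES = 30  # roughly 1 second at 30 Hz before the track is dropped

def update_track(track: Track, detection_x: Optional[float], dt: float) -> Optional[Track]:
    """Update with a fresh detection, or coast the track through an occlusion."""
    if detection_x is not None:
        # Sensor saw the pedestrian: correct the position, reset the counter.
        return Track(x=detection_x, velocity=track.velocity, missed_frames=0)
    # No detection (e.g. hidden behind a parked van): predict the position forward.
    if track.missed_frames + 1 > MAX_MISSED_FRAMES:
        return None  # give up only after a sustained absence
    return Track(x=track.x + track.velocity * dt,
                 velocity=track.velocity,
                 missed_frames=track.missed_frames + 1)
```

Real perception stacks do this with probabilistic filters over fused camera, radar and lidar data, which is part of why the processing demands described above are so heavy.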
What’s next in terms of the milestones for self-driving?
- In 2022, eight years after his first prediction, Musk acknowledged the real scale of the problem: solving full self-driving requires mastery over real-world AI, which is still a work in progress.
- The rapid advances being made in large language models are a somewhat positive sign going forward.
Conclusion- The transition towards full automation remains a work in progress. However, as advances in AI continue, fully automated self-driving could be achieved in the near future.
Syllabus- GS-3; Science and Tech
Source- Indian Express