Autonomous vehicles, also known as self-driving cars, trucks, etc., are vehicles that can make steering, acceleration, or braking decisions with some degree of autonomy. They are notable as a present-day case where alignment and AI safety matter - for example, an AI should not deliberately crash a car in order to gather data about car crashes.
In addition, autonomous vehicles raise important ethical questions - for example, how much should a self-driving car prioritize the safety of its passengers over that of people outside the car? And if autonomous vehicles become significantly less accident-prone than (and approximately as cheap as) human-driven ones, should governments require their use in the interest of public safety?
Note: posts specifically about autonomous vehicles should be given higher relevance than ones that merely use them as examples.