Andrew Miller at Asterisk:
Picture a fall afternoon in Austin, Texas. The city is experiencing a sudden rainstorm, common there in October. Along a wet and darkened city street drive two robotaxis. Each has passengers. Neither has a driver.
Both cars drive themselves, but they perceive the world very differently.
One robotaxi is a Waymo. From its roof, a mounted lidar rig spins continuously, sending out laser pulses that bounce back from the road, the storefronts, and other vehicles, while radar signals emanate from its bumpers and side panels. The Waymo uses these sensors to generate a detailed 3D model of its surroundings, detecting pedestrians and cars that human drivers might struggle to see.
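The core of that 3D model is time-of-flight ranging: the sensor measures how long each laser pulse takes to return, and since light travels at a known speed, half the round-trip distance is the range to whatever the pulse hit. Here is a minimal sketch in Python, with purely illustrative numbers and function names (not Waymo's actual pipeline), of how a single lidar return becomes a point in space:

```python
import math

C = 299_792_458.0  # speed of light, meters per second

def lidar_point(round_trip_s: float, azimuth_rad: float, elevation_rad: float):
    """Convert one lidar return into a 3D point.

    Range is half the round-trip distance of the pulse; the beam's
    azimuth and elevation angles then place the return in Cartesian space.
    """
    r = C * round_trip_s / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A pulse that returns after ~200 nanoseconds puts the target ~30 m away.
print(lidar_point(200e-9, math.radians(45), math.radians(-2)))
```

A spinning rig collects millions of such points per second, and the resulting point cloud is what gives the car direct distance measurements independent of lighting or rain-slicked glare.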
In the next lane is a Tesla Cybercab, operating in unsupervised full self-driving mode. It has no lidar and no radar, just eight cameras housed in pockets of glass. The car processes these video feeds through a neural network, identifying objects, estimating their dimensions, and planning its path accordingly.
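Tesla's actual system is a learned, end-to-end neural network, but the geometry a camera-only car must recover can be illustrated with the classic pinhole-camera relation: an object of known real-world size that spans h pixels on the sensor sits at distance Z = f·H/h. A small sketch, again with illustrative numbers rather than anything from Tesla's stack:

```python
def distance_from_pixel_height(focal_length_px: float,
                               real_height_m: float,
                               pixel_height_px: float) -> float:
    """Pinhole-camera estimate: an object of known height H meters that
    spans h pixels in an image with focal length f pixels is roughly
    Z = f * H / h meters away."""
    return focal_length_px * real_height_m / pixel_height_px

# Illustrative numbers: a ~1.5 m pedestrian spanning 75 pixels in a
# camera with an 1800-pixel focal length reads as ~36 m away.
print(distance_from_pixel_height(1800.0, 1.5, 75.0))
```

Unlike the lidar case, the camera has no direct range measurement; depth must be inferred from cues like apparent size, which is why the neural network's estimates of object dimensions matter so much to the planning that follows.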
More here.
