
The Blackout Test: Why Waymo Froze and FSD Didn't

When San Francisco went dark, the Waymo fleet froze while Tesla's FSD navigated the chaos.

[Image: A frozen Waymo vehicle in a dark San Francisco intersection while a Tesla drives past during a blackout.]

At 11:00 PM on December 20, 2025, a massive underground cable fault plunged San Francisco’s Mission District into total darkness. Streetlights died. Traffic signals went black. 130,000 residents lost power. And in the middle of this urban void, something revealing happened.

Dozens of Waymo’s Jaguar vehicles simply stopped.

Frozen in travel lanes, hazard lights pulsing in the gloom, they became 5,000-pound obstacles that human rescue crews had to retrieve manually. Meanwhile, reports and footage rapidly emerged of Tesla vehicles, running FSD v14, navigating the same pitch-black intersections with eerie confidence. They treated the dead traffic lights as four-way stops, negotiated with human drivers, and continued their journeys.

This was Waymo’s “Code Red” moment, and the incident demands a rigorous technical examination of why one architecture failed where the other thrived. The answer lies in the fundamental difference between Map-Dependent Priors and End-to-End Vision, and in the hidden fragility of infrastructure-tethered autonomy.

The Architecture Gap: Maps vs. Vision

To understand the failure, one must understand the decision-making hierarchy of the Waymo stack and how it paralyzed the fleet when the grid failed. Waymo relies heavily on High-Definition (HD) Maps: centimeter-perfect, pre-scanned digital twins of the city. These maps serve as the “ground truth” against which sensor data is compared.

The Prior Probability Trap

In a typical Waymo architecture, the vehicle possesses a strong “prior” belief about the world. It knows exactly where every traffic light, stop sign, and lane marker exists, and it estimates the world state as

$$P(\text{State} \mid \text{Map}, \text{Sensor})$$

When the grid fails, the “Sensor” data (darkness) conflicts violently with the “Map” data. The map asserts the existence of a controlled intersection with active signals. The sensors report a void.
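
As a toy illustration of why this conflict is pathological, consider a minimal Bayesian fusion step in Python. The state names and probabilities are invented for illustration, not drawn from Waymo's actual stack:

```python
# Toy Bayesian fusion of a map prior with camera evidence.
# All states and numbers are invented for illustration.

MAP_PRIOR = {"red": 0.33, "yellow": 0.33, "green": 0.33, "dark": 0.01}

# Likelihood of "no illuminated lamp visible" given each true state.
# During a blackout the camera genuinely sees no lamp at all.
LIKELIHOOD_NO_LAMP = {"red": 0.01, "yellow": 0.01, "green": 0.01, "dark": 0.97}

def posterior(prior, likelihood):
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

post = posterior(MAP_PRIOR, LIKELIHOOD_NO_LAMP)
best = max(post, key=post.get)
# The evidence overwhelms the prior and "dark" wins -- a state the
# rule-based planner has no driving behavior for, so it faults.
print(best, round(post[best], 3))  # dark 0.495
```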

Crucially, LiDAR (Waymo’s primary sensor) works perfectly in the dark. It can see the geometry of the intersection, the other cars, and the pedestrians. However, LiDAR cannot see the color of a traffic light; it relies on cameras for semantic state detection. When a traffic light loses power, it does not turn red; it effectively disappears from the semantic understanding of the machine.

For a system explicitly programmed to obey traffic signals hard-coded into its geofenced reality, the absence of a signal state is a “Critical System Fault.” The safety policy is deterministic: if the state of a known traffic control device cannot be verified, the vehicle must enter a Minimum Risk Condition (MRC). In this case, that meant stopping immediately in the lane. This is the “Brittle Edge Case.” The system is safe, yes; bricking itself is safer than running a blind intersection. But it is not resilient: it requires the infrastructure to match the map.
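
As a minimal sketch of what such a deterministic policy looks like, assuming invented state names and a drastically simplified planner (this is not Waymo's actual code):

```python
# Hypothetical sketch of a deterministic Minimum Risk Condition (MRC)
# policy. The rule is brittle by design: an unverifiable signal state
# for a mapped device is treated as a critical fault.

from enum import Enum, auto

class SignalState(Enum):
    RED = auto()
    YELLOW = auto()
    GREEN = auto()
    UNVERIFIED = auto()  # camera cannot confirm any lamp is lit

def plan_at_intersection(map_says_signal_here: bool, state: SignalState) -> str:
    if map_says_signal_here and state is SignalState.UNVERIFIED:
        # The map's prior cannot be reconciled with the sensor feed:
        # stop in lane and wait for remote assistance.
        return "ENTER_MRC_STOP"
    if state is SignalState.RED:
        return "STOP_AT_LINE"
    return "PROCEED"

# Blackout: every mapped signal in the Mission reports UNVERIFIED.
print(plan_at_intersection(True, SignalState.UNVERIFIED))  # ENTER_MRC_STOP
```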

Tesla’s Vision: The “No-Priors” Approach

Tesla’s approach, specifically with the End-to-End Neural Networks introduced in v12 and refined in v14, operates on a fundamentally different principle. It does not rely on a map to validate the existence of a traffic light. It relies on the raw photon stream hitting the camera sensor.

The network is trained on petabytes of video data. It has “seen” broken traffic lights, dark roads, and power outages in its training set, derived from the millions of human-driven Teslas operating globally. When it approaches a blacked-out intersection, the network does not query a database or check a rulebook. It processes the visual texture of the scene directly:

$$f(\text{Pixels}) \rightarrow \text{Control Policy}$$

Because the lights were out, the “visual context” matched the latent-space representation of a “broken signal” or “uncontrolled intersection” scenario. The policy output mirrored human intuition: slow down, creep forward for visibility, check for cross-traffic movement, and proceed when clear. It solved the problem locally, in real time, without needing the city’s infrastructure to validate the scenario.
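
To make the contrast concrete, here is a toy end-to-end policy in PyTorch. The architecture, sizes, and names are all invented; the point is only that nothing between pixels and controls queries a map or a rule table:

```python
# Toy illustration of the end-to-end shape, not Tesla's network.
# There is no map lookup and no rule table anywhere in the forward
# pass: the only input is a camera tensor, the only output controls.

import torch
import torch.nn as nn

class ToyDrivingPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=4),  # pixels -> features
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(16, 2)  # features -> [steer, accel]

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(frames))

policy = ToyDrivingPolicy()
dark_intersection = torch.rand(1, 3, 96, 96)  # stand-in for a camera frame
steer, accel = policy(dark_intersection)[0]
# A dark traffic signal is just another visual context the weights have
# been trained on; nothing here can raise an "unverified signal" fault.
print(float(steer), float(accel))
```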

The Connectivity Bottleneck

A secondary, often overlooked factor in the Waymo failure is the reliance on “Remote Assistance” (RA). When a Waymo vehicle encounters an unknown scenario, it calls home. A human operator reviews the sensor feed and provides a high-level instruction (e.g., “Proceed through intersection”).

However, a massive power outage often correlates with cellular network congestion or failure. If the local cell towers lose power or are overwhelmed by 130,000 residents checking their phones simultaneously, the RA link is severed. For a cloud-dependent stack, effective capability is multiplicative:

$$\text{Autonomy} = \text{Onboard Compute} \times \text{Connectivity}$$

If connectivity drops to zero, a cloud-dependent system’s IQ drops significantly. Tesla’s inference is strictly local to the FSD computer (HW3/HW4). It requires no cellular connection to make a driving decision. This decoupling from the “Cloud” is a strategic advantage in disaster scenarios where infrastructure (both power and data) is compromised.
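
A hedged sketch of that dependency, with invented function names and probabilities (neither company publishes this interface): the cloud-tethered planner degrades to the same brittle in-lane stop when the RA link dies, while the local planner never places the call at all.

```python
# Hypothetical sketch of a remote-assistance (RA) dependency; the
# function names and probabilities are invented for illustration.

import random
from typing import Optional

def request_remote_assistance() -> Optional[str]:
    """Simulate an RA call over a blackout-congested cellular network."""
    tower_alive = random.random() < 0.2   # most towers down or saturated
    return "PROCEED_THROUGH_INTERSECTION" if tower_alive else None

def cloud_tethered_planner() -> str:
    guidance = request_remote_assistance()
    # With the link severed, the fallback is the same brittle MRC stop.
    return guidance if guidance is not None else "ENTER_MRC_STOP"

def local_planner() -> str:
    # All inference runs on the onboard computer; no network round trip.
    return "TREAT_AS_FOUR_WAY_STOP"

print(cloud_tethered_planner())  # usually ENTER_MRC_STOP during a blackout
print(local_planner())           # always TREAT_AS_FOUR_WAY_STOP
```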

Historical Context: The Shadows of Cruise

This incident echoes the collapse of Cruise in late 2023. Cruise vehicles notoriously stalled en masse when connectivity failed or when complex scenarios overwhelmed the planner. That failure led to the revocation of their operating permit in California.

While Waymo has historically been far more robust than Cruise, the San Francisco blackout reveals that they share the same architectural DNA: the “Robotic” approach, which treats driving as a series of discrete rules (If Red -> Stop). The “imitation learning” approach (Tesla) treats driving as a behavioral flow.

The cost of this fragility is not just reputational; it is operational. Retrieving 50 stranded vehicles requires a massive logistical deployment. It blocks emergency responders, such as fire trucks and ambulances, trying to navigate the blackout. This creates a negative feedback loop with city regulators. If the “solution” to a blackout is to create a traffic jam, the city may reconsider the permit.

The Scalability Equation

Investors and engineers must look at the math of scalability.

  1. Waymo Model (Infrastructure-Tethered): To expand to a new city, one needs HD Maps, reliable 5G/LTE coverage, and a stable power grid. If any of these variables fluctuates, the fleet creates a “denial of service” attack on the city’s roads (see the availability sketch after this list).
  2. Tesla Model (General Purpose): The vehicle navigates based on immediate visual reality. It is agnostic to the city, the map updates, or the status of the grid, provided it has headlights and traction.
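
To make the dependency stacking concrete, here is a back-of-the-envelope availability model in Python. The 99% figures are illustrative assumptions, not measurements from either fleet; the point is only that multiplied dependencies compound:

```python
# Back-of-the-envelope availability model. The 0.99 figures are
# illustrative assumptions, not measurements from either fleet.

# Infrastructure-tethered: every dependency must hold simultaneously,
# so availability is (approximately) the product of the parts.
hd_map_fresh    = 0.99
connectivity    = 0.99
power_grid      = 0.99
tethered_uptime = hd_map_fresh * connectivity * power_grid

# General-purpose: only the vehicle itself has to work.
vehicle_ok      = 0.99
general_uptime  = vehicle_ok

print(f"tethered: {tethered_uptime:.4f}")  # ~0.9703
print(f"general:  {general_uptime:.4f}")   # 0.9900
```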

The San Francisco blackout suggests that “General Purpose” autonomy has a higher ceiling for reliability in chaotic environments compared to “Over-Fitted” autonomy. Waymo has “over-fitted” its solution to the specific, functioning state of San Francisco. When that state changed, the fit failed.

The Road Ahead: Code Red for Strategy?

Is this fatal for Waymo? No. Engineering teams will patch the software. They will update the priors to handle “detected power loss” scenarios. They will improve the logic for “uncontrolled intersections” inferred from dark signals.

But it is a Code Red for the strategy.

It demonstrates that one cannot brute-force autonomy by mapping every inch of the world, because the world is entropic. Bulbs burn out. Grids fail. Construction alters geometry overnight. If a robot needs the world to stay still, it will eventually freeze.

The future of autonomy requires resilience. It means operating when the server is down, when the GPS is jammed, and when the lights go out. Yesterday, in the dark streets of the Mission, one system looked like a fragile science project, and the other looked like a competent driver.

The grid will fail again. The ultimate test for Waymo is not how well it drives on a sunny Tuesday, but whether it can adapt when the unpredictable chaos of the real world turns the lights out. Until then, the advantage goes to the system that learns from chaos rather than fearing it.
