
Waymo's Dark Mode: When the Grid Died

The recent San Francisco blackout didn't just turn off the lights; it paralyzed the city's most advanced robots. Why Waymo's fleet froze in the dark, and what that reveals about the fragile link between AI and the aging electrical grid.


Waymo Jaguar I-Pace stuck at a darkened intersection in San Francisco during a blackout

San Francisco is supposed to be the city of the future. But on a Saturday afternoon in December 2025, it looked more like a scene from a pre-industrial past - with one glaring anachronism.

As 130,000 residents were plunged into darkness following a massive substation fire at 8th and Mission, the city’s traffic infrastructure simply vanished. No green lights, no red lights, no pedestrian signals. Just black metal boxes hanging over intersections. For human drivers, this triggered a messy but understood social protocol: the “treat it as a four-way stop” rule. It was chaotic, aggressive, and slow, but it flowed.

The grid failure, however, was a cognitive catastrophe for Waymo’s autonomous fleet.

Across the Richmond, Presidio, and Downtown districts, the “World’s Most Experienced Driver” didn’t know what to do. Bereft of the deterministic certainty of a traffic signal, and unable to make eye contact with the confused driver in the Honda Civic creeping forward, the robots did the only thing their safety validation models allowed: they froze.

The result was, in the words of onlookers, “absolute mayhem.” Waymo vehicles sat idling at darkened intersections, their LiDAR arrays spinning furiously, waiting for a signal that would never come. This wasn’t just a technical glitch; it was a fundamental revelation about the fragility of autonomous systems in a crumbling analog world. The robots are ready for the road, but the road isn’t ready for them.

The Physics of the “Dead” Intersection

To understand why a power outage is more paralyzing for a robot than a blizzard, one must examine how an Autonomous Vehicle (AV) perceives “authority.”

The Deterministic Fallacy

An AV operates on a hierarchy of constraints. At the top of this hierarchy is the Traffic Control Device (TCD).

P(Action) = P(Path | Signal) × P(Clearance)

In a normal scenario, the state of the traffic light (Signal) is a binary or ternary constraint. Green means “Go,” Red means “Stop.” The probability distribution for the vehicle’s next action collapses to a near-certainty based on this signal. The computer vision system identifies the bounding box of the traffic light, classifies the pixel color (Red/Yellow/Green), and maps it to the HD Map’s semantic layer to confirm this light controls the vehicle’s lane.
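The collapse described above can be sketched as a simple mapping from classified signal state to an action distribution. This is a toy illustration, not Waymo's planner; the names `SignalState` and `action_distribution` and all probability values are assumptions for the example.

```python
from enum import Enum

class SignalState(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"
    UNKNOWN = "unknown"   # dark housing, glare, occlusion

def action_distribution(signal: SignalState) -> dict:
    """Map a classified signal state to a toy P(action) distribution."""
    if signal is SignalState.GREEN:
        return {"proceed": 0.99, "stop": 0.01}
    if signal is SignalState.RED:
        return {"proceed": 0.0, "stop": 1.0}
    if signal is SignalState.YELLOW:
        return {"proceed": 0.3, "stop": 0.7}   # depends on distance/speed
    # NULL / UNKNOWN: no deterministic rule, so the distribution stays flat
    return {"proceed": 0.5, "stop": 0.5}
```

Note how the `UNKNOWN` branch never collapses to a confident answer: that flat distribution is exactly the state the blackout forced the fleet into.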

When the power goes out, the TCD state becomes NULL.

For a human, a dark light is a symbol. It maps to the concept “4-Way Stop.” Drivers use Game Theory: they inch forward, wave, and look at the other driver’s face to assess aggression or yielding intent. Humans engage in a complex, micro-second negotiation based on social cues.

For an AV, a dark light is an edge case of extreme uncertainty.

  1. Detection Failure: The camera sees the housing but no illuminated pixels. Is it off? Is the sun glaring off it?
  2. Rule Conflict: The HD Map says “This is a signalized intersection.” The sensors say “No signal exists.”
  3. The Minimum Risk Condition (MRC): When the uncertainty threshold (σ) exceeds safety parameters, the vehicle defaults to its Minimum Risk Condition. Usually, this means “Stop and wait for clarity.”

In the December 20 blackout, “clarity” never arrived. The robots waited for a signal change that physics could not provide.

The Sensor Gap: Why LiDAR Can’t See “Go Ahead”

Observers might ask, “Why not just code the AV to treat dark lights as stop signs?”

The challenge is intent prediction. At a 4-way stop, the right of way is determined by arrival time and geometry. But in a power outage chaos scenario, humans cheat. They roll through stops, go out of turn, and herd together.

Waymo’s perception stack uses LiDAR (Laser Imaging Detection and Ranging) and Radar to track objects.

  • LiDAR gives precise distance (d) and velocity (v).
  • Cameras give object classification.

The sensor suite cannot reliably interpret a police officer’s hand signals or a driver’s nod, and even next-generation AVs lack the social reasoning to negotiate a lawless intersection.

When the grid is removed, the rules are removed. And robots cannot improvise.

Contextual History: The Pattern of Paralysis

The blackout fits a concerning trend. The “Waymo Freeze” is not an isolated incident; it is part of a pattern where AVs struggle with Contextual Ambiguity.

The Cone Incident (2023)

Recall the “Cone Week” protests, where activists placed traffic cones on the hoods of Waymo and Cruise vehicles. The vehicles became immobilized. Why? Because the perception stack classified the cone as an “Occlusion” or “Obstacle” attached to the vehicle or in its critical path. The logic loop entered a deadlock:

  1. Obstacle Detected.
  2. Cannot move until obstacle clears.
  3. Obstacle moves with the car.
  4. Result: Stop.
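The four-step loop above is a textbook deadlock: the clearance condition can never become true because the "obstacle" moves with the vehicle. A toy reproduction (all names invented for illustration):

```python
def try_to_move(max_ticks: int = 5) -> str:
    """Simulate the cone deadlock: an attached obstacle never clears on its own."""
    obstacle_attached = True   # cone sits on the hood
    for _ in range(max_ticks):
        if not obstacle_attached:
            return "MOVING"
        # Vehicle stays put waiting for the obstacle to clear; since the
        # obstacle is attached to the vehicle, its state never changes.
    return "DEADLOCK"
```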

The Fog of War (2024)

During heavy fog in San Francisco last year, Waymo vehicles pulled over en masse. This was a safety feature - LiDAR performance degrades in scattering media - but it resulted in blocked driveways and streets.

The Blackout (2025)

The December 20 blackout is the most severe because it wasn’t a sensor interaction issue; it was an Infrastructure Dependency failure. The vehicles were perfectly functional. Their batteries were charged. Their sensors were clean. But the world broke.

This highlights a critical vulnerability in the widespread deployment of AVs: Interdependence. The industry is layering a 21st-century digital AI transport layer on top of a 20th-century electrical grid that is barely holding on.

The Infrastructure Dependency: The Grid is the Graph

Cybersecurity experts often talk about the “kill chain” in security. In autonomous mobility, there is a “dependency chain.”

  1. Level 1: The Vehicle (Hardware, Tires, Battery).
  2. Level 2: The Connectivity (LTE/5G to mapping servers/tele-ops).
  3. Level 3: The Infrastructure (Traffic Lights, Street Lamps, Road Markings).

The 2025 blackout broke Level 2 and Level 3 simultaneously.

The Tele-Ops Bottleneck

Usually, when a Waymo gets confused, it “phones home.” A Remote Assistance (RA) agent looks at the camera feed and gives a high-level command like “Nudge forward” or “Ignore this signal.”

But the PG&E substation fire likely degraded local cell towers. Even if towers had backup batteries, the localized congestion (thousands of humans calling relatives) would crush the bandwidth.

Latency ∝ Users / Bandwidth

If the Waymo cannot reach the RA server due to network congestion, and it cannot resolve the scene locally due to safety constraints, it becomes a 5,000-pound brick. It is a brick that follows the rules, but a brick nonetheless.
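The tele-ops bottleneck can be sketched as a timeout race: latency grows with congestion, and once the remote-assistance request exceeds its budget, the vehicle falls back to the MRC. The latency model, timeout, and function names are assumptions, not Waymo's actual protocol.

```python
def request_remote_assist(congestion: float, timeout_s: float = 10.0):
    """Simulate RA latency growing with congestion (Latency ∝ Users/Bandwidth)."""
    latency_s = 2.0 * (1.0 + 10.0 * congestion)   # toy model, seconds
    if latency_s > timeout_s:
        return None            # request effectively lost in the congestion
    return "NUDGE_FORWARD"     # high-level command from the RA agent

def resolve_scene(congestion: float) -> str:
    """If tele-ops is unreachable, the only legal local move is to stop."""
    command = request_remote_assist(congestion)
    return command if command is not None else "MRC_STOP"
```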

Forward-Looking Analysis: Can This Be Fixed?

The “Blackout Problem” must be solved before mass scale is reached. If 10% of SF traffic had been autonomous during this outage, the gridlock would have blocked fire trucks and ambulances, turning a nuisance into a tragedy.

Solution A: Mesh Network V2V

Vehicle-to-Vehicle (V2V) communication could allow the fleet to “vote” on an intersection state.

  • Concept: If Car A sees a dark light and stops, and Car B (facing cross traffic) sees a dark light and stops, they can shake hands digitally.
  • Protocol: A digital handshake confirming position and state, allowing coordinated movement.
  • Reality: This requires a universal standard (V2X) that all OEMs (Tesla, Rivian, Waymo) agree on. The industry is years away from this.
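A minimal sketch of the "vote" idea, assuming an invented message format and quorum rule (no V2X standard specifies this):

```python
def intersection_consensus(reports: list[dict]) -> str:
    """Each report: {'car': id, 'approach': 'NS' or 'EW', 'signal': state}.

    Quorum rule (assumed): only proceed to stop-sign behavior once vehicles
    on at least two different approaches agree the signal is dead.
    """
    dark = [r for r in reports if r["signal"] == "dark"]
    approaches = {r["approach"] for r in dark}
    if len(approaches) >= 2:
        return "TREAT_AS_ALL_WAY_STOP"
    return "WAIT"
```

The cross-traffic quorum matters: a single car reporting "dark" might just have a glare-blinded camera, but independent agreement from a perpendicular approach makes a dead signal far more likely.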

Solution B: “Lawless Mode” (The New York Cabbie Update)

AV developers may need to train a specific “Uncontrolled Intersection Policy” that is more aggressive.

  • Logic: If TCD = NULL for > 30 seconds -> Treat as Stop Sign -> Creep to center -> If no high-velocity incoming vectors -> Force merge.
  • Risk: This increases P(Collision) non-linearly. But the alternative (total paralysis) has its own risk profile (blocking emergency services).
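The policy in the Logic bullet can be sketched directly. The 30-second wait and the 12 m/s cross-traffic threshold are illustrative assumptions:

```python
def lawless_mode(tcd_state, dark_for_s: float,
                 incoming_speeds_mps: list[float]) -> str:
    """Toy 'Uncontrolled Intersection Policy' for a dead traffic signal."""
    if tcd_state is not None:
        return "OBEY_SIGNAL"           # signal is live; normal rules apply
    if dark_for_s < 30.0:
        return "WAIT"                  # give tele-ops / signal a chance first
    if any(v > 12.0 for v in incoming_speeds_mps):
        return "CREEP_TO_CENTER"       # assert presence, yield to fast cross traffic
    return "FORCE_MERGE"               # no high-velocity incoming vectors
```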

Solution C: Infrastructure Hardening

The boring but real answer: Battery Backups for Traffic Lights. In major corridors, LEDs use very little power. A small solar + battery retrofit could keep the signal logic running for 24 hours during a grid failure. This is cheaper than retraining an AI to understand human negotiation, yet city budgets rarely prioritize it.
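The back-of-envelope sizing supports the claim. Assuming roughly 50 W for a full set of LED signal heads plus controller (an assumption; real loads vary by intersection):

```python
LOAD_W = 50                  # LED heads + signal controller, assumed
HOURS = 24                   # target ride-through during a grid failure
DEPTH_OF_DISCHARGE = 0.8     # usable fraction of the battery pack

energy_wh = LOAD_W * HOURS                    # energy consumed over the outage
battery_wh = energy_wh / DEPTH_OF_DISCHARGE   # pack size needed
print(f"{battery_wh:.0f} Wh battery for {HOURS} h of backup")
```

A pack on the order of 1.5 kWh per intersection is small, roughly an e-bike battery's worth per corridor signal, which is why the retrofit is cheap relative to the software alternative.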

The Verdict

The Waymo fiasco during the San Francisco blackout wasn’t a failure of Artificial Intelligence; it was a collision between AI and Entropy.

Engineers have built machines that drive with the mathematical precision of a chess master. But the real world, especially during a disaster, isn’t chess. It’s a mosh pit. Until these robots learn to push, shove, and negotiate the messy, unwritten rules of broken infrastructure, they will remain fair-weather drivers - brilliant when the lights are on, but paralyzed when the city goes dark.

Observers spotting a robotaxi at a dark crossing should not expect a signal. The machine is waiting for a green light that isn’t coming.
