The photo shows the car getting a 16kW charge rate while driving at 5mph. So you could get 16kWh of energy if you drove on this kind of road for an entire hour, at 5mph. Pretty worthless.
I’d love to know what sort of charge rate you’d get while driving at 35 mph, the standard speed on surface streets. Based on what little I know about inductive charging, I’d guess that the charge rate will be dramatically slower.
But if you get even 5kW while driving at 35 mph, I think that’d actually be pretty useful. Assuming that you get a significant efficiency increase at such a speed (5 mi/kWh seems reasonable), then driving at 35 mph would use up 7 kWh of energy per hour of driving. If you can restore 5 of those kWh from the roadway itself, you’re getting back over 70% of your energy usage while you drive.
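For anyone who wants to poke at the math, here’s that back-of-envelope calculation in Python. Every input is a guess, as noted above, not a measured figure:

```python
# Back-of-envelope check of the numbers above (all inputs are guesses).
speed_mph = 35               # assumed speed on surface streets
efficiency_mi_per_kwh = 5    # assumed vehicle efficiency at that speed
charge_rate_kw = 5           # guessed in-motion charge rate

# Energy consumed per hour of driving at that speed
consumption_kwh_per_hour = speed_mph / efficiency_mi_per_kwh

# Fraction of consumption restored by the roadway
recovered_fraction = charge_rate_kw / consumption_kwh_per_hour

print(f"Consumption: {consumption_kwh_per_hour} kWh per hour of driving")
print(f"Recovered from roadway: {recovered_fraction:.0%}")
```

Plug in your own assumptions and the conclusion can swing quite a bit; at 4 mi/kWh the recovered fraction drops to about 57%.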
The question then, however, is “How efficient is the energy transfer?” Because I doubt it’ll be very good. It’ll be rather expensive to pay for that “charge while you drive” power if you’re only getting 30-40% efficiency (at a guess). Some folks might be willing to pay that premium, but by the time this tech is ready to go to market, I imagine most people won’t really feel the need to “charge while you drive”.
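To put a rough dollar figure on that premium, here’s a sketch using my guessed 35% transfer efficiency and an assumed grid price of $0.15/kWh (neither number comes from the article):

```python
# Rough cost of the "charge while you drive" premium (all assumed numbers).
delivered_kwh = 5.0          # kWh restored per hour of driving (guess from above)
transfer_efficiency = 0.35   # my guess at inductive transfer efficiency
price_per_kwh = 0.15         # assumed grid price in $/kWh

# Energy the roadway has to draw from the grid to deliver that much to the car
grid_kwh = delivered_kwh / transfer_efficiency
cost_per_hour = grid_kwh * price_per_kwh

print(f"Grid energy drawn: {grid_kwh:.1f} kWh")
print(f"Cost: ${cost_per_hour:.2f} per hour of driving")
```

At those numbers the roadway burns roughly three times the energy it delivers, which is why the efficiency question matters so much.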
Uh… no? Nobody drives at 5mph for more than a few hundred yards in a parking structure. That’s my point in that sentence, and why I went on to speculate on a possible real-world use case, using many assumptions that may turn out to be wildly unrealistic.