From https://news.ycombinator.com/reply?id=20561792&goto=threads%3Fid%3Dkragen%2320561792
Indeed, I think the finite number of existing dams is the reason people are looking to batteries.
Your point about the efficiency is interesting, although I didn’t understand it at first. I think you’re saying that the capital cost of the lithium-ion battery storage is partly defrayed by the higher efficiency of the storage system? Like, for each kilowatt-hour of Li-ion storage (with, let's say, a round-trip efficiency of 95%, although I think that's too high), you “get back”, say, 0.25 kWh every time you cycle it that you would have lost if you'd stored that energy in pumped storage instead, which implies a pumped-storage round-trip efficiency of about 70%? So, cycling daily at the 6¢/kWh you're imputing, over, say, 15 years, you “get back” US$82 or so, at US$5.48 per year?
https://electrek.co/2018/11/20/tesla-gigafactory-battery-cells-made-cost-advantage-panasonic-lg-report/ claims that the battery cell cost is US$111 per kWh at the moment (though other manufacturers are still stuck around US$140), so that would work out to about a 5% annual IRR if the cells were the only cost; I think that in fact they are on the order of half the cost (though Tesla's blog post here doesn't actually list prices!) and so that would be a 2.5% or so IRR. Not enough to justify the battery investment on its own, but it would definitely be a significant boost to the project's ROI.
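Just to check my own arithmetic, here's that whole chain as a quick Python sketch; the 70% pumped-storage efficiency, daily cycling, and 6¢/kWh price are the assumptions supplied above, and the US$111/kWh cell cost is the Electrek figure.

    # Back-of-the-envelope check of the efficiency-savings argument.
    # Assumptions: one full cycle per day, 70% pumped-storage round-trip
    # efficiency (so 95% - 70% = 0.25 kWh saved per kWh per cycle), and an
    # average wholesale price of US$0.06/kWh.
    def annual_savings_per_kwh(li_ion_eff=0.95, pumped_eff=0.70,
                               price_per_kwh=0.06, cycles_per_year=365):
        return (li_ion_eff - pumped_eff) * price_per_kwh * cycles_per_year

    savings = annual_savings_per_kwh()
    print(savings)             # ~US$5.48 per kWh of storage per year
    print(savings * 15)        # ~US$82 over 15 years
    print(savings / 111)       # ~4.9%/year if US$111/kWh cells were the only cost
    print(savings / (2 * 111)) # ~2.5%/year if cells are about half the total cost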
I have a couple of objections to that line of reasoning, one trivial and one serious.
The trivial objection is that the wholesale cost of electrical power, although it varies a lot, averages about half of the 6¢/kWh you're imputing. https://www.zmescience.com/ecology/climate/cheapest-solar-power/ talks about the just-signed Atacama project at 2.9¢/kWh, which I think includes the cost of some storage. So the numbers are more like 1.25% IRR rather than the 2.5% I suggested above or the 5% you suggest.
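Rerunning the same back-of-the-envelope with the cheaper wholesale price:

    # Same calculation at ~3¢/kWh average wholesale instead of 6¢/kWh.
    savings = (0.95 - 0.70) * 0.03 * 365  # ~US$2.74 per kWh of storage per year
    print(savings / (2 * 111))            # ~1.2%/year if cells are half the total cost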
The more serious objection is that, when you're filling up your utility-scale storage during hours of excess power production, you're not paying 6¢/kWh or 2.9¢/kWh. In fact, due to non-dispatchable “baseload” plants like coal and nuclear, it's common right now for the power plant to pay you to take the power, with the price typically around -4¢/kWh, which is the cost of burning it up in giant resistors. When instantly-dispatchable solar plants come to dominate power production, we can expect to see a price floor of 0¢/kWh. Maybe if a storage-plant operator is paying a solar-plant operator to leave their PV plants running, they'll have to pay 0.01¢/kWh or 0.1¢/kWh. But they won't be paying anywhere close to the average price of electrical energy. They'll be paying the marginal price of generating electrical energy when it is cheapest.
So that means that the amount of money you make from a utility-scale energy storage plant isn't going to be determined by how much energy you need to charge it up. Your round-trip energy efficiency could be 10% or 5% and you still wouldn't pay a significant percentage of your revenues to obtain that energy. What determines your revenues is how much energy you can release once you are selling energy rather than buying it. (And the quality of your trading strategy, of course; if you decide to wait to sell your energy until your locational marginal prices (LMPs) go above US$45/MWh, and they sit at US$42/MWh all night long, you don't make any money.)
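To make that concrete, here's a toy one-day model with made-up prices; the 10% round-trip efficiency is deliberately terrible, and the US$45/MWh evening price and near-zero charging price are illustrative assumptions, not data.

    # Toy one-day dispatch: revenue is set by what you can sell, not what you buy.
    round_trip_eff  = 0.10   # deliberately terrible
    capacity_mwh    = 100    # MWh actually deliverable to the grid
    charge_price    = 0.10   # US$/MWh: near the zero price floor argued for above
    discharge_price = 45.0   # US$/MWh: made-up evening peak price

    energy_bought  = capacity_mwh / round_trip_eff   # 1000 MWh bought
    cost_to_charge = energy_bought * charge_price    # US$100
    revenue        = capacity_mwh * discharge_price  # US$4500
    print(cost_to_charge / revenue)                  # ~2% of revenue, even at 10% efficiency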
Round-trip energy efficiency only matters at all in the sense that it diminishes your effective storage capacity — if theoretically you have “1 MWh” stored, but when you turn it on, only 0.9 MWh flows to the grid, you only get paid for that 0.9 MWh, and that's what you need to pay your capex and opex with. But it only matters very marginally whether you had to buy 1.1 MWh or 2 MWh or 5 MWh or 10 MWh to charge up your storage facility.
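Or, with the numbers from the last paragraph and the same made-up near-zero charging price:

    # Delivering 0.9 MWh at a made-up US$45/MWh, having bought various amounts to charge:
    delivered_mwh, sell_price, buy_price = 0.9, 45.0, 0.10   # prices in US$/MWh
    for bought_mwh in (1.1, 2, 5, 10):
        profit = delivered_mwh * sell_price - bought_mwh * buy_price
        print(bought_mwh, round(profit, 2))   # 40.39, 40.30, 40.00, 39.50: barely moves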