It depends on the manufacturer. The most common cutoffs are 80% and 60% of rated capacity, but I’ve seen 70% too.
I’ve never seen 90%, since a 10% loss in capacity is small enough that many users in many applications wouldn’t even notice it. A manufacturer wouldn’t set an end-of-life spec that high, as they could never sell the cells.
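The cutoffs above amount to simple state-of-health arithmetic: measured capacity divided by rated capacity, compared against the manufacturer’s threshold. A minimal sketch (the function names and the example capacities are illustrative, not from any manufacturer’s spec):

```python
def state_of_health(measured_capacity_mah, rated_capacity_mah):
    """State of health as a fraction of the rated (as-new) capacity."""
    return measured_capacity_mah / rated_capacity_mah

def reached_end_of_life(measured_capacity_mah, rated_capacity_mah, cutoff=0.80):
    """True once capacity has faded to or below the manufacturer's cutoff.

    Common cutoffs are 0.80 and 0.60 (sometimes 0.70), as noted above.
    """
    return state_of_health(measured_capacity_mah, rated_capacity_mah) <= cutoff

# A cell rated at 3000 mAh that now measures 2300 mAh (~77% SoH):
print(reached_end_of_life(2300, 3000, cutoff=0.80))  # True at an 80% cutoff
print(reached_end_of_life(2300, 3000, cutoff=0.60))  # False at a 60% cutoff
```

So the same cell can be “end of life” under one manufacturer’s spec and still well within life under another’s, which is why the cutoff reads more like a product decision than a safety line.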
It would also leave the cells almost new at the end of their rated life, create huge numbers of cells needing recycling, and force the manufacturer to set very short warranties for their products.
No manufacturer would set 60% as the cutoff if it was dangerous or anywhere near it. That would be corporate suicide.
But any dendrite formation or physical damage that accumulates over time can increase the risk of using the cell, and both of those occur as the cell is used. Capacity loss can be seen as a rough proxy for that aging, and the manufacturer may set the cutoff partly as a buffer against dendrite formation, but the cutoff wouldn’t be anywhere near a danger point. It’s also set as a good indicator of the point where most users get too frustrated with the reduced runtime to keep using the cell.
But both of those risk factors can also increase significantly in almost-new cells, depending on how they are used.
There is a huge “second life” market that takes cells that have reached the end of their rated life and uses them in lower-power applications like energy storage (“power walls”). This market wouldn’t exist if capacity loss alone were a safety issue. I’m sure they test the cells for obvious warning signs, but cells that pass can deliver years of additional service.