I.e., 100% would be a good number, as it would mean ZERO real people are actually listening to this garbage.
Unfortunately, given the lack of foresight by the world's regulatory agencies that allowed all that copyrighted content to be used for training, AND allowed these products to generate works that people can fraudulently pass off as their own (and hence monetize), fighting the fake music and fake streams you're talking about may end up coming down to the streaming services (and, as you say, consumers) taking action and being legally liable. The good news is that Spotify does not want to pay royalties for fake streams generated by not-real users, so they at least have a vested interest, and I suspect they will take considerable steps to stop that from happening.
In fact, "but not enough to trigger detection systems tuned for high-volume replay" indicates they already do, because this "bot streaming" doesn't happen JUST with AI-generated music; it happens with real artists' music as well. The difference in that case, though, is that there are usually "known entities" responsible for the content, i.e. the license holders, so there's someone to punish if it happens: they sort of know who they're sending the checks to. Unlike some Russian content farm.
Eventually these AI companies AND streaming services (and YT and similar) need to be held accountable for allowing anyone to monetize (largely) AI-generated works, if that AI was trained on copyrighted material.
The lawsuit from the book publishers you mentioned recently is going to be a big bellwether on the front against the AI companies themselves. But if that case is lost? Then we'll have to move on to trying to hold legally responsible the companies that host the material AND pay people when it gets viewed.