There's no free lunch™, so from what I can tell there's some pathway loss here. E.g., Jacobi trajectories definitionally exclude higher-temperature sampling paths. That might actually be a positive for data retrieval (but a negative if we want to maximize creativity?).
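To make the "pathway loss" point concrete, here's a minimal toy sketch (not any real decoding library; `next_token` is a made-up deterministic stand-in for a temperature-0 argmax over a model's distribution). Jacobi decoding updates all positions in parallel until the sequence stops changing, and that fixed point is exactly the greedy sequential output, so any trajectory that temperature sampling could have produced is excluded:

```python
# Toy illustration: Jacobi (fixed-point) decoding reaches the same sequence
# as sequential greedy decoding. All names here are illustrative.

def next_token(prefix):
    # Stand-in for argmax over a model's next-token distribution (temp 0).
    return (sum(prefix) + 1) % 7

def greedy_decode(prompt, n):
    # Standard one-token-at-a-time greedy decoding.
    seq = list(prompt)
    for _ in range(n):
        seq.append(next_token(seq))
    return seq

def jacobi_decode(prompt, n):
    # Start from an arbitrary guess for all n positions, then update every
    # position in parallel from the current (possibly wrong) prefix.
    # Each sweep fixes at least one more leading token, so this terminates.
    seq = list(prompt) + [0] * n
    while True:
        new = list(prompt) + [
            next_token(seq[: len(prompt) + i]) for i in range(n)
        ]
        if new == seq:  # fixed point reached
            return seq
        seq = new

prompt = [3, 1]
assert jacobi_decode(prompt, 5) == greedy_decode(prompt, 5)
```

The fixed point is unique here precisely because each step is a deterministic argmax; with temperature > 0 there is no single fixed point to converge to, which is the trade-off being described.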
There are better and worse algorithms. I'm not sure "there is no free lunch" always applies in a particularly meaningful way. Some things aren't on the Pareto frontier.
I would go even further and say there isn't any indication that we are even close to what is possible. My subjective feeling is that, at the current rate of progress, it is entirely possible we will have GPT-4-level performance locally on smartphone hardware within 3-10 years (unless companies decide again that they don't want to give this kind of power away).
Probably. Advancements in ML algorithms, like this one, have been outpacing advancements in hardware for a while now, so both are converging on making ML faster and ubiquitous.