I recently wrote a script that encodes an image to fall within a size range[0]. After toying with it, I noticed that smaller AVIF files are completely fine for web use, but identically sized JPEG XL files are not. Given ubiquitous browser support for AVIF[1], unless JPEG XL gets much better at smaller sizes, I reluctantly agree that Chrome's call to drop JPEG XL is the right one.
In what environment do you work where you need such low-quality images, though? In my web environment I only want the highest quality I can get at a reasonable size, and I've never been interested in slightly-less-awful-looking tiny images. In another comment I wrote about using jpegli at its default distance of 1 for everything and being happy with that size, so maybe I work in a completely different environment to you.
A normal one like everyone else, I guess. No need to waste bandwidth and storage if you don't have to. If an image looks good to me, I'll try going lower until it doesn't look good, then go back one step. I've been surprised many times by just how low it can go and still look good. That script I wrote defaults to what I've settled on. (AVIF between 0.5 and 1 bpp at 1 megapixel, increasing or decreasing with the square root of the total pixel count, plus a JPEG fallback.)
If I change my mind, I keep high-resolution originals of everything so I can do it again.
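In case it helps, here's a minimal sketch of how I read that sizing rule: a 0.5–1 bpp budget at the 1-megapixel reference, with the byte budget scaled by the square root of the pixel count (so effective bpp shrinks as resolution grows). The function name and parameters are my own; this is a reconstruction, not the actual script.

```python
import math

def target_byte_range(width, height,
                      low_bpp=0.5, high_bpp=1.0, ref_pixels=1_000_000):
    """Hypothetical reconstruction of the sizing rule above:
    low_bpp..high_bpp at the 1 MP reference, with the byte budget
    scaled by sqrt(pixels / ref_pixels) rather than linearly,
    so larger images get fewer bits per pixel."""
    pixels = width * height
    scale = math.sqrt(pixels / ref_pixels)
    # bits = bpp * ref_pixels * scale; divide by 8 for bytes
    low = int(low_bpp * ref_pixels * scale / 8)
    high = int(high_bpp * ref_pixels * scale / 8)
    return low, high

# At exactly 1 MP the range is just bpp * pixels / 8:
# target_byte_range(1000, 1000) -> (62500, 125000)
```

A 4 MP image under this rule gets twice the byte budget of a 1 MP one, not four times, which is why the per-pixel quality setting effectively drops as images get bigger.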
Apple supporting it surely has to be a signal that wider adoption is coming!?