To be fair that’s literally just a waste of resources. If you want 128 random bits, just get 128 random bits from the underlying source; unless your host language is amazingly deficient, it’s just as easy.
The global uniqueness of a UUID v4 is the global uniqueness of pulling 122 bits from a source of entropy. The structure has nothing to do with it, and pulling 128 bits from the same source is strictly (if not massively) superior on that front.
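For illustration, a minimal Python sketch of the point, using only the standard `uuid` and `secrets` modules:

```python
import secrets
import uuid

# A version-4 UUID fixes 6 bits (4 version bits + 2 variant bits),
# so only 122 of its 128 bits come from the entropy source.
u = uuid.uuid4()
assert u.version == 4

# Pulling 128 bits straight from the CSPRNG is just as easy,
# and every bit is random.
token = secrets.token_bytes(16)  # 16 bytes = 128 random bits
assert len(token) == 16
```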
Early IPCC reports, all the way up to AR5, basically threw their hands up when it came to permafrost emissions. They admitted we didn't have the necessary data yet and, for the most part, didn't account for it at all in their models.
Check out the 1.5°C special report. Go to section 2.2.1.2; the last paragraph says:
> The reduced complexity climate models employed in this assessment do not take into account permafrost or non-CO2 Earth system feedbacks, although the MAGICC model has a permafrost module that can be enabled. Taking the current climate and Earth system feedbacks understanding together, there is a possibility that these models would underestimate the longer-term future temperature response to stringent emission pathways
The claim being discussed is not that they didn’t account for it, but that they didn’t attempt to account for it. Reading that text, I think they did attempt it, but chose not to include it (I guess because they didn’t need it to make their point and, by not including it, prevented opponents from disputing the validity of the result based on uncertainties in those models).
Any business which exports, especially to Canada (because, oddly, between tariffs and repeated threats of invasion, US products and services are not seen in a positive light), and likewise any business up- or downstream of a mostly immigrant workforce.
I’ve become ambivalent about RELAX NG. I used it a bunch because I like(d) the model, the “compact” syntax is really quite readable, and it’s a lot simpler than XML Schema.
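For anyone unfamiliar, the compact syntax looks roughly like this (a toy address-book schema; the element names are made up for illustration):

```rnc
element addressBook {
  element card {
    element name { text },
    element email { text }?
  }*
}
```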
However, the error messages, at least when doing RNG validation via libxml2, are absolutely useless, so when you have a schema error, finding out why tends to be quite difficult. I also recall that allowing foreign schema content inside your document without validating it, while still validating your own schema, is a bit of a hassle.
> By default, jit_above_cost parameter is set to a very high number (100'000). This makes sense for LLVM, but doesn't make sense for faster providers. It's recommended to set this parameter value to something from ~200 to low thousands for pg_jitter (depending on what specific backend you use and your specific workloads).
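Concretely, that recommendation amounts to something like this (the values are illustrative, and `pg_jitter` and its backends are as described in the quote, not verified here):

```sql
-- Stock default: JIT only kicks in for very expensive plans,
-- which suits LLVM's compilation overhead.
-- jit_above_cost = 100000

-- With a faster JIT provider, lower the threshold so cheaper
-- queries also get compiled (tune per backend and workload):
SET jit_above_cost = 500;
```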
Oracle’s wasn’t, but I haven’t used it in a very long time, so that may no longer be true.
The problem, though, was that it had a single shared pool for all queries and could only run a query if it was in the pool, which is how our DB machine would max out at 50% CPU and bandwidth. We had made some mistakes in our search code that I had told the engineer not to make.