Hacker News: bob1029's comments

Valve originally devised this technique in 2007:

https://steamcdn-a.akamaihd.net/apps/valve/2007/SIGGRAPH2007...


I originally had that near the top but in one of the rewrites I moved the papers to the end — https://www.redblobgames.com/articles/sdf-fonts/#appendix

> BTW, I think a lot of people were/are greatly overestimating the value of coding to business success.

I've frequently argued to my organization's leadership that the product could be open source on GitHub with a flashing neon sign above it and it wouldn't change anything about the business. A competitor stealing our codebase would probably be worse off than if they had done anything else. Conway's law and all that.


The problem wouldn’t be your competitors cribbing your ideas. It would be more like letting anyone with a bone to pick audit you for minor compliance violations, customers relying on internal implementation details or judging you unfairly for legacy horrors, or devs getting self-conscious about their sloppy 2am fix and prolonging an outage for rational public image/ego reasons.

I am starting to believe that OAI might actually succeed at getting per token inference cost to where it needs to be. Or that it's already there in principle.

Wafer scale compute is a very big deal. Most of HN is probably still unaware that you can get tokens out of one of these devices right now via public API offerings.


What are the chances this isn't intentional to some extent? This wouldn't be the first time we've traded downstream legal trouble for short term gains.

Making AI utilization appear to go up is the only thing that matters right now if you're in the boardroom at one of these companies. Whether or not that utilization was actually intended by the customer is entirely irrelevant. From here, the only remaining concern is mitigating legal issues, which Google seems to be immune to.


Does anyone really believe something like this?

There's a long stretch from over-optimizing a UI to something that is very clearly an error, like what has happened here.


I save $20/mo on my internet by having cable that I don’t watch. Why? So my telecom company can boast higher tv subscriber counts to shareholders and ad-networks.

It is entirely believable to me that a company like Google would do the same with AI use numbers. I suspect that all these AI use factors in corporate performance reviews are about the same thing.

This could be a standard oversight too; I find Google’s documentation on this stuff to be byzantine.


Tax ids were never meant to be used as a form of global identification. If you go look in a real bank core, you'll find this field does not have any uniqueness constraints.

Why not? Two people with the same tax ID seems like a problem waiting to happen.

These schemas also support non-individuals with the same fields. EINs have a lot more edge cases than SSNs.
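A toy illustration of the point about uniqueness constraints (the schema here is hypothetical, not an actual bank core layout): without a UNIQUE constraint on the tax-ID column, nothing in the database stops two parties from sharing one.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Note: no UNIQUE constraint on tax_id, mirroring the claim above.
db.execute("CREATE TABLE party (id INTEGER PRIMARY KEY, name TEXT, tax_id TEXT)")
db.executemany(
    "INSERT INTO party (name, tax_id) VALUES (?, ?)",
    [("Alice Example", "123-45-6789"),   # an individual
     ("Acme LLC",      "123-45-6789")],  # a non-individual, same ID
)
count = db.execute(
    "SELECT COUNT(*) FROM party WHERE tax_id = '123-45-6789'"
).fetchone()[0]
print(count)  # 2: both inserts succeed without error
```

Treating the tax ID as a lookup key rather than a global identifier is what makes the individual/non-individual overlap workable.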

Out of all of the cloud providers, I find Microsoft's authentication stack to be the most legible and stable. Everything else really sucks though.

You know things are bad when Microsoft is the most stable...

As someone who has used very many "cloud providers" (including GCP, AWS, and Azure), it cannot be said that Azure is the most stable. GCP is far better for stability and reliability than Azure.

The extensive experience with Enterprise Authentication that the decades of use of Active Directory has given Microsoft may mean that their SSO and Enterprise Authentication stuff is the best out of those on offer. I wouldn't know about that... I just made (and destroyed) VMs and was often driven to frustration whenever Azure failed to reliably perform that simple task.


I think something like OAuth might help here. Modeling each "claw" as a unique Client Id could be a reasonable pattern. They could be responsible for generating and maintaining their own private keys, issuing public certificates to establish identity, etc. This kind of architecture allows you to control the scope and duration of agent access much more precisely. The certificates themselves could be issued, trusted & revoked on an autonomous basis as needed. You'd have to build an auth server and service providers for each real-world service, but this is a one-time deal, and I think big players might start doing it on their own if enough momentum picks up in the OSS community.
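The per-agent client-credentials idea above can be sketched roughly like this (all names and scopes are hypothetical, and a real deployment would use asymmetric keys and a vetted JWT/OAuth library rather than this stdlib HMAC stand-in):

```python
import base64
import hashlib
import hmac
import json
import secrets
import time

def b64url(data: bytes) -> str:
    # JWT-style base64url encoding, padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_agent_token(client_id: str, client_secret: bytes,
                      scopes: list[str], ttl_seconds: int = 300) -> str:
    """Mint a short-lived, narrowly scoped token for one agent ("claw")."""
    now = int(time.time())
    header = {"alg": "HS256", "typ": "JWT"}
    claims = {
        "sub": client_id,
        "scope": " ".join(scopes),      # precise scope control per agent
        "iat": now,
        "exp": now + ttl_seconds,       # precise duration control
        "jti": secrets.token_hex(8),    # handle for a revocation list
    }
    signing_input = (b64url(json.dumps(header).encode()) + "." +
                     b64url(json.dumps(claims).encode()))
    sig = hmac.new(client_secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

token = issue_agent_token("claw-7f3a", b"per-agent-secret",
                          ["calendar:read", "mail:send"])
```

Because each agent carries its own client id and credential, a leaked or misbehaving agent can be revoked without touching any of the others.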

Vogtle is probably producing the most electricity out of any generating plant in the US once you consider capacity factor.

Vogtle is also the most expensive electricity in the world, the only electricity costing more than $10,000 per kW.

And on the other end of the spectrum, Grand Coulee would be ~$1,500/kW in today's dollars.

Those are very different metrics.

edit: Parent got edited; it was talking about $0.02/kWh initially.


Vogtle won't stay the most expensive. My idiotic government (Ontario, Canada) is committing to building a new nuclear plant. $400 billion for 10GW, and that's before the inevitable delays and cost overruns. Maybe we'll break the $100,000 per kW mark!
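Taking the figures in that comment at face value, the sticker price alone works out as follows (currency left as quoted; overruns not included):

```python
cost = 400e9          # quoted headline cost: $400 billion
capacity_kw = 10e6    # 10 GW expressed in kW
per_kw = cost / capacity_kw
print(per_kw)  # 40000.0 dollars per kW before any overruns
```

So the pre-overrun figure already sits at $40,000/kW; it would take roughly a 2.5x overrun to clear the $100,000/kW mark mentioned above.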

My worst technology experience of all time was maintaining support for a Zebra label printer in VB6. I can assure you that the users of these printers had maybe 1% the cortisol response I did when something went wrong.

Designing software for a printer means being a very aggressive user of a printer. There's no way to unit test this stuff. You just have to print the damn thing and then inspect the physical artifact.


Worked on printer firmware, can confirm.

"If it looks good, it is good." was a mantra.


A million years ago I worked on some code which needed to interface with a DICOM radiology printer (the kind that prints on transparency film). Each time I had to test it I felt like I was burning money.

I'd like to see a histogram of my HN em dash usage over time. Maybe someone could get bored and visualize the 2nd order effects described here.

