Seems like there's a good argument to be made that we'll have plenty of opportunities for valuable growth and learning, just about different things. Just like it's always been with technology. The machine does some of the stuff I used to do so now I do some different stuff.
It's just much more visually interesting than a page full of perfect burgers. Each one looks like a unique thing from the real world; they don't "look AI", as the kids say these days.
Predictability matters. The whole point of paying someone else to handle a problem for you is that you don't have to worry about it. If you go all in on a provider and then suddenly find out you've been switched to a paid plan in the middle of your vacation, that's not a place anyone wants to be. Saying there's no lock-in is nice, but that overlooks the fact that there most definitely is friction. What if there's no mass export? No mass import? Or you need to reset 2FA? There are a thousand things that can shoot you in the foot, especially if you have a lot of services to migrate.
It's impossible to generalize over free vs. paid with regard to predictability. For example, a provider I paid for simply disappeared once while I was quite busy, yet my old free Gmail account still works. Realistically, CF's free tier is more predictable than many paid options on the market.
My threat model here focuses on what the provider gets out of the free tier. Cloudflare gets a broad view into activity on the internet for building the models they use for their paid offerings. Free Gmail puts people on a path into Google's ecosystem at basically zero marginal cost.
>What if there's no mass export? No mass import? Or you need to reset 2FA?
1. For DNS we have standardized zone transfers (AXFR), which the DNS provider needs to support since they're part of the DNS standard. Not supporting that isn't an option unless you have a really shitty provider, in which case you should switch anyway.
2. Same goes for mass import, because again, DNS already defines these things at the protocol level.
And resetting 2FA or whatever is just the cost of using any service.
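For the curious: AXFR (RFC 5936) is just an ordinary DNS question with QTYPE=252, usually carried over TCP, which is why any standards-compliant provider can serve it. Here's a rough stdlib-only sketch of the wire format (the zone name and query id are illustrative, not tied to any particular provider):

```python
import struct

def build_axfr_query(zone: str, query_id: int = 0x1234) -> bytes:
    """Build the wire-format DNS query for an AXFR (full zone transfer)."""
    # Header: id, flags (standard query), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack("!HHHHHH", query_id, 0x0000, 1, 0, 0, 0)
    # Question: zone name as length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in zone.rstrip(".").split(".")
    ) + b"\x00"
    # QTYPE=252 (AXFR), QCLASS=1 (IN)
    question = qname + struct.pack("!HH", 252, 1)
    return header + question

msg = build_axfr_query("example.com")
# Over TCP, the message is preceded by a two-byte length prefix:
tcp_payload = struct.pack("!H", len(msg)) + msg
```

In practice you'd just run `dig axfr` against your provider's nameserver rather than speak the protocol by hand, but the point stands: the export format is defined by the protocol itself, not by the provider.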
Personally, I've used CF for ~10 years, so I've saved $240, and I simultaneously use GitHub Pages and CF Pages as CDNs because, again, I just need to hand them a bunch of static files. Adding a third CDN provider would literally be a single command at the end of my build pipeline.
For personal projects, I'd rather just pay $2/month and not think about it than get hit with a random bill and scramble to migrate before the next month's bill. Bunny is perfect for this use case where you have a handful of projects that aren't all actively maintained. It just works without hand-holding, and since you're paying for the service, there's no rugpull looming.
The biggest bill I've gotten from Bunny was like $10 when my app (https://atlasof.space) briefly went viral and got 100k+ views in a month. Bunny CDN is so reasonably priced and the realistic visitor ceiling for my projects is low enough that it's still negligible. The free->paid cliff is typically a lot steeper than this in my experience.
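The back-of-the-envelope math here is easy to sanity-check yourself. All numbers below are illustrative assumptions (not Bunny's actual pricing or this site's real transfer sizes), but they show why a traffic spike on pay-per-GB pricing stays in "negligible" territory:

```python
# Hypothetical CDN cost estimate for a viral month.
views = 100_000        # page views in the month (from the comment above)
mb_per_view = 2.0      # assumed average transfer per view -- illustrative
price_per_gb = 0.01    # assumed $/GB rate -- illustrative, check real pricing

gb_transferred = views * mb_per_view / 1024
cost = gb_transferred * price_per_gb
print(f"~{gb_transferred:.0f} GB -> ~${cost:.2f}")
```

Even if the real per-GB rate or page weight is several times higher, the bill lands in single-digit dollars, which matches the ~$10 experience described above.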
> In order to keep your service online, you are required to keep a positive account credit balance. If your account balance drops low, our system will automatically send multiple warning emails. If despite that, you still fail to recharge your account, the system will automatically suspend your account and all your pull zones. Any data in your storage zones will also be deleted after a few days without a backup. Therefore, always make sure to keep your account in good standing.
You replenish your balance proactively, so in the worst case you can just let the account go.
I used to handwave away cloud portability. It turns out that when you're shipping things, need extra services, and have deadlines, you build against the platform. I think the GP comment was probably expressing wariness of the free Cloudflare tier that entices you to build against their APIs and their product shape in a way that inevitably locks you in. Sure, you could migrate, but that's expensive.
Yeah, good point. For a little hobbyist site of no importance, I'm not too worried about vendor lock-in, but that calculus changes as it gets more important.
That's the catch, though. By the time you're scaling, there's tension between roadmap and revenue and headcount, and it's the worst possible time to need to rearchitect.
I didn't downvote it, but I don't think migrating away from Cloudflare Workers, R2, D1, etc., is going to be that easy. Basically, they build these things from the ground up to work optimally on their infra; even the mental model you have to use is different. If you only narrowly use one part of it, maybe.
I never saw that bug, I don't think, but there was one where it had to start the response before you switched away. Thankfully, that's been fixed for a few weeks now.
Bummer of an experience. About a month ago I bought an unlocked 8a from a seller on Swappa and had no issue getting GrapheneOS (which works well!) on it.
It'd be interesting to prompt it to do the same job but try to be innovative.
To your point, yeah, I mostly don't want AI to be innovative unless I'm asking for it to be. In fact, I spend much more time asking it "is that a conventional/idiomatic choice?" (usually when I'm working on a platform I'm not super experienced with) than I do saying "hey, be more innovative."
Yeah, I'd love to find time to. But I think that's also a "later stage" thing. If you want to come up with novel optimizations, for example, it's better to start with a working but simple compiler, so it can focus on a single improvement. Trying to innovate on every aspect of a compiler from scratch is an easy way to get yourself into a quagmire that takes ages to get out of, as a human as well.
E.g. the Claude compiler uses SSA because that's what it was directed to use, and that's fine. Following up by getting it to implement a set of the conventional optimizations, and then asking it to research novel alternatives to SSA that allow the existing optimizations (and additional ones) to be reimplemented, showing it can get better results or simpler code, would be a really interesting test that might be possible to judge objectively enough (e.g. code complexity metrics vs. benchmarked performance). Validating correctness of the produced code gets a bit thorny, though the same approach of compiling major existing projects that have good test suites is a good start.
If I had unlimited tokens, this is a project I'd love to do. As it is, I need to prioritise my projects, as I can hit the most expensive Claude plan's subscription limits every week with any of 5+ projects of mine...
Yes. I'll go down a wrong path in 20 minutes that'd have taken me half a day to go down by hand, and I keep having to remind myself that code is cheap now (and the robot doesn't get tired) so it's best to throw it away and spend 10 more minutes and get it right.
> No install fuss — download and start designing immediately.
also
> Gatekeeper blocks the app immediately. You'll see either "TUIStudio cannot be opened because it is from an unidentified developer" or "TUIStudio is damaged and can't be opened" on newer macOS after quarantine flags the binary.
To get past it: right-click the .app → Open → Open Anyway, or go to System Settings → Privacy & Security → "Open Anyway".
It'd be interesting (earnestly!) to see someone make a solid case that AI reimplementation is bad, but that the original (afaik) "clean room" project, Compaq's reimplementation of IBM's PC BIOS (something most people seem to see as a righteous move toward openness and freedom), was good.