Hacker News | najarvg's comments

That's what I thought too. It's written in a curmudgeonly style for curmudgeonly readers, similar to the author's other Substack posts. I don't think the author wrote this for a broad audience or expected it to end up on HN lol


This is a great effort, and thanks for it. Any reason why the NBC article was linked rather than the tool itself? I found the tool link inside the article, but I was curious why the article was linked and whether pasting a link to the tool in the comments would violate any HN terms.


Pocket Pal is what I've seen used before. I recently heard about "Off Grid" as well, but I haven't read any reviews or tried it personally, so caveat emptor. I'll see if the community has other suggestions.


Thanks for sharing the link to your instance. It was blazing fast in responding. I tried throwing a few things at it, with the following results:

1. Generate an R script that takes a city and country name, finds its lat/long, and maps it using ggmaps. It generated a pretty decent script (could be more optimal, but impressive for the model size), with warnings about using GeoJSON if possible.

2. Generate a LaTeX script to display the Gaussian integral equation. It generated a (I think) non-standard version using probability distribution functions instead of the general version, but I still give it points for that. It also explained the formula and its parameters, along with instructions on how to compile the script from bash, etc.

3. Generate a LaTeX script to display the Euler identity equation. This one it nailed.
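For reference, the "general version" of the Gaussian integral alluded to above, versus the probability-density form the model apparently produced, might look like this in LaTeX (my own sketch, not the model's actual output):

```latex
% General (standard) form of the Gaussian integral
\[ \int_{-\infty}^{\infty} e^{-x^{2}} \, dx = \sqrt{\pi} \]

% Probability-density form: the normal PDF integrates to 1
\[ \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}
   e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}} \, dx = 1 \]

% Euler's identity, for comparison with item 3
\[ e^{i\pi} + 1 = 0 \]
```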

Strongly agree that the knowledge density is impressive for a 1-bit model of such a small size, with blazing fast responses.


> Was blazing fast in responding.

I should note this is running on an RTX 6000 pro, so it's probably at the max speed you'll get for "consumer" hardware.


consumer hardware?

That... pft. Nevermind, I'm just jealous


Look it was my present to myself after the Figma IPO (worked there 5 years). If you want to feel less jealous, look at the stock price since then.


Well in this context it's a 5090 with extra unused memory.


Holy hell ... that's a monster of a card


I must add that I also tried the standard "should I walk or drive to the carwash 100 meters away for washing the car" question, and it made the usual error of suggesting a walk given the distance and health reasons, etc. But then this does not claim to be a reasoning model, and I did not expect, even in the remotest case, for it to answer correctly. Even previous-generation larger reasoning models struggle with this.


I ran it through a rudimentary thinking harness, and it still failed, fwiw:

    The question is about the best mode of transportation to a car wash located 100 meters away. Since the user is asking for a recommendation, it's important to consider practical factors like distance, time, and convenience.

    Walking is the most convenient and eco-friendly option, especially if the car wash is within a short distance. It avoids the need for any transportation and is ideal for quick errands.
    Driving is also an option, but it involves the time and effort of starting and stopping the car, parking, and navigating to the location.
    Given the proximity of the car wash (100 meters), walking is the most practical and efficient choice. If the user has a preference or if the distance is longer, they can adjust accordingly.


And to be fair, you asked about traveling to a location. It just so happens that location is a car wash. You didn't say anything about wanting to wash the car; that's an inference on your part. A reasonable inference based on human experience, sure, but still an inference. You could just as easily want to go to the car wash because that's where you work, or you are meeting somebody there.


Honestly, the fact that we have models that can coherently reason about this problem at all is a technological miracle. And to have it runnable in a 1.15GB memory footprint? Is insanity.


Exactly. It's not that the pig dances poorly, or that the dog's stock tips never seem to pan out. It's the fact that it's happening at all.


But the fact that we have convinced a pig to dance, and trained a dog to provide stock tips? That can be improved upon over time. We've gotten here, haven't we? It really is a miracle, and I'll stick to that opinion.


I ran into this too. I'm on a work computer, so I did not want to accept it without knowing more.


This is a great project! Is there a place to look up the list of built-in lexers to see which languages the editor supports? I'll forward it to my more hands-on dev teammates.


Lexing is handled by one of Mitchell's other projects, Scintillua. You'll find the source for all the built-in lexers in there. https://github.com/orbitalquark/scintillua

The documentation for Scintillua also gets pulled into Textadept's API documentation as a dependency, so the syntax is also explained there. It's basically a bridge between Scintilla's native lexing and LPeg.


Grandpa rats shaking their paws in anger going "It's all of 'em darned video games spoiling this generation of rats! Back in my days..."


If you add in the $1000 that the Treasury plans to invest starting next year, that's $1,250, which compounds at 5% annually to $3,008.27 after 18 years. It's probably still not much of a "head start", given that inflation is assumed to run at 2.5 to 3.5% nominally per year and will take a bite out of the real value in 18 years. Good intentions, but misplaced, as others have stated. Investing in other ways to provide upward economic mobility would deliver a much better ROI for society than allowing most of the wealth to accrue to a handful of people.
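The headline arithmetic here checks out; a quick sanity check of the compounding (my own check, not from the comment):

```python
# Future value of a lump sum: FV = PV * (1 + r)^t
principal = 250 + 1000   # $250 seed plus the $1000 treasury contribution
rate, years = 0.05, 18
fv = principal * (1 + rate) ** years
print(round(fv, 2))      # 3008.27
```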


> If you add in the 1000$ that treasury plans to invest starting next year, that is $1250

This is largely separate from your point, which is good, but the $250 is for kids that won't get the $1000. The $1000 only goes to kids born between 2025 and 2028.


Real quick: with the $1000 530A account, if you put in just $1/day ($30/mo) on top of that account, you get out ~$12,000 at the end of 30 years (assuming a 5% interest rate). Which, yeah, is enough to start a very small business (lawn care, blacksmithing, etc.).

The stock market has returned ~9.5% historically and inflation has run at ~3% historically, so assume a bit higher at 6.5%, and that $1000 with a dollar a day on top becomes ~$14,800, inflation-adjusted.

If you go up to ~$100/mo at 6.5%, then you get ~$42,000, which is an honest start to a small business or college tuition.

The little extra per month really adds up here!
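The "little extra per month" effect can be sketched with a standard future-value calculation. This is a toy model of my own that assumes monthly compounding of monthly contributions (the ordinary-annuity formula); actual account rules and compounding conventions will differ, so the exact dollar figures won't line up with the comment's:

```python
def future_value(principal, monthly, annual_rate, years):
    """Future value of a lump sum plus monthly contributions,
    compounded monthly at annual_rate."""
    r = annual_rate / 12
    n = years * 12
    lump = principal * (1 + r) ** n
    if r == 0:
        contrib = monthly * n
    else:
        # Ordinary-annuity future value of the monthly contributions
        contrib = monthly * (((1 + r) ** n - 1) / r)
    return lump + contrib

# $1000 seed, then $30/mo at 5% vs $100/mo at 6.5%, over 30 years
print(round(future_value(1000, 30, 0.05, 30)))
print(round(future_value(1000, 100, 0.065, 30)))
```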

I may not like the administration for a lot of things, but this is one thing that I can really get behind.


Fascinating. Thanks for sharing! Some time back, I ran into a related experiment where the author set up a simple one-layer NN with shift-register feedback and explored the state space of neuron activations over many iterations. The observation was beautiful: the state-space maps traced out attractors. See here if you are curious - https://towardsdatascience.com/attractors-in-neural-network-...
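The setup described (a single neuron whose output is fed back through a shift register) is easy to toy with. Here is my own minimal reconstruction, not the linked author's code; the register size, random weights, and tanh nonlinearity are all my assumptions:

```python
import numpy as np

def step(state, w):
    """One iteration: the neuron reads the whole register, and its
    output is shifted in at the front (the oldest value drops off)."""
    out = np.tanh(w @ state)
    return np.concatenate(([out], state[:-1]))

rng = np.random.default_rng(42)
n = 16
w = rng.normal(size=n)     # fixed random weights for the single neuron
state = rng.random(n)      # random initial register contents

trajectory = [state]
for _ in range(2000):
    state = step(state, w)
    trajectory.append(state)

# Plotting successive activations against each other
# (e.g. column 0 vs column 1) traces out the attractor structure.
points = np.array(trajectory)[:, :2]
```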


Also beautiful. Thank you for sharing it on HN.


An on-the-fly 3D-printing gun that uses a biocompatible thermoplastic to heal complex bone fractures.

