Among the hundreds of thousands of lines of code that Anthropic produced, one leaked the source code. It was likely a config file, not part of the Claude Code software itself, but it is still something to track.
The more lines of code you have, the more likely it is that one of them is wrong and goes unnoticed. That results in bugs, vulnerabilities... and leaks.
Drinking alcohol is probably way worse, but you can choose not to drink; you can't choose to live a normal life and avoid microplastics.
Also, alcohol has existed forever and humans have been drinking it since the beginning of civilization. We have a pretty good idea of what it does and how to keep it under control. Microplastics are a recent thing; they may be a dud, but they may also be a serious problem for future generations, so keeping an eye on them is a good thing.
Which might be the correct answer! Something that's extremely hard to undo should have us doing much more than just keeping an eye on it. We should have tons of research projects running on this.
Soviet engineering wasn't sloppy. It was designed for robustness, loose tolerances and simplicity. It was well-thought-out design. As much thought went into the cheap alarm clock as went into the Rolex watch, maybe even more; the engineers just had different requirements.
It takes a lot of work to make cheap, low precision parts work together reliably. The Rolex has it easy, all the parts are precisely built at a great cost and everything fits perfectly. With the cheap alarm clock, you don't know what you will get, so you have to account for every possible defect, because you won't get anything better with your budget and the clock still needs to give you an idea about what time it is.
The parallel in software would be defensive programming, fault tolerance, etc. Ironically, those are common practices in critical software, and it is the most expensive kind of software to develop: the opposite of slop.
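To illustrate, here is a minimal sketch of that defensive style in C (the function name and scenario are made up for the example, not taken from any real codebase): nothing the caller passes in is trusted, every argument is checked, and bad input produces a reported error instead of a crash or an out-of-bounds read.

    #include <stdio.h>
    #include <stddef.h>

    /* Hypothetical example of defensive programming: the function refuses to
       trust its inputs and fails gracefully instead of invoking undefined
       behaviour. */
    int read_sample(const int *samples, size_t count, size_t index, int *out)
    {
        if (samples == NULL || out == NULL)
            return -1;              /* reject null pointers */
        if (index >= count)
            return -1;              /* reject out-of-range access */
        *out = samples[index];
        return 0;                   /* success */
    }

    int main(void)
    {
        int samples[] = { 12, 15, 14 };
        int value;

        if (read_sample(samples, 3, 1, &value) == 0)
            printf("sample 1 = %d\n", value);   /* valid read succeeds */

        if (read_sample(samples, 3, 7, &value) != 0)
            printf("rejected invalid index\n"); /* bad read fails safely */

        return 0;
    }

The cheap-alarm-clock equivalent: assume the worst about every part you are handed and still produce something usable.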
There's a narrative that gets passed around in physics circles about how the Soviets were better at finding creative and analytical solutions than Americans, because of the relative scarcity of computing versus intellectual labour resources.
It would make sense to me that a parallel mechanism could apply to Soviet engineering. If material and technologically advanced capital are scarce, but engineers are abundant, you would naturally spend more time doing proper engineering, which means figuring out how to squeeze the most out of what you have available.
The problem is more about how it is reported to the public. Science is ugly, but when a discovery is announced to the public, a high level of confidence is expected, and journalists certainly act like there is. Kind of like you are not supposed to ship untested development versions of software to customers.
But sometimes some of the ugly science gets out of the lab a bit too soon, and it usually doesn't end well. People get their hopes up, and when it doesn't live up to the hype, they get confused.
It really stood out during the covid pandemic. We didn't have time to wait for the long trials we normally expect, since waiting could mean thousands of deaths, so we had to make do with uncertainty. That's how we got all sorts of conflicting information and policies that changed all the time. The virus spreads by contact, no, it is airborne; masks, no masks; hydroxychloroquine, no, that's bullshit, etc... that sort of thing. That's the kind of thing that usually doesn't get publicized outside of scientific papers, but the circumstances made it so that everyone got to see it, including science deniers unfortunately.
Edit: Still, I really enjoyed the LK99 saga (the supposed room temperature superconductor). It was overhyped, and it came to its expected conclusion (it isn't one); however, it sparked widespread interest in superconductors and plenty of replication attempts.
> The problem is more about how it is reported to the public.
Yes and no.
From science communicators there's a lot of slop, and it's getting worse. Even places like Nature and Scientific American are making unacceptable mistakes (a famous one being the quantum machine learning black hole BS that Quanta published).
But I frequently see those HN comments on ArXiv links. That's not a science communication issue. Those are papers. That's researcher to researcher communication. It's open, but not written for the public. People will argue it should be, but then where does researcher to researcher communication happen? You really want that behind closed doors?
There is a certain arrogance that plays a role. Small sample size? There's a good chance it's a paper arguing for the community to study the effect at a larger scale. You're not going to start out by recruiting a million people to figure out if an effect might even exist. Yet I see those papers routinely scoffed at. They're scientifically sound, and laughing at them is as big an error as treating them like absolute truth, just erring in the opposite direction.
People really do not understand how science works, and they get extremely upset if you suggest otherwise. As if not understanding something they haven't spent decades studying implies they're dumb. Scientists don't expect non-scientists to understand how science works. There's a reason you're only a junior scientist after getting an entire PhD. You can be smart and not understand tons of stuff. I got a PhD and I'll happily say I look like a bumbling idiot even outside my niche, in my own domain! I think we've just got to stop trying to prove how smart we are before we're all dumb as shit. We're just kinda not dumb at some things, and that's perfectly okay. Learning is the interesting part. And it's extra ironic that the Less Wrong crowd doesn't take those words to heart, because that's what it's all about. We're all wrong. It's not about being right, it's about being less wrong.
- *: in the middle (better for things like multiplication), or high (better for things like C pointers)
- Alignment of =, >, -: some fonts align -, = and > so that "=>" and "->" look good, others will not, arguably making each character look better in isolation, and others will optimize for ligatures
- The "i" may look significantly different, some will prioritize consistency, others will prioritize making il1I look distinct. Same idea for 0/O
- Aspect ratio: do you want a wide font, making alignment, indentation, and special characters clearer, or a narrow font, allowing you to cram longer lines into a single screen?
These are compromises, and depending on your style and language, you may prefer one or the other.
> Theft from the outside world, however, is often taken lightly - especially when it comes to graphics.
One should not forget where the demoscene comes from: crackers. The whole point of "intros" was to show off the skills of whoever cracked a piece of software. So obviously, the views the demoscene held on intellectual property were not mainstream, if we can put it like that.
The shift to a more creative and law-abiding art scene, led by adults rather than rebellious teenagers, is a more recent development.
I think that very initially it was indeed so: crackers were the ones doing the intros. But the effort quickly got split; most of the time, the person doing the intro, the person doing the music, and the person doing the actual crack were different people. I don't think it took very long for this split to become the norm, even though very early on I'm sure there were individuals doing all three pieces alone.
Yes, of course, I wasn't trying to claim the music/graphics was stolen by the cracker, or vice versa, just that "show off the skills of whoever cracked a piece of software" doesn't really accurately represent how the teams were composed, since they were different people.
Exactly, I got into the scene in the mid 90s and none of us were into cracking at all. I mean, if you have the skills to code a demo you could probably crack most software, but at that point it was about taking a piece of hardware and making it do what most thought wasn't possible.
And what exactly do you think "mainstream" opinion on intellectual property is? To me it seems most people are fine with ignoring it unless they are the ones profiting.
They had pretty good results post-WW2. The problem is that they ended up lagging behind the Western bloc because of a lack of resources and innovation. Basic healthcare doesn't mean much if you don't have good treatments in the first place. It is a common problem with communist countries: they usually have good access to healthcare, but they don't have the resources to provide proper treatment.
I hate this argument. Every time there is some big and expensive technical achievement, someone is going to say that the poor are dying somewhere in the world. As if not going to the moon would have saved them.
I would argue that a healthy population is what allows great things like Apollo to happen. For such a program to succeed, we need lots of highly skilled people. Scientists, engineers, astronauts, tradesmen, managers, etc... Everyone needs to be at the top of their game. Such talent doesn't develop when you are struggling for your life; you need good conditions like health, comfort and stability to be able to focus on your craft.
If we use life expectancy as a proxy, we could say that the US had a healthier population during the cold war than the USSR, and they are the ones who succeeded at the most ambitious project in the space race, despite the USSR having a head start. To me, that is not a coincidence.
Also, the cold war era was not just about space; it was also a time of major advances in medicine. Life expectancy saw a dramatic improvement, so we can put men on the moon and keep a population healthy at the same time.
An awful lot of stuff in the "hand made" aesthetic is made by machine and factory too, and I suspect a similar thing will happen to any popular writing aesthetic that attempts to avoid being automated away.
Personally, I'll just continue to use my own voice. I try to correct spelling and grammar mistakes, and proof-read my writing before posting.
It's not perfect, and my writing can at times be idiosyncratic, but it's my voice and it's all I've got left.
But don't be mistaken in thinking that those mistakes make it better; they just make it mine.