So, I don't know if this is AI generated or whether the author is actually unaware, but Atari cartridges and floppies commonly had copy protection. My uncle was active in the scene at the time and, as an electrical engineer, came up with a solution. When I inherited his Atari 800 in the 90s there was a physical button wired into the floppy drive which would force a bad sector onto the disk as it was being written. He had notebooks about the timing for these bad sectors per game.
So, yeah. The "article" is incorrect from nearly the get-go about the "wild west" Atari age.
No, it is not AI generated. It was based on my research.
I think there is a mix-up here between Atari home consoles and Atari home computers.
In that section I was talking about early console platforms such as the Atari 2600, where the cartridge interface itself had no lockout/authentication mechanism comparable to what Nintendo later did with the 10NES. That is why third-party cartridges could exist and Atari’s main response was legal rather than technical.
What you describe for the Atari 800 is real, but it belongs to a different context: the Atari 8-bit computer line, especially floppy-disk software, where copy-protection tricks such as intentional bad sectors and timing-based checks were indeed common.
So I agree that Atari computer software often used copy protection, but that does not contradict the point I was making about the early console era.
Hi, quick note on "For modern Xbox platforms, public 2024 work exposed SystemOS kernel exploitation on both Xbox One and Xbox Series"
I'm a former Xbox hacker and former Microsoft employee, and, long after leaving Microsoft, helped with the Collateral Damage post-exploitation payload.
The design of the Xbox One security predates me, but Microsoft always knew that SystemOS would be a weak link that was almost guaranteed to be compromised, so they shoved most of the trivially attackable surface in there. The system shell, 3rd-party apps, guide, etc. all run in SystemOS.
The key things they focused on though were:
1. Extremely strong defense-in-depth
2. Making full or partial exploitation not economical
3rd party apps and the web browser were seen as being obviously untrusted _and_ needed JIT because they'd mostly be based on .NET or the JS VM. But practically speaking there should be nothing interesting in that VM: its compromise shouldn't enable piracy/cheating and ideally shouldn't leak game plaintext.
What some others found, though, was that for some reason game plaintext was actually visible to SystemOS, even though that didn't enable piracy on console. You can take those games and run them on PC using XWine1: https://github.com/xwine1
Technically speaking there's no reason why Collateral Damage couldn't have happened waayyyyy earlier in the Xbox One's lifecycle except for motivation. Even still, you could probably take some Hyper-V N-day and compromise HostOS through it.
Over the years there have been other "exploits" too: some folks have managed to tamper with gamesaves via cloud-connected storage and other shenanigans, XSS in the system shell (some of these apps are JS), etc., but most of this was relatively benign and easily patchable. And there has been a very, very small group of people with similar but less capable exploits than Collat.
I find it interesting that all the way back in 1985, in Atari vs NES, we had proof that consumers preferred walled gardens. The walled garden exploded from a completely dead market, while the already-existing open system killed itself. Apple proceeded to make a killing of their own on this reality, Microsoft invented a pseudo-walled garden that has become a technical dead end, while FOSS communities are still in denial about how things shouldn't be that way rather than accepting reality and inventing their own curated experience with enforced rules.
I disagree, it wasn't about consumers, but rather other businesses. The walled garden approach Nintendo took in America was needed to convince retailers to stock video games on store shelves again. And of course the Famicom didn't have that same approach, and while Nintendo hated the fact third parties could easily make Famicom carts, the open nature of the system certainly didn't hurt it in Japan.
I think consumers chose quality and convenience. It just so happens that the walled garden is the easiest way to accomplish this. Electronics, especially computers, were extremely expensive back then. I can't blame people for buying a console that just works. Compatibility was an issue well into the late 90s because so many people didn't know how computers worked.
Windows is an open platform for developers... if you ignore all of the security checks and Windows Defender and the stagnant platform which is about 2 decades behind everyone else, across the board, in terms of native tooling (e.g. which UI framework should I use and is it good?).
However, Windows also has many, many, walled garden things bolted onto it. You aren't distributing your own drivers without Microsoft's approval. You aren't running Microsoft Office on Wine. You aren't connecting to Active Directory without Microsoft's blessing. You aren't making group policies that work on Linux for MDM. You aren't manufacturing Windows devices, at all, unless they meet Microsoft's system requirements and mandates (e.g. a Windows icon on the keyboard). Your BIOS must follow strict rules about where the activation key is fused. Etc.
In that respect, Windows is only open from an end user perspective. In all other respects, it is closed, and it is closed tightly.
> You aren't distributing your own drivers without Microsoft's approval.
Only kernel drivers.
> You aren't connecting to Active Directory without Microsoft's blessing.
I think you're talking about EntraID. That is true enough. You can just spin up Windows Server and create a domain controller, no problem. You don't need Microsoft for domain services, though - you can use other domain controller types. (You don't get GPO and other things - that's not a 'walled garden' thing, that's a feature set which other systems don't have)
> In that respect, Windows is only open from an end user perspective. In all other respects, it is closed, and it is closed tightly.
Not so tight as you seem to think. And anyways, I was specifically referring to building Windows apps - which you did not disagree with. You absolutely can pull down various free tools, build an app, package it up as a .zip or .msi, and distribute it from a variety of places. The Windows app store is a walled garden, but you don't have to use it.
I've tried a few models and some are decent, including Qwens models. I've tried a few harnesses like Roo Code in VSCode to put things together that in theory emulate the experience I get from VSCode + Claude or Copilot, but I generally find the experience extremely limited and frustrating.
How have you set things up to have a good experience?
I have to disagree. LLMs have shown that the only way to participate in the new software ecosystem is through leveraging an extremely powerful position that is created, backed, and maintained through the exploitation of capital, labor, and power (political, legal, corporate) at levels never really seen before. The model of the Cathedral and the Bazaar was not broken by LLMs; instead, the entire ecosystem was changed.
Now the software doesn't matter. The code doesn't matter. The hardware doesn't matter. Anyone can generate anything for anything, as long as they pay the fee. I think it can likely be argued that participation is now gated more than ever and will require usage of an LLM to keep up and maintain some kind of competition or even meager parity. Open weight models are not really a means of crossing the moat; none of the open weight models come close to the functionality, and all of them come from the same types of corporations that are releasing their models for unspecified reasons. The fact remains that the moat created by LLMs for open source software has never been larger.
The OpenTelemetry spec is absolutely what folks have been waiting for, for as long as I've been in computing (~20 years). A single standard that is implemented in nearly every popular language with very close feature parity. It's honestly wonderful to work with compared to the old vendor-supplied frameworks.
I took it upon myself to write a library for my current employer (4yrs ago now?) that abstracted and standardized the way our Rust services instantiated and utilized the metrics and tracing fundamentals that OpenTelemetry provides. I recently added OTLP logging (technically using tracing events) to allow forwarding baggage / context / metadata with the log lines. The `tracing` crate in Rust also has a macro called `instrument` that lets you mostly auto-instrument your functions for tracing: the tracing context is extracted and propagated into your function so the trace / span can be attached to subsequent HTTP / gRPC requests.
We did all kinds of other stuff too, like adding a method for attaching the trace-id to our kafka messages so we can see how long the entire lifetime of the request takes (including sitting on the queue). It's been extremely insightful.
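The Kafka trick boils down to context propagation via message headers. Here's a minimal stdlib-only sketch of the idea (the helper names are mine, not our actual library, and a real setup would use the OpenTelemetry propagator API): the producer injects a W3C `traceparent` header into the message's header map, and the consumer parses it back out so its spans join the same trace.

```rust
use std::collections::HashMap;

/// Build a W3C traceparent header value: version-traceid-spanid-flags.
fn make_traceparent(trace_id: u128, span_id: u64) -> String {
    format!("00-{:032x}-{:016x}-01", trace_id, span_id)
}

/// Producer side: inject the header into a message's header map
/// (a HashMap stands in for Kafka record headers here).
fn inject(headers: &mut HashMap<String, String>, trace_id: u128, span_id: u64) {
    headers.insert("traceparent".to_string(), make_traceparent(trace_id, span_id));
}

/// Consumer side: extract the trace id so new spans attach to the same trace.
fn extract_trace_id(headers: &HashMap<String, String>) -> Option<u128> {
    let value = headers.get("traceparent")?;
    let trace_field = value.split('-').nth(1)?;
    u128::from_str_radix(trace_field, 16).ok()
}

fn main() {
    let mut headers = HashMap::new();
    inject(&mut headers, 0xdeadbeef, 0x42);
    // The consumer recovers the producer's trace id from the message alone,
    // so queue time shows up as a gap between the producer and consumer spans.
    assert_eq!(extract_trace_id(&headers), Some(0xdeadbeef));
    println!("joined trace {:032x}", extract_trace_id(&headers).unwrap());
}
```

Because the trace id rides with the message rather than the request, the span timeline naturally includes the time the message spent sitting on the queue.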
Signoz is newer to the game. I'm glad there are more competitors and vendors using OpenTelemetry natively. We originally talked to some of the big vendors and they were going to gladly accept OpenTelemetry, but they marked every metric as a "custom" metric and would charge out the wazoo for each of them, far in excess of whatever was instrumented natively with their APM plugin thingamabob.
The more the better. I love OpenTelemetry, and using it in Rust has been mostly great.
That library you built sounds great. The kind of things that I love to read the code of, if I'm using it in a project.
I was torn on adding the `instrument` macro, but decided on manual instrumentation for the demonstration.
Regarding monitoring Kafka execution times, absolutely agreed.
In my previous job, monitoring Celery helped us understand consumer bottlenecks; before that, we couldn't see background job traces containing the Celery consumer spans.
And when they did appear, they were hours late. So the entire trace took 8 hours instead of the expected couple minutes.
This is your opinion. I do not share your opinion. The occult is a wide range of topics and practices, generally split (but not cleanly) into theurgic and thaumaturgic activities. That is, manifestation of the three common desires (wealth, power, love / sex etc.), and then deification and approaching and sometimes joining with / uniting with God. Occult meaning, hidden.
If you read many of the grimoires, there is very little NLP of any kind. The Papyri Graecae Magicae is one of the oldest explicitly magical documents we have from Greek Egypt, and it does have some manipulation spells (as most magical documents do), but none of this has to do with coercion to join a religion or join in a war, or to "do bad stuff". It's largely "technology" used by a practicing magician (a moonlighting Egyptian priest) to help the laity deal with their daily lives: helping their crops grow, keeping animals from getting sick, healing sick children, getting revenge on their neighbors and former lovers, etc.
Magic is always a tool in the hands of the oppressed as a response to tyrannical hierarchy.
From the very beginning of my tenure at my current "start-up" I wrote a bespoke Rust implementation using the base OpenTelemetry library with lots of opinionated defaults and company specifics. We integrated this early on in our microservice development, and it's been an absolute game changer. All of our services include the library and use a simple boilerplate macro to include metrics and tracing in our Actix and Tonic servers, Tonic client, etc. Logs are slurped off Kubernetes pods using promtail.
It was easy enough that I, as a single SRE (at the time) could write and implement across dozens of services in a few months of part-time work while handling all my other normal duties. OpenTelemetry has proved to be worth the investment, and we have stayed within the Grafana ecosystem, now paying for Grafana Cloud (to save our time on maintaining the stack in our Kubernetes clusters).
I would absolutely recommend it. I would recommend it and hopefully use it at any new future positions.
I adore Justin Sledge, despite him putting me to sleep quite a few times. If you want to learn a bit about the religious side of the esoteric (mysticism), I highly recommend Filip Holm https://www.youtube.com/@LetsTalkReligion
There are actually some fairly high quality translations of things nowadays. Much better than even 20 years ago. If you are still inclined to the occult, now is a better time than pretty much ever.
Science doesn't negate an interest in the arcane, esoteric, or occult. You can still find this stuff fascinating, and in fact there are practitioners who are actively involved in scientific circles simultaneously. It is not always mutually exclusive.
I’d say it goes beyond not being mutually exclusive. They complement each other, sometimes in surprising ways. Sacred geometry, concepts of frequency and vibrational rates, extracting signal from noise, if you are well versed in math and science you’ll find a lot of synchronicities. Fourier analysis dovetails with the concept of unity.
Pythagoras was what we might call an occultist. Newton was an alchemist (which isn’t about lead to gold, it’s about the transmutation of the Self), Jack Parsons was a Thelemite. Ramanujan credited his genius to visions.
Science and math can’t (yet) answer the big questions. There are things it doesn’t even try and touch. In my experience, curious minds are often interested in trying to attain a broader understanding of the universe and our place in it.
"Newton was an alchemist (which isn’t about lead to gold, it’s about the transmutation of the Self)"
Well, maybe not so much. That's kind of a 19th-20th century interpretation. We didn't want to believe that all these smart people really were into stupidity like turning lead into gold. Surely it must be much deeper than that! It must have been metaphors! But maybe not. Maybe they literally were into what they said they were into. It's not unlike how people want to claim that various religious stories weren't "really" about what they claim to be.
> We didn't want to believe that all these smart people really were into stupidity like turning lead into gold.
Alchemy was not stupid in the 17th century. You have the benefit of three centuries of subsequent scientific advances, to which geniuses like Isaac Newton, and those other smart people, contributed significantly.
Besides alchemy, Newton was deeply immersed in various occult studies. He was also a heretic, being a Unitarian, keeping his religious beliefs secret. Scientific research occupied only a part of his time. The seventeenth century was a time of religious and political turmoil, millenarianism and apocalyptic prophecy abounded. Newton was a man of his time.
Respectfully, Zosimus is one of the earliest Hellenistic writers on alchemy and he speaks of chemistry as a symbol:
“There are two sciences and two wisdoms, that of the Egyptians and that of the Hebrews, which latter is confirmed by divine justice. The science and wisdom of the most excellent dominate the one and the other. Both originate in olden times. Their origin is without a king, autonomous and immaterial; it is not concerned with material and corruptible bodies, it operates, without submitting to strange influences, supported by prayer and divine grace.
The symbol of chemistry is drawn from the creation by its adepts, who cleanse and save the divine soul bound in the elements, and who free the divine spirit from its mixture with the flesh."
On the other hand, we really can understand the chemistry that alchemists were fiddling with -- it wasn't metaphorical -- they really were messing around with chemicals and not souls. We still call some things by the names alchemists gave them, like "aqua regia" (literally "royal water"), a mixture of nitric and hydrochloric acid that can dissolve gold and platinum. And which they hoped could therefore make more of it.
That to me is one of the most interesting aspects. Somehow, these people who were deeply spiritual, also were adepts of science, and while we can’t say any of them got it exactly right, the paths intersected enough that their contributions were in some ways foundational.
Psychology and psychiatry are two other fields that traveled the path of spirituality and occultism before becoming what we now term modern.
I grow heirloom tomatoes in the United States, and they are spectacular. The tomatoes available in the grocery store pale in comparison to anything grown in your own garden.
This probably has more to do with the fact that supermarket tomatoes are picked before they fully ripen on the vine (often green). I also enjoy growing my own (Brandywine is a favorite), and letting it ripen on-vine to the point where it's probably hours away from falling off or starting to rot results in a much better tasting tomato than jumping the gun.
I have to completely disagree. Application level metrics can also be emitted at the log level, but are much easier to work with as a metric. We have total request count, average request latency per route, and per HTTP response. This is extremely useful for finding performance regressions when new code is released.
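To make concrete what "easier to work with as a metric" means, here is a toy in-process sketch (names are illustrative, not our actual code) of the shape those metrics take: pre-aggregated counts and latencies keyed by route and HTTP status, queryable directly instead of grepping logs.

```rust
use std::collections::HashMap;

/// Running totals for one (route, status) pair.
#[derive(Default)]
struct RouteStats {
    count: u64,
    total_latency_ms: u64,
}

/// A toy in-process metrics registry keyed by route and HTTP status code.
#[derive(Default)]
struct Metrics {
    per_route: HashMap<(String, u16), RouteStats>,
}

impl Metrics {
    /// Record one request's outcome.
    fn record(&mut self, route: &str, status: u16, latency_ms: u64) {
        let stats = self
            .per_route
            .entry((route.to_string(), status))
            .or_default();
        stats.count += 1;
        stats.total_latency_ms += latency_ms;
    }

    /// Average latency for a route/status, if any requests were recorded.
    fn avg_latency_ms(&self, route: &str, status: u16) -> Option<f64> {
        self.per_route
            .get(&(route.to_string(), status))
            .filter(|s| s.count > 0)
            .map(|s| s.total_latency_ms as f64 / s.count as f64)
    }
}

fn main() {
    let mut metrics = Metrics::default();
    metrics.record("/api/users", 200, 12);
    metrics.record("/api/users", 200, 18);
    metrics.record("/api/users", 500, 40);
    // A latency regression per route shows up as a shift in this number.
    assert_eq!(metrics.avg_latency_ms("/api/users", 200), Some(15.0));
}
```

A real setup would emit these as OpenTelemetry counters and histograms rather than averaging by hand, but the queryable shape is the point: route, status, count, latency, all ready for dashboards and alerts.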