For the Canadians sitting at home, tut tutting more American foolishness that could never happen up here... Flock started their expansion into Ontario this very month[1].
"There is a final irony that deserves attention. If the doomers truly hold their stated beliefs at their stated confidence levels, they should be more honest about what those beliefs imply. A few weeks before the attack, a journalist asked Yudkowsky: if AI is so dangerous, why aren't you attacking data centers? His answer, relayed by Soares: "If you saw a headline saying I'd done that, would you say, 'wow, AI has been stopped, we're safe'? If not, you already know it wouldn't be effective."
----------
There are several thousand AI data centres in the U.S. alone, and hundreds are over a thousand square meters in floor space. Think about the physical effort it would take to reliably destroy, beyond the possibility of repair, just one typical computer in your home. Now multiply that out to thousands of server racks. Even if the employees rolled out the red carpet for you and handed you a baseball bat, you wouldn't get very far. Next, consider that these data centres are popping up all over the world in the most unlikely and remote locations. They don't need workers. They just need power, water, and, preferably, lax tax and environmental standards.
Doomers are attacking billionaires because they perceive them to be the soft, meaty weak points of a gigantic inhuman machine. They believe that just scaring Sam Altman a little will have a huge impact compared to trying to attack a data centre. However, billionaires can afford pretty decent security. This doomer movement probably isn't going to accomplish much until they target the engineers and support staff that surround billionaires. Billionaires don't scare easily because they have so much protection, but the poorly paid and poorly secured people around them are another story.
Poorly secured means easy to coerce with a stick. Poorly paid means easy to coerce with a carrot. The threat doomers pose is relatively small until they start turning employees against their own companies. What's an activist with a baseball bat compared to an employee who knows how to disable every computer in multiple data centres simultaneously?
The solution to this kind of problem is standards.
For most of the history of computation, things were moving too fast for anyone to really worry about standardization. Computing environments were also somewhat Balkanized. Standard keyboard shortcuts, for just one example, weren't standard. They still aren't. e.g. If your fingers are accustomed to hitting Ctrl-C to copy on most computers, they'll hit Fn-C on an Apple keyboard, which isn't Copy.
Today, things are moving slower and web interfaces have largely taken over. Your choice of OS mostly just affects how you get into a browser or some other cross-platform program... and what keys you hit for Copy and Paste.
Now would be a reasonable point in the history of computation for us to seriously consider standards. I'm not talking about licenses, inspectors, and litigation if you get it wrong. I'm just talking about some organization publishing standards that say, "This is how you build a standard login form. These are the features it should have. This is how they should be laid out. These are the icons to use or not use. These are the keyboard shortcuts that should be implemented." The idea is that people who sit down and start building a common bit of interface, instead of picking and choosing others to copy, should have a clear and simple set of standards to follow.
And yes, Apple needs to fix their #$%@ing keyboards.
Isn't the Macintosh desktop (with Cmd as the modifier for standard shortcuts) older than Windows and Linux desktops? So historically, it's not Apple that deviated but the others?
(I did not do an extensive search into this, so there might be Ctrl-based standard shortcuts that predate Apple.)
Apple moved the Ctrl key around at least a couple times. On the Apple II it was next to the A key, the same as it was on the Xerox Star. The CMD key was a later addition.
At this point, I'd say let history be history. It'd be better to standardize on what most people are using.
When a regime starts killing thousands of its own people, it's a sign of weakness, not strength. Iran's theocracy was teetering above the abyss before the U.S. started bombing them.
Now, they're probably good to go for a couple more decades. Trump is precisely the kind of threat Iranians have been warned about since the revolution. When a regime spends almost half a century preparing for something and it finally happens, it earns them considerable forgiveness. Also, nothing unites people quite like a foreign threat, especially one dumb enough to bomb schoolgirls in its opening salvo.
By scuttling the JCPOA for no apparent reason and now invading Iran right when it appeared the regime was crumbling, Trump has single-handedly reinvigorated Iran's theocracy and given them the public support they need for the final push towards nuclear weapons. That's what's so sickening about this invasion. It has acted in diametric opposition to the policy goals it was purportedly pursuing.
“The bad news is that we have not reached an agreement, and I think that’s bad news for Iran much more than it’s bad news for the United States of America,” Vance said.
“So we go back to the United States having not come to an agreement. We’ve made very clear what our red lines are.”
It was clear the U.S. was not serious about these negotiations when they sent Vance. It's also clear the U.S. doesn't have the cards to end this conflict by force. They can use drones to clear the Strait of Hormuz of mines, but that won't address all the other methods Iran has to threaten shipping. Any military measure short of the full occupation of Iran will likely fail to reopen the strait. The U.S. plainly lacks the resources to occupy a country four times the size of Iraq without allies, and the Iranians know it. The U.S. is going to have to bend on some of its red lines and actually negotiate in order to reach a deal.
Many countries are standing back and waiting for the Americans to fix their own mess, but for how long will they wait? At what point do these nations lose patience with the constant economic disruption and look for coercive measures to force the U.S. back to the table?
Iran demanded that they would not speak to Witkoff or Kushner, who were the original morons in this fiasco. They wanted only Vance at the table, most likely because he was against this war and has kept himself away from the whole thing.
> They can use drones to clear the Strait of Hormuz of mines, but that won't address all the other methods Iran has to threaten shipping
Iran does not have to even mine or bomb the strait. Them just declaring that they will hit is enough to stop traffic.
> Any military measure short of the full occupation of Iran will likely fail to reopen the strait
I highly doubt even this. Iranian drones have a range of about 1000 km. They can continue to block the strait, even against a ground force. Not to mention that ground forces blitzing through the whole territory would take at least a year, if not more. That is enough time to plunge the whole world into a recession.
> At what point do these nations lose patience with the constant economic disruption and look for coercive measures to force the U.S. back to the table?
Most nations cannot coerce the US, at least not Trump. What they will most likely do is have secret or open deals with Iran to let their oil through, with a toll tax of course.
Marco Rubio was unanimously confirmed by the Senate with a pretty explicit expectation that he would be the adult in the room for this kind of crisis. But he was too busy watching UFC with the President to attend or even monitor the negotiations.
Crypto-miners are switching to AI token farming when Bitcoin is low. They have compute that's both installed and powered, so why not do what pays better?
Training ASICs (like Google’s TPUs) can generally run inference too, since inference is a subset of training computations. TPUs are widely used for both.
Mining ASICs (Bitcoin, etc.) cannot be repurposed; they're hardwired for a single hash algorithm and lack the matrix math needed for neural networks.
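The gap between the two workloads is easy to see in software. Here's a toy Python sketch (not how any ASIC actually works internally; the function names are made up for illustration): a mining chip hardwires exactly one fixed-function computation, Bitcoin's double SHA-256, while the core of both neural-net training and inference is general matrix multiplication.

```python
import hashlib

# Mining workload: a fixed double SHA-256 over a candidate block header.
# A Bitcoin mining ASIC hardwires this one computation and nothing else,
# so it cannot be repointed at a different kind of math.
def mining_hash(header: bytes, nonce: int) -> bytes:
    data = header + nonce.to_bytes(4, "little")
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Neural-net workload: matrix multiplication, the shared core of both
# training and inference (pure Python here purely for illustration;
# real accelerators do this in hardware).
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]
```

A training/inference ASIC like a TPU is built around the second operation, which is why it can serve both roles; the first operation shares nothing with it.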
There are plenty of laptops out there that have square edges on the user-facing edge. However, most are tapered and/or have hinge designs that tilt the laptop surface towards the user, dropping that square edge away from the user's wrists.
Most Apple laptops, such as the latest Pros, are level rather than tapered, and sit flat so that the user-facing edge cuts into your wrists. It's bad ergonomics, plain and simple. If you value function over form enough to modify your tools in this way, choose better tools.
Choosing tools is not easy. Last time I bought a laptop was 2014. My goal was running Linux. My other requirements were, in order of importance, without explanation:
3 physical buttons below the touchpad. That ruled out a great many laptops. Today the remaining options would be nearly zero, if not actually zero.
15-inch screen. Common.
Matte finish. Common.
User serviceable hardware. That removed many other laptops.
No number pad. I had to give up on that or I would have no laptop to buy.
"The third factor was the deaths of 25 chimpanzees, including four adult males and 10 adult females, as a result of a respiratory epidemic, in 2017, a year before the final separation. One of the adult males who died was "among the last individuals to connect the groups", the research paper said."
-------------
There's a theory that humans (and likely chimps as well) have a cognitive upper limit to the number of stable relationships they can maintain (i.e. Dunbar's number[1]). Also, there is the idea that most people have nowhere near that many relationships, but some people are super connectors. They know everyone in the community and tie it together, even if the average member of the community doesn't know most other people in it.
It almost sounds like, before the conflict, the tribe was at or a little beyond their "Dunbar's number"[1] and then several of their super-connectors died. Suddenly the community, despite its losses, was too big and not connected enough to remain stable. Minor conflicts arose, individuals started choosing sides, and there wasn't anyone with connections to both sides able to bridge the gap and calm things down.
I'm not a sociologist/anthropologist/etc., so I'm probably woefully misinformed and spewing nonsense here. I'd love to hear what someone up to date on this stuff thinks actually happened.
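The super-connector argument above is really a claim about graph connectivity, and it can be sketched in a few lines of Python (a toy model, not data about the actual chimp troop; the group sizes and hub names H1/H2 are made up): two tight subgroups that only know each other through a couple of hub individuals stay one community until the hubs are removed.

```python
from collections import defaultdict, deque

def components(edges, nodes):
    """Count connected components of an undirected graph via BFS."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, count = set(), 0
    for start in nodes:
        if start in seen:
            continue
        count += 1
        queue = deque([start])
        seen.add(start)
        while queue:
            for nxt in adj[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return count

# Two cliques of "ordinary" members who mostly know their own subgroup...
group_a = [f"A{i}" for i in range(5)]
group_b = [f"B{i}" for i in range(5)]
edges = [(x, y) for g in (group_a, group_b) for x in g for y in g if x < y]

# ...tied together only through two super-connectors who know everyone.
edges += [(h, m) for h in ("H1", "H2") for m in group_a + group_b]

everyone = group_a + group_b + ["H1", "H2"]
print(components(edges, everyone))  # one connected community

# Remove the super-connectors (e.g. deaths in an epidemic):
survivors = group_a + group_b
surviving_edges = [(a, b) for a, b in edges
                   if a in survivors and b in survivors]
print(components(surviving_edges, survivors))  # splits into two groups
```

In this toy model the community goes from one component to two the moment the hubs disappear, which is the "suddenly too big and not connected enough" dynamic described above.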
In the recent Mad Max films, Miller used CG for compositing, but insisted that all the action be real. There are no CG people jumping bikes over 16-wheelers. CG was only used to get rid of safety equipment, change the sky, etc. The results feel viscerally real.
Guitar dude's exploding rig was definitely CG. Don't kid yourself that it was limited to what you stated. Yes, the stunts were real humans, but it also had CG elements.
I'm talking about the end for the flamethrower guy, when the rig wrecks. There's a bunch of debris that flies around, including a steering wheel that comes at the camera spinning, timed exactly so that its center wipes the frame. That sequence has lots of CG.
This has tended to be significantly overblown recently with a huge amount of 'no CGI' advertising coming from studios, which often verges into utter BS. There's an incredible amount of CG at every level of modern productions, regardless of how much stuntwork and practical effects were done as well. (this video series has a good breakdown on it, which has included studios releasing doctored 'behind the scenes' footage! https://www.youtube.com/watch?v=7ttG90raCNo ).
That's not to say that doing these things is pointless or unimpressive, but it's often used to denigrate and minimize the work of a lot of already quite underappreciated artists.
"Many respondents did acknowledge that A.I. might make them more efficient in school and the workplace, he said. But they were concerned about how the technology would affect their creativity and critical thinking skills."
-----------------
Perhaps schools need to adapt to AI use and recenter the goals of education in the minds of students. If AI use impairs your development, you are only being efficient in your evasion of education.
i.e. Students need to be taught that learning to efficiently pump out AI written essays isn't the same thing as learning to reason and express themselves. AI tools will evolve and become easier and easier to pick up and use. Using your own mind is a slower and more difficult skill to develop, but it makes the difference between going through life as a human being or a mere meat-puppet for AI. It will always be far easier for a human to pick up AI tools and learn them from scratch than it will for a meat-puppet to remedy their lack of human development.
Under-resourced instructors just need to come up with new pedagogies to handle revolutionary new tools that change extremely rapidly and which also provide an extremely effective way for students to cheat.
Probably, but how do you adapt to something that changes faster than semesters? Revising your theory of learning, implementing it, evaluating the results, etc. takes years, not weeks.
The current situation is that many students don't perceive that using AI to produce, for example, essays is harmful to themselves, and students who do things honestly may feel pressure to use AI in order to stay competitive with students who do.
The answer may be to focus less on output and more on the process. e.g. Instead of sending students off to do essays at home and then merely grading what gets handed in, perhaps teachers should run workshops where students work on their essays while receiving guidance. i.e. Everybody works in the classroom on their essay and talks to each other and the teacher about what they're doing. Grades would be at least partly based on participation, and teachers would get a better sense of what students are actually able to write themselves. If Johnny sits back and picks his nose in the workshop and then hands in a paper that's suspiciously good, it's probably slop even if it isn't obviously so.
Of course, doing this sort of thing would mean taking time away from lectures and rote learning. Finding the right balance is no easy task, and it's going to take good teachers to blaze the way. That can only happen if they're backed with resources and the freedom to alter the curriculum.
> If Johnny sits back and picks his nose in the workshop and then hands in a paper that's suspiciously good, it's probably slop even if it isn't obviously so.
This is incredibly out of touch. No teacher or even school administrator needs to have that said to them. Students refuse to hear it (despite the bleating of the article). Who are you talking to then? Parents? That's rich.
We should probably oppose this.
_________
[1]https://www.theguardian.com/technology/2026/apr/07/toronto-r...