Hacker News | EdNutting's comments

Also 90% of citations generated by AI are wrong or straight up don’t even exist. It’s got such a long way to go to be able to reliably write credible papers.

[Source: https://www.reddit.com/r/AskReddit/comments/o6hlry/statistic... ]


Your source is a 5-year-old AMA post that itself claims to be made up.

While funny, it does nothing to prove your assertion.


>While funny, it does nothing to prove your assertion.

Unless that citation was generated by AI.


I think you missed the point. Yes it was meant to be humorous, and also to emphasise one of the reasons AI-generated citations are completely untrustworthy, especially with the growing number of AI-generated (junk) papers being published.

No, I had no intention of trying to offer a real source for the accuracy of AI generated citations. It is not hard to Google, search HN or even (ironically) use AI to search, to find numerous relatively recent studies discussing the problem or highlighting specific cases of respected journals/conferences publishing papers with junk citations.


It feels like allowing fake citations in the AI's output means you didn't do even the barest minimum of verification (i.e. tell the AI to verify each citation by spawning a new AI to download the PDF matching that DOI and checking that it matches what the citation says).

And here we see you’ve hit upon the Jevons paradox. The scope of work will grow to use more than it did before, now that human labour achieves more for the same money. Employment will ultimately go up, not down (over the long term - we are seeing a lot of short-term instability and noise, although there’s much said about AI without it yet showing up in the data, as per articles recently shared on HN about employment figures across the US and the world).

10 years from now, the people that stopped hiring novices and juniors are going to be deeply regretting their past decisions. The people that kept hiring are going to be working with their newly-promoted-to-senior colleagues and be making significantly more progress than those that didn’t keep hiring.

(IBM figured this out a couple of months ago, and explicitly announced tripling their hiring of juniors/grads in order to avoid ending up with a massive gap in the management/senior layers in future).

Link? Source?

https://news.ycombinator.com/item?id=47009327

> “The companies three to five years from now that are going to be the most successful are those companies that doubled down on entry-level hiring in this environment,” Nickle LaMoreaux, IBM’s chief human resources officer, said this week.


Will they? Just because you developed them that doesn't guarantee they will stay with you. It's been always the same issue tbh, but big companies could accept the risk because they pay the most competitive salaries anyways.

Ten years from now the people making these decisions have moved on to different companies and cashed their quarterly bonuses.

Except they won't. They will just hire those new people away from the firms that trained them. That's what happens now and there's no reason why it won't happen in the future.

This is why firms that do actual training put clauses in the employment contract saying that if you receive x months of training from them, you must work for them for at least y years; otherwise, if you leave, you have to repay the cost of training you (written as a dollar amount in the contract).

Companies that don't have that kind of clause in the contract are going to get screwed over when their newly trained employees get poached by other firms.


Yes and no.

I started my career with a graduate program at a larger company. I stuck around in that company for close to 5 years and would have liked to stay longer. My reason for leaving was the absence of career progression. For the first 3 years, the company had a great career progression path: clear outlines of what was needed for a promotion, fair and transparent pay, etc.

That changed and despite hitting/exceeding my goals, I was denied a promotion twice with no good reason. My boss, who is fantastic, told me that he cannot give me a good reason because he himself did not receive one. So I left.

Generally speaking, my cohort of the program stayed with the company much longer than most employees. I don't think a single person left in the first 3 years. Attrition only started once there was a general shift in the company's culture and communication.


That dynamic is nothing new. Years of experience to become a senior engineer is not “training” and not covered by what you’re describing.

The shortage of senior engineers will be even worse than it is today.

Not sure your argument really holds any water over a 10+ year period as I originally described.


It honestly seems a little control freakish to think this way. People leave companies and that’s a good thing, they explore the industry and generally become more capable. If you leave on good terms there’s nothing holding back a renewed relationship, now with the added benefit of new perspectives; maybe meeting at conferences or working on a project. My gut is telling me these companies don’t part on good terms with their employees.

Am I being dumb: they say it's "open-source software" but I can't actually find a link (or links) to the software / source anywhere on the office.eu website??

Yes, I searched for the same. No evidence this has anything to do with the European Union. More like a vibe-coded landing page with user signup form.

Edit: I am certain this is one or two people vibe coding then will pitch to VCs when the waitlist has 1000 people.

Listing major company logos in their banner: “The organizations listed here use similar technology (Nextcloud) as part of their operations. Their inclusion is for illustrative purposes only.”



Oh, there's some reference to NextCloud - so is this just a white-labeled NextCloud? Or a straight-up AI generated rip-off / resell?

>"Office EU is a European productivity suite for files, email, calendars, documents and calls, built on Nextcloud Hub. It brings Files, Talk, Groupware and Office together in one platform."

https://office.eu/faq

Of which, Files, Talk, Office and Groupware are all just NextCloud services where they've swapped "Nextcloud" for "EU" in the name.

>"Office.EU is a service offered and operated by EUfforic Europe BV, registered with the Dutch chamber of commerce under registration number 98746243 and having its address at Dr. Kuyperstraat 10-A at (2514 BB) The Hague, the Netherlands."

I wouldn't personally trust a company that appears to be claiming another company's services as some revolutionary new thing, when it's just reselling them. And it was registered in November 2025 with no other information available - why would anyone gamble all their company data on a company that has appeared as quickly as it might disappear? Who are the owners/founders even?

Anyway, this was a waste of time.


Nextcloud does not provide hosting, only 3rd level support. So any commercial hosting of Nextcloud will be done by other companies. There are many companies to choose from.

https://nextcloud.com/partners/

Personally, I would only choose companies that are listed as a partner because then I can see what level of support they buy from Nextcloud.


It says that it's based on Nextcloud, which is AGPL, so they better F:ing cough up the source code :-)

"Sir", not "Mr." if you're going to be pedantic about titles ;)

Edit: Oh and he has multiple honorary doctorates (at least 6!), so would be just as much "Dr." too!


Lol you are totally right! ;-)

I am normally a casual guy but for a giant being a bit more formal (pun intended) seems appropriate. Or maybe I am a nerd through and through :-)


It is not usual to call people with an honorary doctorate "Doctor" except in the context of the awarding institution. Most likely the awarding institutions will have actually specified that the recipient should not give anybody the false impression and I can't imagine Tony is the type to do otherwise.

His title at Oxford was 'Professor', and he was addressed as 'Tony'.

He made incoming DPhil (PhD) students a cup of tea individually in his office at the Computing Laboratory. It was a small group, but still I appreciated this personal touch.


I never met Tony, but I liked his work. I'm not much of a one for tea, but I don't think either of my PhD supervisors ever bought me a drink - I didn't finish (got cancer, I'm fine now†, some cancers are very curable, but frankly I was struggling anyway so it was a good excuse to quit) and I'm sure it's traditional to buy something a bit harder than a cup of tea if you pass, but I didn't get that far.

Anyway my point here was just a PSA that honorary degrees "don't count". If somebody only has an honorary doctorate but insists on being called "Doctor" they're an asshole. In fact, even outside University I know a lot of MDs and PhDs and in most contexts if they insist on the title "Doctor" they're an asshole even though they're entitled.

† Well not fine, I'm old but I think that's an inevitable side effect of surviving so the alternative was worse.


There's having An honorary degree... and then there's having 6 of them plus numerous other awards, and all the achievements to back them up :)

Regardless, I've met people with only honorary doctorates, and it's a mixed bag when it comes to preferred titles. Often, though, the ones that really care soon acquire a 'superior' title anyway, so it ends up becoming a moot point.


You’re right. And ‘Professor’ comes and goes with the job, independent of degrees held.

And, of course, the Go programming language.

I would not say he invented Go, although Go is probably the only relevant implementation of CSP nowadays.

I was adding Go to the list at the very end of the comment:

>OpenMP also inherits many of those concepts, and some of them are also in CUDA.


This post appears to have been hidden from the front page of HN?

Yes, it was submitted before the news had been confirmed. More here: https://news.ycombinator.com/item?id=47327440.

“Inspired by” is an understatement of the century lol. David May and Sir Tony worked very closely together to enable the architecture to be as pure a runtime for CSP as you could get - at least in early versions of the architecture and accompanying Occam language. It expanded and deviated a bit later on iirc.

Source: David loved to tell some of these stories to us as students at Bristol.


It’s also worth highlighting that the mathematical purity of the designs was partly the problem with them. As a field, we’re still developing the maths of Effects and Effectful Algebras needed to make these systems both mathematically ‘pure’ (or at least sound within some boundary) and ALSO capable of interfacing with the real world.

Transputer and Occam were, in this sense, too early. A rebuild now combining more recent developments from Effect Algebras would be very interesting technically. (Commercially there are all sorts of barriers).


Further Reading for the curious:

On specifically the relationship between Occam and Transputer architecture: http://people.cs.bris.ac.uk/~dave/transputer1984.pdf

Wider reading: http://people.cs.bris.ac.uk/~dave


Inmos’ Occam-based verification of their FPU in collaboration with researchers at Bristol and Oxford iirc? Citation: http://people.cs.bris.ac.uk/~dave/formalmethods.pdf

David May was my PhD supervisor and always spoke very highly of Sir Tony Hoare.

Edit: I’m also lucky enough to have worked with Geoff Barrett, the guy that completed that formal verification (and went on to do numerous other interesting things). Some people may be interested to learn that this work was the very first formal verification of an FPU - and the famous Intel FPU bug could have been avoided had Intel been using the verification methods that the Inmos and University teams pioneered.


I actually had two PhD advisors [1]; Jim Woodcock and Simon Foster.

Both of them are legitimately wonderful and intelligent humans that I can only use positive adjectives to describe, but the one I was referring to in this was Jim Woodcock [2]. He had many, many nice things to say about Tony Hoare.

[1] Just so I'm not misleading people, I didn't finish my PhD. No fault at all of the advisor or the school.

[2] https://en.wikipedia.org/wiki/Jim_Woodcock


I remember Jim Woodcock as really inspirational - he was working with my PhD supervisor in 1987. We were working on a variant of Z for specifying what, today, we would call CRDTs. I was also lucky enough to meet Tony Hoare the same year and discuss those concepts.

Jim is an amazing guy. One of the rare people who are absolutely brilliant in their respective field, and are equally good at teaching the subject. He's also just a really kind, nice person who is delightful to chat with, though that's true of pretty much anyone in York [1].

I also think his book "Software Engineering Mathematics" [2] is an extremely approachable book for any engineer who wants to learn a bit more theory.

As I said, my dropped PhD is not a failure in any capacity from my advisors or the school, mostly just life juggling stuff.

[1] I don't know why exactly, but of all the places I've been, York has the highest percentage of "genuinely nice" people. It's one of my favorite spots in the UK as a result.

[2] https://a.co/d/02M25LcY, not a referral link.


Yes, they contributed to open source - this is a good thing.

But personally, I took issue with the tone of the blog post, characterised by this opening framing:

>For many years we had to rely on our own internally developed fork of FFmpeg to provide features that have only recently been added to FFmpeg

Could they not have upstreamed those features in the first place? They didn't integrate with upstream and now they're trying to spin this whole thing as a positive? It doesn't seem to acknowledge that they could've done better (e.g. the mantra of 'upstream early; upstream often').

The attempt to spin it ("bringing benefits to Meta, the wider industry, and people who use our products") just felt tone-deaf. The people reading this post are engineers - I don't like it when marketing fluff gets shoe-horned into a technical blog post, especially when it's trying to put lipstick on a story that is a mix of good and not so good things.

So yeah, you're right, they've contributed to OSS, which is good. But the communication of that contribution could have been different.


> e.g. the mantra of 'upstream early; upstream often'

This is the gold standard, sure. In practice, you end up maintaining a branch simply because upstream isn't merging your changes on your timescale, or because you don't quite match their design — this is completely reasonable on both sides, because they have different priorities.


> Could they not have upstreamed those features in the first place?

Hard to say without being there, but in my experience it's very easy to end up in "we'll just patch this thing quickly for this use case" to applying a bunch of hacks in various places and then ending up with an out of sync fork. As a developer I've been there many times.

It's a big step to go from patching one specific company internal use case to contributing a feature that works for every user of ffmpeg and will be accepted upstream.


I've also had that experience of patching an OSS project internally, with the best intention of upstreaming externally-useful improvements in the future (when allowed).

However, my interpretation of the article was that they did a lot more than just patching pieces. They, perhaps, could have taken a much earlier opportunity to work with the core maintainers of ffmpeg to help define its direction and integrate improvements, rather than having to assist a significant overhaul now (years later).


Getting something accepted upstream is orders of magnitude harder than patching it internally.

The typical situation is that you need to write a proof of concept internally and get it deployed fast. Then you can iterate on it and improve it through real world use. Once it matures you can start working on aligning with upstream, which may take a lot of effort if upstream has different ideas about how it should be designed.

I’ve also had cases where upstream decided that the feature was good but they didn’t want it. If it doesn’t overlap with what the maintainers want for the project then you can’t force them to take it.

Upstreaming is a good goal to aim toward but it can’t be a default assumption.


> Could they not have upstreamed those features in the first place?

This can be harder than you think. Some time ago I worked a $BIGCORP and internally we used an open source library with some modifications to allow it to fit better into our architecture. In order to get things upstreamed we had to become official contributors AND lobby to get everyone involved to see the usefulness of what we were trying to do. This took a lot of back-and-forth and rethinking the design to make it less specific to OUR needs and more generally applicable to everyone. It's a process. I'm not surprised that Facebook's initial approach would be an internal fork instead of trying to play the political games necessary to get everything upstreamed right off the bat. That's exactly the situation we were in, so I get it.


I guess it is much more common to maintain internal patches than to do all the work of merging into upstream, especially when the feature is non-trivial. Merging upstream consumes more time both externally and internally, and many developers are working to aggressive timelines. I don't think it is fair to criticize them for not doing the ideal thing from the beginning.

> Could they not have upstreamed those features in the first place?

Often when you are working on a downstream code base either you are inheriting the laziness of non-upstreaming of others or you are dealing with an upstream code base that’s really opinionated and doesn’t want many of your teams patches. It can vary, and I definitely empathize.


I find it hard to be too upset, better late than never. Would it have been better to upstream shortly after they wrote the code? Yes. Would it have been better if they also made a sizable contribution to fmmpeg? Yes. But at the end of the day they did contribute back valuable code and that is worth celebrating even if it was done purely because of the benefit to them. Let's hope that this is a small step and they do even more in the future.

As I said, the contribution is good, it's the communication via this blog post that I don't entirely like. It could have been different. It could have acknowledged better ways of engaging with ffmpeg (that would've benefitted both Meta and ffmpeg/the community, not _just_ ffmpeg).

But corporate blog posts often go this way. I'm not mad at them or anything. Just a mild dislike ;)


Yeah, I see what you mean. It basically shows that they contributed to ffmpeg purely because it helped them, but then they wrote this post to get good will for that contribution.

:thumbs-up:

I'm glad to know that outcomes are affected by having pure intentions. /s

I’ll take it. Meta's purpose isn't to help the community, it's to make money. Sucks to hear that out loud, but that is how capitalism works.

But you can use that to steer Meta. Explain how doing x (which also helps the community) makes them more money.


>For many years we had to rely on our own internally developed fork of FFmpeg to provide features that have only recently been added to FFmpeg

I really wonder if they couldn't have run the fork as an open source project. They present their options as binary when in fact they had many options from the get-go. They could have run the fork in an open-source fashion so that FFmpeg's developers could see their work and understand the features they were building.

Keeping everything closed source and then contributing back X years later feels a little disingenuous.

