Hacker News | cfr2023's comments

I heard David Letterman say within the past decade that on the days he did not feel the show was very good, he couldn't bring himself to even leave the building until it was dark outside, out of shame.

Meanwhile, at home, I would practically be shaking with excitement waiting for the show to start, I so appreciated it.

This tale somewhat suggests that people who perceive themselves as incompetent and inauthentic might spend more of their time striving at work, which could raise the bar, maintain a high standard and eventually breed something resembling confidence.

Or it could just continuously undermine their natural confidence and sense of self-worth and debase them such that they are easy to overwork and manipulate. It can also just feed into fears that invalidate the satisfaction of any jobs well done, leading to burnout and feelings of futility.

People who are overly confident can behave brashly and do damage, while automatically imposing costs on others, in the form of the time it takes to crack through their false beliefs or the duplication of effort it takes to walk back their mistakes.

So, this is kind of a nothing post, basically a lament. It's not clear whether sufferers of Impostor Syndrome or Dunning-Kruger type symptoms have an easier path to a more moderate position, but each one seems likely to be rampant in just about any workplace.


I started following the RED story before those folks ever released a camera, and I liked their spirit and mission.

Some time passed and ultimately it was Blackmagic Design that accomplished what RED said they wanted to do.

If you say you want to make high end cinema technology, or even just high quality imagery, accessible to the average person, a $17,500 price tag for just the camera body shows that you might have strange ideas about what constitutes an average person.


I think if you know the RED story you know that at the time there were effectively 0 consumer-tier high end digital cameras. We're talking basically the advent of the DSLR revolution, where either you shot on a Canon 5D MK II or... an Alexa? Alexas retail around $50,000 (and weren't out until 2010), so RED offering an actual 4K digital video camera with an easy conversion to EF mount glass (Canon) and a body literally half the size of an Alexa, AND consumer-purchasable at $17,500 (the Alexa purchase process isn't "just buy on B&H") - it was huge.

The other thing is that the camera market and the concept of "consumer" isn't really like normal "consumer" end stuff. High-end digital camera "consumer" stuff has different purchase cycles that traditional "consumer" things like iPhones don't have. Camera Operators/DPs typically buy these huge cameras and then rent them out or bill their cost back into their day rate.

When RED says consumer, they mean that any person with money can buy one. Alexas, Panavision Cameras, Sony F65s, etc. all usually need to be bought by a cinema rental house and then are rented to operators. RED went around that and allowed people to buy cinema-tier cameras directly, which was huge. The market has adjusted since then and I think Blackmagic (and the Sony Alpha line) now more directly serve traditional definitions of "Consumer", but IMO none of that would have happened without RED paving the path.


Yo there were way more video cameras back then than just the Canon 5D and ARRI. News organizations, smaller productions and documentary makers were not just whipping around expensive ARRIs. Sony and Panasonic made a ton of other professional video-focused cameras. I have an old Panasonic HVX200 right next to me.

But the Red One was definitely still extraordinary because they managed to make a relatively cheap production 4K camera in 2007.

That said, the impact was muted because people didn’t really care about 4K as much in 2007. I don’t think ARRI even released a 4K camera until years later.


ENG cameras had sensors a fraction of the size of Red/5D [1] so the resultant crop factor and CCD sensors paired with not-great glass choices (mostly zoom lenses, fixed focal length rare, etc.) made nice DoF/Bokeh hard to come by, making the output of them not really look like what a lay person would say "cinema" looks like. If you want to talk real Cinema-DV options from around that time (and ignoring punk cinema stuff like Dogme 95/Harmony Korine), you're mostly talking about the Viper, which again is a six-figure camera.
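As a rough back-of-the-envelope sketch of why small ENG sensors struggle to produce shallow "cinema" depth of field: for equivalent framing, both focal length and f-number scale by the crop factor. The sensor diagonal and lens numbers below are illustrative assumptions, not specs for any particular camera.

```python
# Full-frame (36x24mm) diagonal, in millimeters.
FULL_FRAME_DIAG_MM = 43.3
# Roughly a 1/3"-class ENG video sensor diagonal (assumed for illustration).
eng_diag_mm = 6.0

crop = FULL_FRAME_DIAG_MM / eng_diag_mm  # crop factor ~7.2

# A hypothetical 10mm f/1.8 lens on that small sensor...
lens_mm, f_number = 10, 1.8

# ...frames like a ~72mm lens, but renders depth of field like ~f/13
# on full frame: deep focus, very little bokeh.
eq_focal = lens_mm * crop
eq_aperture = f_number * crop
print(f"~{eq_focal:.0f}mm f/{eq_aperture:.0f} full-frame equivalent")
```

The larger 5D/RED sensors keep that multiplier near 1, which is why their fast primes could deliver the shallow-focus look straight away.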

> The impact was muted because people didn’t really care about 4K as much in 2007.

This is just not true. Look at the Wikipedia article here [2] and then also just look for movies shot on the One/MX/Epic. The camera was immediately adopted for Hollywood feature productions.

[1] https://therobbcollections.blogspot.com/2019/09/digital-came... [2] https://en.wikipedia.org/wiki/Red_Digital_Cinema#Red_One


The role of ENG video is not precisely the same thing as digital cinema, though there's some overlap. I remember a lot of indie filmmakers in that era struggling to shoot for cinema with ENG-focused cameras and gear, and often struggling to get what they wanted out of it.


They’re not but there were ENG-style cameras with interchangeable lenses and 422 output too. There were a lot of options between DSLR and an ARRI.

Still are.


An insightful post indeed.

I was acutely aware of it, shooting projects on horrible looking mini DV and expensive film stock.

No question they spurred progress, I'd just envisioned that they would continue to carry the torch with all of their piss and vinegar.

Now Blackmagic Design produces ~$2000 cameras that produce consistently better images than RED to this day.


The thing you'll hear from any camera pro though is that the actual shooting experience of Blackmagic isn't great. Making Blackmagics behave for film is its own cottage industry and there's a reason most people use stalwarts like Arri/Panavision/Sony/Red. If you're an entry level videographer, I'd much more recommend going the C300 route than being enticed by the BM price tag, as there are a lot of other hidden costs to just make things Work Well.


The most affordable kinds of cameras at the time that you could realistically use for something going to theatrical release was the (1080p) Sony F900 and then F950, which were in the $250K ballpark… Then the Arri D-21 came out, I can’t even remember what price but same ballpark, it was a bit higher than 2K res. $17,500 for 4K was wild, and it was insane they actually managed to deliver it with the RED ONE.


All good points, though following these acts of instigation for the industry, their competitors overtook them in the path to their goal.


>If you say you want to make high end cinema technology, or even just high quality imagery, accessible to the average person, a $17,500 price tag for just the camera body shows that you might have strange ideas about what constitutes an average person.

To be fair, their competition at the time was $200,000+ Panavision rigs that were completely prohibitive to independent filmmakers.


A fair consideration indeed, but a very relativistic use of terminology. $17,500 is not $200,000, that's for sure, but it's also not $5000 or $2500 or $100 (not that I expect $100 cine cameras).

My only point is that their hearts were in the right place, but they may have ultimately done their best work as instigators.

As well, despite my appreciation for their company, I never liked the images from their cameras.


It’s not a hobbyist’s impulse buy, but it puts a week’s rental at a couple grand - that’s plausibly a group of upper middle class teenagers. It’s also something the equipment lending library in a media studies department can make available to student projects.

Big step up from shooting on iPhones or hacked DSLR bodies, for a relatively small (in the universe of film production) increment in budget.


Very fair points. I'm just nit picking about RED's instigation and influence being more valuable to camera industry than the cameras they delivered and the price points they delivered them at.


Z-Cam has full frame 8K cameras for $6000 and full frame 6K cameras for $3000 which is sort of affordable.


Not sure of the tone of this post, does the idea of a bunch of enthusiasts having fun with old computer hardware actually upset you? If so, do you feel similarly about hobbyists that tinker with things like old cars, old clocks, old musical instruments...


> does the idea of a bunch of enthusiasts having fun with old computer hardware actually upset you?

You said "hardware." I specifically gave thumbs-up to hardware. Did you not read that?

As for reinventing basic networking: no, it's not upsetting, but it IS self-limiting. The networks of the future will probably require brand-new thinking, not resurrected old stuff.


I read it all right but it is illogical ranting. You’re cool with people tinkering with hardware but any software work is out of line?

Writing software is a use for computer hardware, and people with minds broader than yours do it for pure enjoyment and exploration. The attitude expressed in your post is self-limiting.


Inappropriate generalizing. What are you, the logic police? Let it go.


What are you, the hobby police?


That's right. My hobby is keeping people like you busy. See how well it works?


Yes, I am proud to do the public service of helping troubled people see they are making senseless arguments about what other people do for fun.


I will add that you are entirely projecting your goals and values onto this project in a way that makes absolutely no sense.

The post literally ends with:

"But most importantly, have fun!"

You think a bunch of people who are on record as setting out to have fun are "limiting themselves" by not inventing the future of networking.

In doing so, you've invented the future of bad posting.


The trend of touch screens replacing physical controls in domains where muscle memory is an advantage is an utter atrocity.

No amount of interface versatility/flexibility can come close to touching the utility of not having to take your eyes off your subject. NONE.


Not to mention that on cars like Tesla, UI updates will change the location of these buttons.

This drives me INSANE. The other one: some of the Tesla UIs feel like they were made with "minimalism" in mind. For instance the rear defroster vs the windshield defroster. I still, 3 years into my model Y, have no idea which is which, and every time I need to defrost the front windshield it's like a fight against the HVAC system and buttons and touchscreens to make it do anything.

I love my tesla, probably to the point of being annoying, but I **HATE** the ridiculous "minimalist" UI stuff, and I absolutely hate it when they push a UI update which moves things around.


How is it possible to love a car that does things you hate when there are other cars at the same price that do everything the object of your love does, and don't do the things you hate?

It sounds like an irrational love for the car.


It is legally required for all complaints about Tesla to start with "I love my car, but..."

(This is so near-universal that I vaguely suspect that there's a non-disparagement agreement you have to sign when you buy one).


>other cars at the same price that does everything the object of your love does

Because there aren't? My tesla has FSD which I use for the majority of my driving, it looks cool, it's really fast, I really like the in dash display (just don't like UI updates, and some very specific parts of the HVAC controls).

This is such a funny question to me. Do you love your city? Is there *nothing* you dislike about it?


Which other car in the similar price range can fart on demand from the mobile app?


What's your list of other cars that have the same charging network access and self-driving capabilities?


> What's your list of other cars that have the same charging network access and self-driving capabilities?

Well, the list of other cars that will kill you if you take your hands off the wheel is ... just about anything, right?

And then you're dead, so you have no use for a charging network anyway.

So, basically, for just that one feature, you can use just about any other car.

The list is, essentially, everything else!


And yet we drove those cars for 100 years, with our hands on the effin wheel.


Somebody read too much CNBC and/or NYT.


What if I don't take my hands off?


In parts of Europe the answer to the first is "all of them".

(And yes that works. I recently finished a 2k+ road trip in Germany and had no issues at all. Plug and charge worked flawlessly on every DC fast charger I visited. AC charging worked by swiping my RFID card).


Why do you love a car with anti-features that annoy you so much? I am genuinely curious, as I have only driven a Tesla once or twice.


I don't have a problem so much with the touch screen itself. It's a waste for a lot of things and I frequently just turn my screen off, but it is nice to be able to bring up a map with directions and arrival estimates.

But I am constantly disappointed by just how awful and useless the software is.

Need some directions? Sorry, I can auto-play this music station you haven't used in a week, but if you want those directions you looked up on your way out the best I can do is (maybe) have the address in your recent search history.

Want to resume the music you were streaming from your phone through your media center? Yeah, just give me a few minutes to load up this other UI and...Are you sure you have a music app on your phone? Maybe you just need to add it to the car app? Here, let me bring that up on your phone screen. Hold up. There's some audio coming through the bluetooth, I'll just play that.

Want to see why the "Check Engine" light came on? Oh, well for that you need to buy a $50 dongle with Bluetooth and install an app on your phone.
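For what it's worth, those dongles are just bridging the car's standard OBD-II port, and a trouble code like P0301 is packed into two bytes. A minimal sketch of the standard SAE J1979 DTC encoding (the byte values are just an example):

```python
def decode_dtc(b1: int, b2: int) -> str:
    """Decode a two-byte OBD-II diagnostic trouble code (SAE J1979)."""
    # Top two bits select the system letter:
    # Powertrain / Chassis / Body / network (U).
    system = "PCBU"[(b1 >> 6) & 0x3]
    # Next two bits are the first digit, remaining nibbles are hex digits.
    return f"{system}{(b1 >> 4) & 0x3}{b1 & 0xF:X}{b2:02X}"

# Example: raw bytes 0x03 0x01 decode to "P0301" (cylinder 1 misfire).
print(decode_dtc(0x03, 0x01))
```

The dongle and app are mostly doing this plus a lookup table of code descriptions.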


I hate them in all applications that don't benefit from touch and even in many that do. For example, electric cookers. Despite being easier to clean I still find them absolutely infuriating to interact with, plus cats can activate them.

Most of the time, though, they are implemented simply because it's cheaper. There's no benefit to speak of. In fact, I think the only device for which a touch interface works is a smartphone. I can't think of any others.


I laughed out loud. Is this actually a common sentiment in some circles, or just someone blowing off steam?

The question occurs to me because I feel like I just spent 30 years on forums like HN reading nothing but effusive praise for the cleverness and elegance of C and Unix.


This hilarious book was published 30 years ago: https://archive.org/details/unixhatershandbo0000unse_c3g9


In the 70's and 80's C and Unix were incredible tools that were so close to useless that they could run on cheap computers, and so unappreciated that you could get them with a personal budget or for free.

They both were extremely important in the popularization of computers and in unlocking the huge amount of value they provide today. But not due to any quality that we value today.


Good perspective. Kinda leaves me with "these technologies had their day but we've mostly moved on."


As soon as a viable alternative pops up for *nix which has all the advantages of the current incarnations - of which 'open' and 'free' are but two of the more important ones - it'll take over the world. Until such a time we'll keep on using our *nix-hammers just like carpenters have been using their hammers (nowadays often driven by electricity or air) because they work well enough for the intended purpose, the occasional blue thumb notwithstanding.


The thing is, we didn't move on.

We hacked most of the advantages of anything newer back into them, in a haphazard way, and kept them because as a sibling pointed out, nowadays they are open. And openness is a very important feature. (It's just not why they were adopted, people cared so little about openness that Unix was born open and mostly closed up later.)


>> reading nothing but effusive praise for the cleverness and elegance of C and Unix

It isn't that these are GOOD ideas, it's just that no one has come up with better ones.


I think this is an important distinction and actually sort of a brave one.

That a technology stack can be the basis of an entire industry and still be unappealing and lacking for people that are obliged to interact with it directly/regularly.


I feel this way about SQL—it's amazing to me that we're still using that same basic interface after 50 years; it’s so goofy and unpredictable, and I’m sure nobody today would design it that way if we were starting from scratch.
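A concrete example of that goofiness is SQL's three-valued logic around NULL, which trips up newcomers in every engine. A small sketch using Python's built-in sqlite3 (any mainstream SQL database behaves the same way here):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# NULL = NULL evaluates to NULL, not TRUE, so this SELECT matches nothing.
assert cur.execute("SELECT 'hit' WHERE NULL = NULL").fetchall() == []

# NOT IN against a list containing NULL also silently matches nothing,
# even though 2 is plainly not in (1,).
assert cur.execute("SELECT 'hit' WHERE 2 NOT IN (1, NULL)").fetchall() == []

con.close()
```

Both queries return zero rows with no warning, which is exactly the kind of unpredictability a from-scratch design today would probably avoid.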


What do you mean? Even when those were created there were better ideas. Rust, Java, Javascript, Windows, Android, etc all are better ideas than C and UNIX.


Unix and C far, far predate all those choices.

The reality is that any tech decision can later be replaced with "something better".

Much of the debate is Bugs and Daffy screaming "duck season"... "rabbit season" at each other.


It seems every domain and human endeavor in existence has some form of disagreement between practitioners who desire progress/advancement and people who are content to never change or learn anything new, in spite of glaringly obvious benefits.

It's brave to say that no one has come up with better ideas than Unix and C because it's bound to rile up users of (your favorite platform + language here).

I also think that someone saying that there aren't any better ideas than Unix and C might just have different values/interests in computing.


You don't get to say for/anti progress when there isn't a consensus definition of progress.

All progress is change. All change is not progress.

A programmer, or someone presuming to opine on programming, who overlooks a thing like that exposes and advertises that their opinions in such a domain are of questionable value.


The stupidity of this post keeps me coming back to see if I can get more humorous broken logic from you.

Because a rigorous operational definition of “progress” is not provided in this brief post, you assume it is missing, just to heckle someone making the uncontroversial claim that there will always be people on both sides of initiatives intended to foster progress in a given area. A hilarious thing to be triggered by.

How would “achieving an organization’s mission statement within budget” or “improving working conditions for knowledge workers by creating more accessible tools” or “using fewer labor hours on repetitive tasks” or “creating custom tools tailored to specific tasks and using less electricity” fare as definitions of progress?

But maybe any of those non-technical goals can all be achieved using the same old tech, and it’s people complaining about their feelings of disconnection from ancient telecom vestiges that are really impeding progress. Maybe it is the masses that don’t get it, and it is the select few that truly understand things that get to define progress, while insisting that the power is kept in their hands, and that the work is done in their preferred paradigms.

Or maybe I am projecting all of this onto you to return the favor lol


Seriously though, if it is your world view that a consensus definition is needed for an initiative to be considered truly progressive, has the world made any progress at all in any domain?

Fossil fuels are widely implicated in climate change, opponents want to see them phased out, but I doubt they would deny that their use has ushered progress for humanity.

Your reply suggests you might be feeling hurt that someone has picked on your favorite tech stack or that you're getting bullied at work by people who see you as closed minded. They might be on to something.


>You don't get to say for/anti progress when there isn't a consensus definition of progress.

You seem to think that a firm consensus definition is needed for something to be considered progress, which exposes and advertises that your opinion in the domain is highly combative and dysfunctional.


It is brave to call out C and Unix as outdated and technically inferior tools/solutions when so many users are excessively dogmatic in framing them as a pinnacle for the computer industry.


UNIX and C have truly ruined generations of programmers. Sure they may be practical, but having a world view that this hacky software from the 70s was the pinnacle of good design that should continue to be emulated is a shame. For some people the way UNIX works is their mental model of how all computing works and they are not willing to accept change to it.


Purely from the perspective (my own) that unique/novel mental models are often key drivers of progress, I am very much inclined to agree with you.


I don't think this sounds facetious. People and their life experiences are essentially inextricable. Nature vs. nurture stuff.

Separating/abstracting person from their experiences is as you say, a representation of a different, non-existent person - a mutant. Or worse still, some kind of dissection.

Disliking peanuts and being allergic to them each motivate abstaining from eating peanuts; an imitation of that person that abstained from eating peanuts without insight into why would be a rather pale imitation indeed.


My god, this (very insightful) vision just gets more horrific with each line.


It does seem that people would wrongly associate the artificial influx of positive/warm feelings coming from cannabis with increased task performance. Not unlike how people overestimate their own strength when drinking alcohol etc.

However, if it decreases their performance but improves their feelings towards the work, might that still be a net positive?

In other words, if you are going to be a little more pokey and error prone, but on the whole more eager to work and more resistant to burn out, that could be better than a non-cannabis user that makes fewer errors, while continuously stressing out and thinking about changing jobs.


My coding under the influence is for shit. Most times after trying this, I’ve had to redo what was done as it no longer makes sense and rarely worked. It was the epitome of the phrase “were you high?”

The opposite happens though for the more creative computer usage like video/photo editing, graphic design etc. There have been some severely boring projects that kept me procrastinating, but they were made tolerable with an influenced state.


My college programming homework under the influence of alcohol always worked but the result was usually strange and horrible and had to be redone.


Is it better to medicate than make steps ("thinking about changing jobs") towards being happy sober?

I'm not against people using cannabis, I just don't think this is the right frame. It shouldn't be a bandaid for being in a bad situation and it's kind of sad if one would give up their aspirations for betterment because the high masks them caring enough about it.


Your point is valid in that being able to face life unaided by medicine is ideal, but often unrealistic.

Such that I think this comment of calling it a band aid for being in a bad situation is overblown, and that the situation in general is more comparable to drinking coffee to remain alert.

I'd be curious to know if you also think that drinking coffee to overcome tiredness at work is also equal to giving up aspirations for betterment.


I see cannabis less as a bandaid (though it can be used that way) and more as an experience enhancement. Sure, maybe data pipelines seem a little boring. But spend enough time working on them under the influence of cannabis and before you know it you’re having fun.


In the short term, for the individual: yes, it could be better to medicate.

However in the long run, it ends up giving license to shitty employers to continue being shitty, and perhaps to become even worse.

Such employers need to feel the negative consequences of their behavior. Their business needs to suffer from high turnover and poor quality work, and if they don’t change then they need to fold.

Businesses are, at the end of the day, just people - people who live together and participate in the same society. I can’t honestly say that the best outcome is for workers to medicate themselves to be able to tolerate their bosses. Work doesn’t have to be so toxic and awful, you know?


I'm the farthest thing from a pot advocate there is, but if you imagine saying this about antidepressants it sounds kinda silly.


Well; I’m inclined to agree that my comment would indeed sound silly if you simply substituted antidepressants for pot. But in my mind that’s because it’s comparing apples to oranges. I just don’t see how they are comparable.


But you weren’t talking about pot. You were talking about people treating their work related depression and stress being a bad thing because it enables their employer to depress and stress them. At least, that was the message I got.


> But you weren’t talking about pot.

I was. I was replying to a comment that was talking about pot; and the comment I was replying to was on an article that was about pot. In context, I would have thought "medicate" would be understood as "medicate [with cannabis]".

> You were talking about people treating their work related depression and stress being a bad thing because it enables their employer to depress and stress them. At least, that was the message I got.

Nope, nope, nope. You brought up antidepressants - as I said before, I think that's an apples-to-oranges comparison. Recall the original comment I was replying to in the first place - it talks about being "more eager to work and more resistant to burn out" via pot and people "continuously stressing out and thinking about changing jobs". And I don't think it remotely bad to treat depression or stress with medication, at all.

Now maybe somebody is about to burn out and thinks about changing jobs because they're depressed, but that's reading a lot into it. My comment was trying to talk about leaving toxic and shitty jobs. You're bringing a framing into it that I do not agree with and didn't ever say.



Why are you so obsessed with trying to tie my comments to marxism? That’s deeply odd.


This is a thinly veiled marxist-style call for collective class action. Aka unionising. I think it’s more dignifying to assume that ppl are capable of judging for themselves whether the solution to a shitty job is getting high, or quitting- without the socialist indoctrination, which is designed to benefit the political leaders who try to galvanise revolutions. Those revolutions typically devolve into unhinged violence, because they fabricate a narrative of anger.


If I wanted to make a marxist-style call for collective action, there’d be nothing “thinly veiled” about it.

What I was actually suggesting is that people should, in fact, decide for themselves; and that employers should reap the consequences of those decisions.

It sounds like you’ve mistaken me for someone you’ve been arguing with elsewhere. Maybe my comment vaguely resembles a fight you’ve been having before, but you’re reaching some conclusions that just aren’t supported by what I said.


It was the tone you used. Your post had strong statements like

> employers need to feel the negative consequences of their behavior. Their business needs to suffer

This is not merely encouraging personal autonomy. It’s the prescription of political action. It’s dialectical materialism. Could be a strange coincidence.


Well, if all it takes to be a marxist is suggesting that people take individual action to change a socially undesirable situation, I guess I’m a marxist.

But I mean with definitions that broad, who wouldn’t be? I guess it’s useful for trying to shut down conversation, but I’m not sure what other utility it holds.


But you didn’t suggest individual action. You set out objectives which can only succeed if enough ppl coordinate to do it. It’s the prototypical socialist revolution and it’s useful to know where that path can lead.


> But you didn’t suggest individual action. You set out objectives which can only succeed if enough ppl coordinate to do it.

I suggested individual action; you attached success criteria to it that support your weirdly strong desire to paint me as a marxist.

Now, I don’t know why you’re so adamant on it, but that’s your business I guess. I think it’s kinda weird honestly.


Better for who?


Everyone? It's about the real and hidden costs.

If your employees are stressed to the point of burnout and are turning over every few months, do you think new trainees will make more or fewer errors than experienced employees under the influence of cannabis?

Will their training costs be more or less than the errors made by the person using cannabis?

As the employee, if you are stressed out and hitting a wall, and suddenly imbibing cannabis motivates you to continue working more than 10 cups of coffee would, but without the jitters, that seems like a better outcome than simply feeling terrible and also not getting the job done.


I suppose there’s no certainty when dealing with a vague hypothetical, but probably both employer and employee? It’s not as if burnout and quitting is in anyone’s interest.


Is there no value in merely separating and gathering this material, even if it's all dumped in a "plastics only" landfill, as opposed to one containing other types of trash?

Isn't keeping different types of waste sorted inherently useful in other ways?


I suspect you’re right. From my hometown’s local waste authority: After [we have] collected the waste from your bin, the purple dotted bag [with plastic waste] is sent on for collection and pressing before being transported to large sorting facilities in Northern Germany and other places in Europe. Here, 75% of the content in the purple dotted bag (figures from 2019) is recycled, and the remaining waste (often residual waste and food waste that has been incorrectly sorted) is utilized for energy.

And more detail: This is how household plastic is recycled: Collected plastic packaging is first finely sorted in Germany, before moving on to recyclers in the same country or elsewhere in Europe. We differentiate between the recycling of household plastic, which is the plastic you sort at home, and plastic packaging from businesses.

- The plastic packaging you sort at home is collected by the municipality or an inter-municipal waste company. They press the plastic together into large bales.

- Plastretur transports the plastic to a sorting facility.

- At the sorting facility, incorrect sorting and contaminants are removed. This can include paper and metal, or other items that ended up in the wrong collection at your home.

- Additionally, the plastic is sorted into different qualities, and non-recyclable packaging is removed. The sorting facilities classify the plastic into several plastic qualities such as PP, LDPE, HDPE, PET before it is transported to a recycling plant.

- At the recycling plant, the plastic is washed. This process also removes labels and glue.

- Then it is ground and melted into small beads, called pellets. Pellets are a raw material that can be used to make new products from recycled plastic.


> remaining waste is used for energy

This really is the key - if people weren't so scared of incinerators, the "landfill" problem would be basically solved.


I still don’t fully understand. Would there be unwanted gasses coming out? Could we not burn the specific plastics where it’s economically worth it for electric generation? I don’t know the full science so my assumptions might be wrong but I wonder how much worse it is compared to coal and natty gas plants?


Probably, but separating is actually a really hard problem. People do not do as good of a job separating their own recycling as they think they do. Others don't care at all. MRFs are expensive due to maintenance, labor, and other operating costs, and if there's no cost recovered from the recycling there's little to fund their existence.


As a challenge: try to separate waxed cardboard food containers from polyethylene-coated cardboard food containers. They look almost identical. They feel quite similar. Nonetheless, the former is compostable, whereas the latter is not and also contains non-recyclable plastic. (Good luck getting the polyethylene off the cardboard. Maybe you could derive new polymer feedstock from the whole mess, but this seems no easier than turning compostable waste into plastic in general.)


I've toured our local MRF and was shocked at how good it was at robotic / magnetic separation. Incredibly fast AI trained 'pickers' pulling material off a belt (sorting by visually distinct plastic types, e.g. PET vs opaque plastics, paper, etc), combined with some human labor and magnetic extractors. Very high capture rates for all plastics except films - those remain hard to sort / grab. I was not expecting this level of performance...


Valuable considerations.

Seems like a perfect machine learning/robotics application, but probably not one that will receive funding if it's pitched with the objective of, "look now all of this problematic crap is in one place."


I don’t think separating the disgusting mess in mixed recycling is an ML/robotics problem. Neither robots nor human hands can turn food-splattered paper into clean recyclable paper. Nor can they turn plastic/paper composite materials into anything else. Or dirty plastic bags or dirty polystyrene foam or (yuck!) fluorinated HDPE.


Fair observations, but doesn't this assume that disposal and containment are truly our only options now and forever?

Even so, you think there would be no worthy efficiency gains for pure sorting or materials mining in using machines that can identify known objects effectively, and hands that will go where humans will not?


There are solutions that don’t involve reaching into the mess and sorting it.

Some places incinerate their waste for energy. If done well, pollution (except CO2) is minimal, and the outputs are energy and ash. Perhaps some day the ash could be processed to extract useful minerals.

More generally, if you imagine waste to be a clumpy soup of organic goo, assorted interesting elements and minerals, and polymers (which are technically organic goo but are sort of worthy of a different category), perhaps it could be treated as such and processes could be developed to economically extract useful things from it, kind of like how geological processes turn dead cells into oil.

But I see no fundamental reason that, say, old polyester needs to turn into new polyester as opposed to anything else. And keep in mind that, even if burnt to CO2 and ash, there are processes that use energy that turn CO2 into valuable chemical feedstock.


Much to consider here, thanks for the thoughtful reply.


I want to storyboard/pre-vis/mess around with this ASAP

