People "at large" absolutely don't care about AI slop, even if they point and say eww when it's discussed. Some people care, and some additional people pretend they care, but it just isn't a real issue that is driving behavior. Putting aside (for now) the idea of misinformation, slop is socially problematic when it puts artists out of work, but social media slop is just a new, sadder, form of entertainment that is generally not replacing the work of an artist. People have been warning about the downfall of society with each new mode of entertainment forever. Instagram or TikTok don't need to remove slop, and people won't care after they acclimate.
Misinformation and "trickery" are a real and horrific threat to society. It predates AI slop, but it's exponentially easier now. This camera, or something else with the same goal, could maybe provide some level of social or journalistic relief to that issue. The problem, of course, is that this assumes we're OK with letting something be "real" only when someone remembers to bring a specialty camera. The ability of average citizens to film some injustice and share it globally with just their phone is a remarkably important social power we've unlocked, and we would risk losing it.
Saying that there is a market for a sane social network does not mean it's a market as big as the other social networks'. You don't have to conquer the world to have a nice product.
> The ability of average citizens to film some injustice and share it globally with just their phone is a remarkably important social power we've unlocked, and would risk losing.
I'd say we've already mostly lost that to AI. We might gain it back if cryptographic camera signatures become commonplace (and aren't too easy to crack).
> People "at large" absolutely don't care about AI slop
I fear your statement is impossible to deny when "Tung Tung Tung Sahur" trading cards and "Tralalero Tralala" T-shirts are a thing.
> People "at large" absolutely don't care about AI slop
I think this is true. In general, enough of the market simply doesn't care about quality as long as it exceeds some minimal threshold.
There has always been a market for sub-par products; that's arguably a feature of markets. You can always find the cheapest, lowest-quality offering you can sell at a profit.