20 comments

  • bogdanoff_2 1 hour ago
    The solution to this would be a law forcing these sites to allow third-party suggestion algorithms, so that you can choose who and how content is being suggested to you.

    Perhaps it could be as simple as allowing third-party websites and apps for watching YouTube on your phone. And it's okay if this were a premium paid feature, so there's no counterargument that "it costs them money to host videos".

    This is not an entirely new idea either. Before Spotify became popular, people would integrate Last.FM into their media players to get music recommendations based on their listening history, and you could listen to music via YouTube directly on the last.fm website.
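
    In API terms, the requirement could be as small as forcing platforms to accept anything that implements a small recommender interface. A sketch of the shape of that idea (illustrative only, not any platform's real API):

      // Illustrative only: the kind of pluggable interface a platform
      // could be required to accept in place of its in-house algorithm.
      interface Recommender {
        // Given a user's opted-in history of item IDs (oldest first),
        // return ranked suggestions.
        recommend(history: string[], limit: number): Promise<string[]>;
      }

      // A user-chosen third-party policy: a trivial one, most recent first.
      const chronological: Recommender = {
        async recommend(history, limit) {
          return history.slice(-limit).reverse();
        },
      };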

    • mentalgear 1 hour ago
      The solution to all of Big Tech's monopolies is actually pretty simple: interoperability must become law - this includes using custom algorithms or allowing other platforms (like your own app) to access YOUR data on whatever platform 'hosts' it.

      Cory Doctorow wrote a great article on it:

      "Interoperability Can Save the Open Web" https://spectrum.ieee.org/doctorow-interoperability

      > While the dominance of Internet platforms like Twitter, Facebook, Instagram, or Amazon is often taken for granted, Doctorow argues that these walled gardens are fenced in by legal structures, not feats of engineering. Doctorow proposes forcing interoperability—any given platform’s ability to interact with another—as a way to break down those walls and to make the Internet freer and more democratic.

      Most notably, he retells how early Facebook used to siphon data from its competitor MySpace and act on users' behalf there (e.g. reply to MySpace messages via Facebook) - and then, once the Zuck(er) was top dog, moved to make these same basic interoperability actions illegal, to prevent anyone doing to him what he did to others.

      • theturtletalks 58 minutes ago
        We can’t depend on these platforms to offer interoperability, or even on laws to force them to do so. The DMA forced Apple to allow third-party app stores in Europe, and they still hampered it so much that hardly anyone uses them.

        We need platforms that offer that interoperability and simply connect to these “marketplaces.” Take Shopify for example: sellers use that platform to list on Amazon, Google Shopping, TikTok Shop, etc. We need open-source alternatives to those, where the sellers own the platform and these marketplaces are forced to be interoperable or left behind by those that are.

        For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

        It’s a tall task, but achievable and it will happen given enough time.

        • pegasus 16 minutes ago
          > For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

          There's an acronym for this: POSSE (Publish [on your] Own Site, Syndicate Elsewhere). Part of the IndieWeb movement, for those who want to explore this worthwhile idea further.
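
          In practice the mechanics are small. A sketch of the POSSE flow (TypeScript; the example.com endpoint is a placeholder for your own site, and Mastodon's statuses API stands in for whichever platforms you syndicate to):

            // 1. Publish the canonical copy on your own site first
            //    (hypothetical endpoint on a site you control).
            const post = {
              text: "Hello from my own site",
              url: "https://example.com/notes/1",
            };
            await fetch("https://example.com/api/notes", {
              method: "POST",
              headers: { "Content-Type": "application/json" },
              body: JSON.stringify(post),
            });

            // 2. Syndicate a copy elsewhere, linking back to the original.
            await fetch("https://mastodon.example/api/v1/statuses", {
              method: "POST",
              headers: { Authorization: `Bearer ${process.env.MASTODON_TOKEN}` },
              body: new URLSearchParams({ status: `${post.text}\n\n${post.url}` }),
            });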

      • FuriouslyAdrift 22 minutes ago
        That just leads to embrace/extend/extinguish
        • spwa4 16 minutes ago
          Exactly. The deal with all these platforms is that there's a fuckton of up-front costs. Hard drives. Networks. Peering. Transit. Operators. Payment. Lawyers. SREs. And so on and so forth.

          The solution to this used to be that governments provide the platform. You would think this wouldn't be hard to do, since people have now shown that this can work and so it's a guaranteed money maker, or as close as you're going to get.

          Yet I can't find a single initiative.

          So any such rules will just make all internet platforms disappear ... and be replaced by nothing.

      • mschuster91 1 hour ago
        The foundational problem with interoperability is that it can and will immediately be abused by bad actors as long as there is no price tag attached to every piece of communication.

        Among social media, Mastodon (and anything Fediverse) has it the worst, obviously, but Telegram and Whatsapp are rife with spam and scams, and Twitter, back when it still had third-party apps, was rife with credential and token compromises (mostly used to shill cryptocurrencies).

        As for the price tag reference - we've seen that with SMS. It used to be the case that sending SMS cost real money, something like 20 ct/message. It was prohibitively expensive to run SMS campaigns. But nowadays? It's effectively free at scale if you go the legit route and practically free if you manage to get someone's account at one of the tons of bulk SMS providers compromised. Apple's iMessage similarly makes bad actors pay a lot, because access to it is tied to a legitimate or stolen Apple product serial.

        • banannaise 57 minutes ago
          Paywalls can have the opposite of the effect you want. Implemented incautiously, they can fail to deter parties who can make profit in excess of the cost, while succeeding at deterring genuine, non-profit-motivated interaction.

          Imagine how much less you would use text messages if they still had a per-message cost.

          • malfist 18 minutes ago
            I would reply to your comment, but my 2GB data allocation for my cell phone is already spent this month.
        • shimman 1 hour ago
          Because some hostile entity might rat fuck a slightly better system, we're destined to use the same current shitty system, because something better might have a downside?

          Do you understand that this is all literally made up? The rules can change anytime, and society can exert its will to make a better world rather than letting a dozen people decide how technology will shape humanity (mostly in a negative capacity, if you look at the current state of things).

          • pixl97 44 minutes ago
            >Because some hostile entity might rat fuck the a slightly better system,

            And make it a worse system, is what you happened to leave off.

            >Do you understand that this is all literally made up

            You mean the existing system that evolved from billions and billions of interactions? Explain what is 'made up' about it.

            The thing is if you start 'making up' random ass laws that piss people off, they will run screaming back to the billionaires to pwn them with locked down systems. Apple is a great example here. Shit is locked down and people love it.

            • shimman 36 minutes ago
              Being afraid to do things because they might possibly (but never provably) be worse is just the political machination of enforcing the status quo, where our corporate overlords get to dictate how technology shapes our lives.

              I'm sorry, but that's deeply undemocratic; today's generation should have a direct say in how new things affect their lives.

              Failure to do this might literally condemn our species to extinction, and it took less than 200 years to get here. I'm sorry, but they've proven their failure and it's time to make drastic changes.

              Good news is many people agree with this across the electorate, so now you get to decide which people you want shaping society. The previous world order of US imperialism is going to end, and I'd rather have the people decide what to do than those who want to continue running headfirst into extinction.

              • pixl97 0 minutes ago
                >The previous world order of US imperialism is going to end

                I don't disagree.

                Of course Chinese imperialism probably won't be much better.

        • monarchwadia 1 hour ago
          This is a confusing comment. Interoperability and bad actors are separate concerns, because you get bad actors in systems of all kinds, not just in interoperable systems. Paywalling a system does not necessarily mitigate bad actors, either.
    • ceejayoz 1 hour ago
      It seems likely that'd result in even worse suggestions becoming the norm, as people adopt the third-party algorithm that gives the quickest dopamine rush. It's like suggesting tastier heroin to fix drug addiction.
      • bitwank 1 hour ago
        Certainly not. People don’t want the slop they push, the anxiety provoking, salacious, clickbaity spam that it has devolved into. Anybody that used YouTube before the last few years can tell you the difference is pretty major. This is not content people want, it’s content that maximizes clicks and ad sales.
        • ceejayoz 1 hour ago
          > People don’t want the slop they push…

          That's also true for heroin. Plenty of people really want to break the addiction.

          The slop exists because people are attracted to it.

          • bitwank 1 hour ago
            Heroin is a different business model than advertising. Respectfully, you are wrong.
            • ceejayoz 59 minutes ago
              Gosh, if you say so...
              • pixl97 42 minutes ago
                Heh, it's funny watching people, like the one above you, say "This thing is addictive because it is a real object, but this digital object cannot be addictive at all". The argument is so illogical you begin to doubt you're talking to a real person.
        • SpicyLemonZest 51 minutes ago
          People don't want to want it. But it's not obvious that merely allowing a choice of recommendation algorithms would allow people to escape the slop. Isn't anyone strong enough to choose a less addictive algorithm necessarily strong enough to not scroll Instagram for hours in the first place?
    • theptip 37 minutes ago
      I’m quite bullish on disintermediating the algorithms. AI makes it very easy to plug in your own. We just haven’t figured out the plumbing yet.

      I’d be strongly in favor of interoperability laws to pry open the monopolies.

      (One dynamic you do need to be careful about, especially at first: interoperability also means IG can pull your friend graph from Snapchat, so it can also make it easier for big companies to smother smaller ones that are gaining momentum from their own social graph growth due to their USP. I don’t think this is insurmountable, just something to be careful of when implementing.)

    • JKCalhoun 38 minutes ago
      If the default algo/behavior is allowed to persist, there will effectively be no real change.

      Drop the algorithm altogether? I subscribe to channels for a reason.

    • saadn92 1 hour ago
      Third-party recommendation algorithms would be interesting, but I think they'd only address one layer of the addictive design the verdict is actually about. Autoplay, infinite scroll, notification timing, the variable reward patterns from likes and comments -- those are all independent of which algorithm picks the next video. You could swap in the most wholesome recommendation engine imaginable and a kid is still gonna sit there for hours if the UI is designed around endless content with no natural stopping points.
      • theptip 42 minutes ago
        I dunno, careful what you ban; TV has “infinite scroll” too.
    • heyitsaamir 1 hour ago
      Bluesky does this. In fact, the For You algorithm is a community built algorithm and way more popular than the native Discover algo.
    • Zigurd 46 minutes ago
      That's called a "feed generator" on Bluesky.
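
      A feed generator is just a small HTTP service: Bluesky's AppView calls its getFeedSkeleton endpoint and hydrates the returned post URIs into full posts, so the ranking policy lives entirely outside the platform. A minimal sketch (TypeScript with Express; the hardcoded posts stand in for a real index built from the firehose):

        import express from "express";

        const app = express();

        // A real generator consumes the network firehose and maintains
        // its own index; this placeholder list stands in for that.
        const indexed = [
          { uri: "at://did:plc:example/app.bsky.feed.post/3k2a" },
          { uri: "at://did:plc:example/app.bsky.feed.post/3k2b" },
        ];

        // Return an ordered "skeleton" of post URIs; any ranking rule
        // (chronological, topical, ML) can go here.
        app.get("/xrpc/app.bsky.feed.getFeedSkeleton", (req, res) => {
          const limit = Number(req.query.limit ?? 50);
          res.json({ feed: indexed.slice(0, limit).map((p) => ({ post: p.uri })) });
        });

        app.listen(3000);
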
    • dmbche 1 hour ago
      Or algorithms have to be submitted and approved by a government body before being allowed to be implemented and are frequently audited
      • PokemonNoGo 1 hour ago
        I guess this is the only way. I don't think we need a novel approach, and I don't consider this a novel one, since we already have government agencies verifying approved processes in other areas - so why not content distribution?
    • data-ottawa 1 hour ago
      How do you prevent a Cambridge Analytica exfiltration situation with third party algorithms?

      And how does this prevent addictive algorithms which will win through social selection?

      • ceejayoz 1 hour ago
        The Cambridge Analytica stuff never got fixed, it just got hidden out of sight. The situation is worse than ever now.
    • kouru225 1 hour ago
      Yes please. Algorithms should be plug-in-and-play and not endemic to the app. You should be able to take popular algorithms and plug them into any app
      • ceejayoz 28 minutes ago
        That's just laundering the bad actions through a third party.

        The winning third party algorithm will be the one that gives people the same rush the first party algorithms currently do, because people will use it for the same reasons; they get to see cute AI animals do crazy things forever.

    • outime 1 hour ago
      Virtually nobody would choose to pay a subscription for the non-addictive app version, and I'd even say this suggestion is a bit insulting to anyone who isn't high-income.
      • bitwank 1 hour ago
        I will never pay a subscription for the current clickbaity slop. I might if the algorithm were better, closer to YouTube of 10 years ago, when it would suggest lectures, artfully done film shorts, and overall more interesting, high quality content.
    • basisword 1 hour ago
      The real solution is going back to a chronological feed of people you actively choose to follow.
  • onlyrealcuzzo 2 hours ago
    How is any app/website that 1) appeals to kids, 2) sells attention, 3) does A/B testing and/or has a self-learning distribution algorithm NOT guilty of this?
    • guzfip 2 hours ago
      It probably helps when you suppress research that shows you’re harming children and allow human traffickers to fester on your platform with 17 warnings or whatever.
    • KaiserPro 1 hour ago
      I think there is a fourth criterion that is probably more important:

      Actively ignoring harm caused by your product. TV/radio also sold attention, but there were pretty strict rules on what you can/can't broadcast, and to whom (ignoring cable for the moment). It's the same for services: things that knowingly encourage damaging behaviours are liable for prosecution.

    • sampullman 2 hours ago
      I think there's a little more nuance than that, but it seems roughly correct.

      Wouldn't it be better if apps/websites targeting kids didn't use A/B testing to be more addictive?

      • KaiserPro 1 hour ago
        I think addiction is a red herring.

        Pokemon is addictive, computer games are addictive. It's whether they are knowingly causing harm, and/or avoiding attempts to stop that harm.

        • Zigurd 37 minutes ago
          Addictive patterns in games and other online activity are a bit less innocent than you are portraying them: knowingly causing harm is too low a standard. A lot of the profitability of online games, prediction markets, etc. comes from the whales. The whales are probably addicted. If your business is a whale hunt, you are possibly causing harm, at least to the extent that addiction is dangerous.
      • ramon156 2 hours ago
        They'd find another method. Why are we allowing this in the first place?

        I don't have an answer to fix this whole mess, but it starts with our attitude towards addiction. We've built a system that rewards addiction in all sorts of places. Granted, every addiction is different, and I'm of the opinion that it's not (drug = bad), it's how you use it and react to it. We can control the latter, but we choose to ignore it because we're too busy with anything else. This is a tale as old as time...

        • aaomidi 1 hour ago
          Measured against how long it takes the law to catch up to what's going on, YouTube and Facebook have been around for a tiny amount of time.
          • bluefirebrand 1 hour ago
            They have been around long enough to have done unknowable damage to entire generations of humans
            • aaomidi 1 hour ago
              As usual unfortunately laws are reactive.
        • ToucanLoucan 1 hour ago
          > Why are we allowing this in the first place?

          Exactly what I keep coming back to.

          For me, it feels like you could cut this problem down substantially by eliminating section 230 protection on any algorithmically elevated content. Everywhere. Full stop.

          If you write or have an algorithm created that pushes content to users, in ANY fashion, that is endorsement. You want that content to be seen, for whatever odd reason, and if it's harmful to your users, you should be held responsible for it. It's one thing if some random asshole messages me on Telegram trying to scam me; there's little Telegram can do (though a fucking "do not permit messages from people not in my contacts" setting would be nice), but there is nothing at all that "makes" Facebook shovel AI bullshit at people, apart from the fact that it juices engagement, whether genuine or ironic/ragebaiting.

          And AI bullshit is just the annoying part. I've seen "Facebook help" groups that are clearly just trawling for people's account info, I've seen scam pages and products, all kinds of shit, and either it pisses people off, so Facebook passes it around, or they give Facebook money and Facebook shoves it into the feeds of everyone it can.

          It's fucking disgusting and there's no reason to permit it.

          • SpicyLemonZest 45 minutes ago
            Eliminating section 230 protections would heavily disfavor any kind of intellectually stimulating content, because it's hard for a platform to scalably verify that nobody's making defamatory claims. But pointless clickbait, heavily filtered Instagram models, etc. don't really have liability concerns on a video-by-video level. To me it seems like this makes the problem worse.
      • schmidtleonard 2 hours ago
        > more nuance

        Not enough to diffuse liability. 15 years ago when recommender algorithms were the new hotness, I saw every single group of students introduced to the idea immediately grasp the implication that the endgame would involve pandering to base instincts. If someone didn't understand this, it's because

        > It is difficult to get a man to understand something, when his salary depends on his not understanding it. - Upton Sinclair

      • steve-atx-7600 2 hours ago
        For context, Facebook is so dystopian when I log in once every few years that I’m not sure I’ll ever use it again. And, I hate wading through the YouTube cesspool to find some educational content I like. But, I don’t think it makes sense to ban A/B testing or optimization in general. Some company could use it, for example, to figure out how to teach math to kids in a way that’s as engaging as possible. This would be “more addictive”, technically.
        • sampullman 1 hour ago
          That's a good point, I'm not 100% sure it's worth throwing away the potentially beneficial uses. There might not be a solution that's both feasible to implement and avoids banning useful things. In the end I usually come back to it being the parent's responsibility to monitor usage, limit screen time, etc., but it hasn't been working so well in practice.
    • steve-atx-7600 2 hours ago
      How’s this different than tv that a kid might see that has ads and programming targeting kids?

      I watched 80s horror movies when I was in elementary school and had nightmares for years. Should I sue now?

      How about parents be held responsible for how they care for their kids or not? Maybe a culture that judged parents more strongly for how they let their kids spend their time would be an improvement.

      • everdrive 2 hours ago
        Being able to find some basis for comparison between two things does not render them equivalent, and this is an extremely frequent fallacy I see with regard to technology discussion on HN.
        • parpfish 1 hour ago
          When it comes down to it, I’m not sure how you differentiate an “addictive” product from a well-made product that I choose to keep using.

          When people say that Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game (and maybe a little lament about staying up too late).

          But the addictive nature of social media feels different and I can’t figure out what that distinction is.

          • card_zero 1 hour ago
            People will now say "the algorithm" and "dopamine", explaining nothing. You see, social media is truly addictive because it's been honed to be addictive in some way that isn't specified or known or actually true.

            OK, let me try to analyze it:

            1. Humans are idiots.

            2. We have idiot glitches where we obsess over some particular thing. This is our own business and our own fault, and is impossible to tease apart from just liking stuff a lot and benefitting from it.

            3. These glitches tend to accumulate in certain areas, and then some companies find themselves in the position of profiting from human glitchy idiocy, even though they didn't want to be behaving like scammers.

            4. Then some of them get cynical about it and focus on that market segment, the obsessed idiots. This can include gambling and social media.

          • prewett 1 hour ago
            Not to disagree with you, but in the case of Civilization, I do find it addicting in both senses. It is one of two games that I just cannot play, because I will be up until 3am playing. (Puzzles and Dragons was the other one, I think I had to uninstall it the day after I downloaded it)
            • pixl97 39 minutes ago
              Oh, not Factorio. I guess Factorio might be slightly less addictive than crack because I was eventually able to put it down.
          • genthree 1 hour ago
            I have an instagram account because it's by far the best way I know of to keep up with various small businesses, local or otherwise, that I like.

            What I go into the app to do: see if there are any updates from those businesses.

            What the app presents me on launch: a bunch of nonsense selected for what will best-distract me. And you know what? Sometimes it does catch my attention for a minute or two!

            What the app doesn't let me do: disable the nonsense, or even default to the tab of accounts I'm following. Hell, they even intentionally broke ways to achieve this with iOS's scripting; you'd think that'd be niche enough that they wouldn't care, but apparently enough people were doing it that they bothered to break it.

            The algo feed is addictive on-purpose. I would turn it off if I could, and there's a damn good reason they don't let you do that. I "choose" to engage with it sometimes, which sometimes gets people coming out to go "oh-ho! So your revealed preference is that you like the feed!" but that's plainly silly, as that's highly contextual and my in-fact actual preference would be to never see that feed again in my life, and in fact I've spent a little time trying to make that happen. It's only my "revealed preference" in a world where I've had to compromise by occasionally losing a couple minutes to this crap because the app won't let me go straight to what I actually want. That's my true preference, the "revealed" one is only ever briefly flirted-with in a context in which I'm prevented from attaining my actual preference.

            Consider a person who struggles with eating junk food. They don't keep junk food at home, in fact. That is their preference, to not keep it around, because they don't want to eat it and know they will if it's there. Now concoct some scenario in which, in exchange for something else they want, they have to take delivery of a couple bags of potato chips and a box of cookies every week. And sometimes, they eat some of that before tossing it out or giving it away! "Ah-ha, so their revealed preference is that they want junk food!" Like, no, of course not.

            There's a reason these apps have to prevent you from using any part of them except with the presentation they like: because they're being addictive on purpose, and tons of users do not want the addictive parts, at all, but do want other parts.

          • everdrive 1 hour ago
            I think this represents a strong misunderstanding of what addiction is, and how it works. I mean this respectfully, and not combatively -- I expect you have never had problems with addiction.

            When it comes to behavioral psychology research, there is a strong understanding of concepts such as behavioral reward schedules: fixed-ratio rewards, fixed-interval rewards, variable-ratio rewards. People have a very clear understanding of what sort of stimulus is and is not prone to addiction. You can get a mouse in a cage to become hopelessly addicted to pressing a lever for a reward depending on what reward schedule you use, and this does not happen to a mouse who can just get the reward at a regular interval (or perhaps merely a less-addicting one). The mouse in the cage pressing a button set to a variable-ratio reward is equivalent to an old person using a slot machine, in a very literal and direct way. This also translates to social media with infinite scrolling. So many of the stories suck, but the variable schedule means the extremely enticing (or enraging) story just might be the next one.
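
            To make the distinction concrete, here is a sketch of the two schedules (TypeScript; n = 10 is an arbitrary ratio). The fixed schedule is predictable and easy to walk away from; the variable one is the slot machine / infinite scroll pattern, where the next action always might pay off:

              // Fixed ratio: a reward arrives exactly every nth action.
              function fixedRatio(n: number): () => boolean {
                let count = 0;
                return () => ++count % n === 0;
              }

              // Variable ratio: each action pays off with probability 1/n,
              // so a reward is always possibly one action away.
              function variableRatio(n: number): () => boolean {
                return () => Math.random() < 1 / n;
              }

              const pull = variableRatio(10);
              for (let i = 0; i < 30; i++) {
                if (pull()) console.log(`action ${i}: reward`);
              }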

          • close04 1 hour ago
            > Tetris and Civilization are “addictive” they aren’t implying anything malicious about the development, it’s more of a compliment about the game

            Because it's a figure of speech, not a clinical diagnosis. Literal and figurative addictions are different beasts.

            Intent, premeditation, and scale are major differentiators. When they know they will cause harm, concentrate and fine-tune it for effect, turn it into a firehose, and target it at specific individuals, it's very, very different from what random ads, games, or movies do. These companies literally designed their products with the intent to make them addictive and to target children, knowing the full implications and ignoring the harm they caused.

            You're comparing a drug dealer who only sells to kids to a store clerk who also sells ice cream to kids. It doesn't take more than scratching the surface to realize the similarity is very fleeting.

        • steve-atx-7600 2 hours ago
          I understand what you’re saying, I personally don’t like or use social media, but I don’t agree that these companies are at fault after reading this article and others. I’d rather be wrong and learn something than think I’m right, so I welcome further criticism.
          • everdrive 1 hour ago
            I agree with you that parents need to ultimately be responsible for keeping their kids off social media. I think there are a few problems here:

            - Social media is still somewhat new, and the broader public is only now discovering that it's a clear net negative both personally and for society. Because this is such a new realization, I think a LOT of people have not really figured out how this problem should be dealt with (both personally, via social norms, but also with regard to laws and regulations).

            - No matter how awesome of a parent you are, 100% of your kids' friends will have social media and they will introduce it to your kid. That may do less harm than if they have it themselves, but some harm will still be done.

            - There are network effects to consider. It's true that it's your personal fault if you use cocaine -- however we also understand that cocaine is so addictive that it really cannot be used safely. Social media is metaphorically the same. It's a personal failing if you're a social media addict, however broadly almost everyone is susceptible to it. In my mind, that is an argument for regulation.

            Now that said, I have zero faith that our government can actually build sensible regulation here.

          • F7F7F7 1 hour ago
            They strategically use patterns that directly trigger the release of dopamine into the brain.

            They've created algorithms that use slot machine like experiences that keep kids hooked to the screen.

            These algorithms feed users barely moderated content that plays to their worst instincts, with almost surgical precision when they want to elicit engagement.

            Then when research shows them the harm they're causing, they bury it, hire lobbyists, and double down.

            Switch out a few words up there and you have the big tobacco playbook.

        • card_zero 1 hour ago
          Right, like social media and addictive drugs for instance.
      • kspacewalk2 1 hour ago
        Parents ought to be held responsible for how they care for their kids. This isn't just true of their use of social media and devices, but also when it comes to teaching them to look both ways when crossing the street; making sure they understand the concept of private parts, consent and personal space; making them understand the dangers of alcohol; and many other things.

        Does any of that obviate the need for safe urban design, anti-CSAM and anti-molestation laws, or laws prohibiting the local dive from serving a cold one to my 11 year old? Will simple appeals for "parental responsibility" suffice as an argument for undoing those child safety systems we put in place, or will they be met with derisive dismissal? Why should your "solution" be treated any differently? In fact you offer none. Yours is the non-solution solution, the not-my-problem solution, the go-away solution. Not good enough on its own, sorry.

      • roxolotl 2 hours ago
        Both things can be true. Parents can share responsibility. But it is also the case that Facebook actively suppressed research that showed that children using their platforms experience emotional harms. It is also the case that around the time you were in elementary school discussions about children’s programming had been ongoing for years and eventually regulations were put in place[0].

        0: https://en.wikipedia.org/wiki/Regulations_on_children's_tele...

        • steve-atx-7600 1 hour ago
          I can agree that I think they acted to harm society knowingly. I used to think regulation could help, and maybe it can, but if there were some way to shape the culture to value, for example, educational TV programming, I think that would be the most powerful influence on tech/media companies. Regulation could serve to inform parents that "this programming/platform is known to rot your kid's mind", like a nutrition label, and some day hopefully parents will be more likely to disallow it, like some do knowing how much sugar is in sodas.
      • mrweasel 50 minutes ago
        > How’s this different than tv that a kid might see that has ads and programming targeting kids?

        It's not; that's illegal as well. You cannot target kids with TV advertising.

      • ceejayoz 1 hour ago
        > How’s this different than tv that a kid might see that has ads and programming targeting kids?

        Those ads didn't adjust themselves on a per-child basis to their exact interests.

      • jeffbee 2 hours ago
        The difference is largely in the way that the legal caste perceives itself to be aligned with media but opposed to tech.
    • parpfish 1 hour ago
      A/B testing is one way to make things “addictive” but you can also make addictive products without it.

      A really good designer could make a highly engaging app, or an editor could write clickbait headlines, all without testing.

      • esafak 1 hour ago
        These products maximize revenue through engagement with advertisements. The outcome is built into their business model.
    • everdrive 2 hours ago
      Correct, selling attention inevitably leads to harm.
      • wffurr 2 hours ago
        As a parent, the only solution is sticking to ad-free subscription services. PBS is a godsend here, but there are other good options out there too. Tragic that public broadcasting funding was cut when there are clear harms in the free* commercial options.

        *Except for your time and mental health of course

        • everdrive 2 hours ago
          Agreed. Libraries have books and DVDs, and you have things like the classical stations. You also have playgrounds and walks in the park, etc. (I'm also a parent of two young children.)

          Always doing wholesome stuff with your kids is certainly not easy or trivial, but there is a cascading effect here. If your child does not expect to be able to just watch TV all the time it's easier to keep them interested in other things. Once that expectation is burned in you'll be fighting it for a while. And once that expectation is burned in, a small child will _never_ say "I've had enough youtube, I don't need any more."

          So I really don't want to be self-righteous about always doing wholesome stuff with your kids (we definitely do not succeed 100% of the time) -- but rather point out that letting them use addictive media has negative, cascading consequences that actually do make it harder for you as a parent. It's analogous to drinking to relax. You get relief now, and pay for it later. Not actually a good tradeoff much of the time.

    • embedding-shape 2 hours ago
      I guess ultimately it depends on whether the app/website authors do so "negligently" or not.

      > Jurors were charged with determining whether the companies acted negligently in designing their products and failed to warn her of the dangers.

      So if you do so while providing warnings and controls for people, that might make it OK in the eyes of the law?

    • SecretDreams 1 hour ago
      Because most are just nowhere near as good and effective at ruining a kid's mind as Meta. If others were as good as Meta at destroying whole generations of cognitive development, they'd probably also be liable.
    • SirFatty 2 hours ago
      algorithm would be the key word I think.
    • DavidMcLaughlin 1 hour ago
      A/B testing is very, very different to handing over control of your content to a reward function that optimizes for time spent over any other criteria.

      We had 10+ years of having products like Facebook, Twitter, YouTube, hell even LinkedIn with a basic content model of "you build your own graph of people who you pull content from", and their job was to show it to you and put ads in there to fund the whole enterprise. If I decided to follow harmful content? That was a pact between me and the content creator, and YouTube was nothing more than a pipe the content flowed through. They were able to build multi-billion dollar businesses off of this. That's really important: this was enormously profitable. But then the problem arose that people's graphs weren't interesting enough, and sometimes they'd go on the thing and there were no new posts from people they followed, and this was leaving money on the table. So they took care of that problem by handing over control of the feed to the reward function.

      More accurately, especially for Meta products: they completely took control away from you. You didn't even have the option to retain the old, chronological social graph feed anymore. And it was ludicrously profitable. So now the laws of capitalism dictate that everyone else has to follow suit. I now have extensions on my browser for Instagram and YouTube to disable content from anything I don't follow - because I still find these apps useful for that one original purpose they had when they blew up and became mainstream. Why are these browser extensions? Why can't I choose to not see this stuff in their apps? That's the major regulation hole that led to this lawsuit, imo.
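
      (Those extensions are mostly just injected CSS. A sketch of a Manifest V3 content script in TypeScript; the selectors are guesses that rot whenever YouTube renames its custom elements:)

        // Hide recommendation surfaces on youtube.com by injecting CSS.
        // Every selector here is an assumption about YouTube's current
        // markup and will need updating when the markup changes.
        const style = document.createElement("style");
        style.textContent = `
          ytd-rich-grid-renderer,   /* home page recommendation grid */
          ytd-reel-shelf-renderer,  /* Shorts shelves */
          #related                  /* watch-page sidebar */
          { display: none !important; }
        `;
        document.documentElement.appendChild(style);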

      It's the same thing you see with people blaming smartphones for brainrot. We've had 15 to 20 years of smartphones with more or less the same capabilities as they have today, and for the vast majority of that time my phone didn't make books less interesting or make me struggle to do chores or manage my time. For a full decade or more I saw my phone as a net positive in my life, was proud to work for Twitter, and generally saw technology the way the Louis CK bit did: the miracle of using a smartphone connected to WiFi on an airplane. But in the last five years or so, things have noticeably and increasingly gone to shit. Brainrot is a thing. All my real-life friends who are the opposite of terminally online or technical are talking about it. I don't use TikTok but it seems like that is absolutely annihilating attention spans. The topic of conversation over drinks is how we've collectively self-diagnosed with ADHD and struggle with all kinds of executive function... but also are old enough to remember a time when none of this existed. Complete normies are reading Dopamine Nation and listening to Andrew Huberman trying to free themselves.

      I don't know what the exact solution is, but there's at least a simpler time we can point to when we all had smartphones and we were all connected via platforms and we all posted and consumed stupid pictures of each other and it wasn't.... _this_.

      • onlyrealcuzzo 1 hour ago
        Great point RE the self-learning algorithms. That's what I intended originally, but didn't communicate clearly.
      • thin_carapace 1 hour ago
        regarding brain rot, short form content is absolutely going to be the root physical cause - people could tolerate smartphones prior to the inception of short form content. on a cultural level, this level of destruction could be compared to the effects of a coordinated and targeted attack from enemy nation states - if not for the fact that we did this to ourselves in the name of profit. one can only hope that the old guard wakes up to systematically handle this issue that we have no familiarity with, otherwise our system will buckle under the pressure of 10-20 years worth of nonfunctional humans. i do find a technocratic dystopia far more likely, considering the aforementioned mentally castrated opposition ... how's a generation of kids going to win against trillions of dollars of zuckerberg 'engineering' steering them since birth? shame on the 'engineers' who engendered this mess, shame on their shepherd 'managers', and shame on the sociopaths at the top.
  • Hobadee 2 hours ago
    Is the addictiveness of social media great? No. But the blame shouldn't be placed squarely on the companies either. What happened to personal responsibility? I was addicted to Facebook, I realized it, and I disconnected from it. I had withdrawals for a while (pulling out my phone and trying to open the app I had deleted without really thinking about what I was doing) but I quit. I know I am addicted to YouTube shorts, so I stay away from them. Occasionally I'll go on a bender and a few hours will slip by without me realizing, but while I know YouTube is designing them to be addictive, I blame myself for falling for it.

    There are plenty of things in life that can be addicting: drugs, sex, money, power, adrenaline, entertainment, technology... The list goes on. If we remove everything addicting from life, you better believe something else will rise up to take its place.

    The solution therefore isn't to remove everything addicting from life, but rather to raise everyone with the forethought to know what might be addictive, the self-awareness to realize when you are addicted to something, and the self-control (and support systems if and when necessary) to stop.

    • scottious 30 minutes ago
      Personal responsibility is important. But at the same time, we don't let people open up a heroin shop and then claim it's your personal responsibility to not buy it and use it. We don't put slot machines in schools but tell kids that they need self-control to not get addicted to gambling.

      I don't know what the answer is, but it feels wrong to lean _entirely_ on personal responsibility. We live in a world we simply did not evolve to live in. People literally make a good living by engineering and exploiting our weaknesses for profit.

      > raise everyone with the forethought to know what might be addictive, the self-awareness to realize when you are addicted to something, and the self-control (and support systems if and when necessary) to stop

      If only it were that easy. If you've ever known somebody who struggles with a serious addiction you'll know that even when they know it's destroying their life they still can't stop.

    • simonh 2 hours ago
      The problem is that internal communications inside these companies raised concerns about the manipulativeness, and even deceptiveness, of the algorithms and tactics they were using.

      They weren't just consciously creating an attractive platform, they were consciously creating a manipulative platform.

    • ddoolin 1 hour ago
      Maybe this applies more towards adults, but I don't think the correct answer for kids is only "just have self-control," something kids are notorious for not having. Certainly there's a lot of parental responsibility here but we can simultaneously hold companies responsible for their part too.
      • ValentinPearce 1 hour ago
        It also is a situation where the ubiquity of these companies makes it exceptionally difficult for parents to regulate access.
        • freshtake 50 minutes ago
          This. Also, technology is ever-changing, and expecting parents to constantly keep up with feature rollouts on these platforms is unrealistic.

          Personal responsibility IS important, but we also don't allow cigarette companies to advertise on billboards with cute characters (remember Joe Camel?)

    • nkrisc 1 hour ago
      Yes, personal responsibility is important. That doesn't mean we need to allow companies to attempt to addict as many people as they can.

      The question we should be asking: are these technologies a net-positive to society?

    • ValentinPearce 1 hour ago
      If they are liable for making the thing addictive, it does mean it is their fault. In this case, it specifically says it was designed to be addictive to children, whose personal responsibility is probably not expected.
    • CarVac 1 hour ago
      We can't raise other people. We can prohibit the addicting things, like Facebook's algorithmic news feed.
    • pearlsontheroad 1 hour ago
      Everyone should at least be a conscientious junkie.
    • imiric 1 hour ago
      On one hand: sure.

      On the other, it's very different when companies explicitly design their products to be as addictive as possible.

      We've been through this with Big Tobacco already. Nicotine and other tobacco substances are addictive on their own, but tobacco companies were prosecuted for deliberately making cigarettes as addictive as possible, besides also marketing to children. The parallels with Big Tech and social media are undeniable.

    • beepbooptheory 1 hour ago
      Don't blame yourself! You had an encounter in the world and were greatly affected. Anyone with the same predisposition and the same exposure as you would have fallen into the same situation, just as they would have pulled themselves out of it the same way.

      It is not, like, a moral thing to become addicted to something. And the ability to pull yourself out of it is determined, whether you are conscious of it or not, by your broader circumstances and by the same predispositions that brought you there in the first place. At the end of the day we are all fucked up animals reeling from the ongoing consequences of prematurational helplessness.

      We should feel together in our problems like this, not distinguish ourselves by how we might individually overcome them. You are not "better" when you find yourself standing over a beggar addict; you are lucky, never forget that. If for no other reason than that it's not a sustainable world view otherwise: it leads to insecurity, anger, and relapse.

      The dark truth of the world is that everyone is doing the best they can. How could they not? Why would they not? What is this thing that separates you from the addict or murderer? Unless you have maybe some spiritual convictions, I can't imagine what it is..

      Just really, I know you had a powerful personal journey, but don't let it convince you that we are all fundamentally alone, because we are not, and it's good to help people who maybe need more help.

  • freshtake 55 minutes ago
    Short form video is a different beast altogether, and much more concerning. The fact that these platforms don't offer a way to avoid short form altogether is a big issue.

    YouTube allows you to "show fewer shorts" but what if you don't want them popping up at all?

    AI Slop is the best thing to happen to these platforms - because it will lower trust and engagement as people (hopefully) become tired of inauthenticity. Rage bait is potent when the event in the video _actually_ happened, but when you realize it was AI generated, the manipulation feels even more obvious (though it was always there).

    These platforms should also allow users to understand how the algorithm has categorized them, and be able to configure it. YouTube, Instagram, et al. would be safer places for viewers if they allowed users to tell them what they want to be exposed to, and what they don't. Big tech is dodgy about this currently, because the more control the user has the lower the engagement (good for the user, bad for profit).

    • malfist 13 minutes ago
      That "show fewer shorts" button doesn't do a damn thing. I click it, refresh the page and whala, shorts.
  • absoluteunit1 1 hour ago
    Oh man if they think YouTube and Instagram are addicting they should see what Roblox does lol
  • paulkon 2 hours ago
    Just needs a health warning label, like on alcohol or cigarettes. Then onto the high sugar products, and a quarter of the grocery store
    • ddoolin 1 hour ago
      If we want to compare it to alcohol/cigarettes, then kids shouldn't be allowed to use this either.
      • pearlsontheroad 1 hour ago
        and the government should tax it accordingly
        • ibejoeb 5 minutes ago
          I don't think that you can practically expect to tax speech.
    • alexlesuper 2 hours ago
      We have health warnings for food that contains lots of sugar, fat and/or sodium in Canada
  • nottorp 1 hour ago
    Just kids? Not adults?
  • bknight1983 1 hour ago
    When you put something out there, there's a question of ownership for how people end up using it.

    - Some think that "if you use it incorrectly, it's your fault", and probably agree with the statement that Palantir is not evil software and that one must "change the administration".

    - Some think that "if you use it incorrectly, it's the creator's fault", and then you have safety labels on everything (see Prop 65).

    It's a spectrum of risk between the user and the creator. My opinion is that there's enough scientific evidence to show that social media has a negative impact on kids and teenagers, as their brains are still developing. I think a social media ban for kids is a good thing (similar to a driver's license or a minimum drinking age).

  • techteach00 43 minutes ago
    I gotta be honest. I saw the photo of the plaintiffs as the jury decision came back. They looked exactly like someone who just won the lottery. Philosophical or moral displays of victory look different.

    I believe the plaintiffs solely care about becoming millionaires. No concern for how these rulings will further erode user privacy/rights online.

    • jjice 42 minutes ago
      Is that completely based on their expressions and reactions? I mean, you might be right, but I feel like a momentary expression is too little to base such a damning statement on.
    • IncreasePosts 9 minutes ago
      I thought the same thing. I took solace in the fact that it may be appealed, and that I suspect lawyers and taxes will take a large chunk out of the settlement
    • post-it 40 minutes ago
      Body language analysis of strangers is bunk pseudoscience and a great way to reinforce your prejudices.
  • GardenLetter27 1 hour ago
    Mandatory age verification is coming.
    • _kidlike 1 hour ago
      my thoughts exactly... this "verdict" came with very suspicious timing.
    • highstep 1 hour ago
      otherwise known as mandatory identification
    • 2OEH8eoCRo0 43 minutes ago
      Good. Long overdue
  • ramesh31 2 hours ago
    I've heard about "landmark" cases against these companies over and over again for the last decade. There seems to be at least one every couple of years. And yet literally nothing has ever happened or changed.
    • petcat 2 hours ago
      Since these are civil lawsuits, it just takes more people coming forward to sue. There are plenty of cases where a jury found a defendant liable for damages only for the defendant to continue the bad behavior and subsequent juries awarding ever-increasing and compounding punitive damages. Big Tobacco and Purdue Pharma (went bankrupt) are examples of this pattern. Monsanto was famously hit hard with massive "repeater" damages after they continued selling and marketing Roundup despite prior judgements.

      The exact same can happen to Big Tech. The goal is to get them to stop the bad behavior now.

    • mrbluecoat 1 hour ago
      I feel the same way. They're just going to appeal the case until they find a layer of the legal system where they have leverage.
  • yacin 2 hours ago
    this has to be the first of many right? fingers crossed this leads to some meaningful change.
    • jeffbee 2 hours ago
      You mean it's the first of many appeals, I assume.

      Trial courts will decide pretty much anything. Then the case gets appealed over whether the trial court correctly interpreted things you probably perceive as uncomplicated, like the 1st Amendment.

    • 2OEH8eoCRo0 2 hours ago
      It's a huge deal because it was the bellwether case for over 1,000 other similar cases.
      • yacin 2 hours ago
        ah yup:

        > It comes on the heels of a Delaware court decision clearing Meta’s insurers of responsibility for damages incurred from “several thousand lawsuits regarding the harm its platforms allegedly cause children” — a ruling that could leave it and other tech titans on the hook for untold future millions.

        • trollbridge 2 hours ago
          Yep. The insurance covers accidents and negligence, not deliberate decisions to inflict harm on children for financial gain.
        • guzfip 2 hours ago
          Sounds too good to be true. I’ll hold my breath.
        • AlienRobot 2 hours ago
          I wonder at what point children become such a liability for platforms that it's easier to just ban them altogether.

          Children don't have disposable income to buy ads/subscriptions. They don't have experience to write about. The only thing they have that adults don't is time which translates into engagement metrics.

          In an ideal world, the adults that buy/manage the computers would create age-restricted accounts for children, and the OS would give this information to the browser, which would just transmit it via HTTP. This is the safest method to verify ages. If an operating system doesn't want to support this, it's ultimately the adult's responsibility to install one that supports it. This would mean there would be no burden on the adults (the majority of the planet) to verify their ages, so there would be no burden on the platforms to restrict ages either.
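
          (Server-side, that idea reduces to a single header check. A sketch in TypeScript on Node; "Sec-Age-Bracket" is entirely hypothetical, as no such standard header exists today:)

            import { createServer } from "node:http";

            createServer((req, res) => {
              // Hypothetical header an OS-managed child account's browser
              // would attach to every request.
              if (req.headers["sec-age-bracket"] === "child") {
                res.statusCode = 403;
                res.end("Not available to minors.");
                return;
              }
              res.end("Welcome.");
            }).listen(8080);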

          If platforms could verify ages without inconveniencing their main user base, I wonder if platforms would just start banning all minors, or if there is some reason to allow minors in the platform that justifies all the liability surrounding them.

          • WarmWash 1 hour ago
            Children are an extremely valuable ad target.

            They have their hands directly on their parents' heartstrings, and their parents have a credit card.

            This isn't anything new, think about the toy ads we had on TV when we were young.

            • AlienRobot 31 minutes ago
              I guess you are right. I assumed that something like Youtube Kids would have no ads at all given the audience, but it seems it does have ads targeted at young children. Bleak world we live in.
          • germinalphrase 1 hour ago
            Nobody takes “age-restricted account[s] for children” seriously.

            Parental controls and age restrictions are almost universally half-baked, buggy fig leaves meant to deflect negative attention from software and content providers.

  • topheroo 36 minutes ago
    They were also designed to addict adults, just saying.
    • AnimalMuppet 30 minutes ago
      Right, but adults are assumed to be somewhat more responsible for themselves. This is why we don't let kids (legally) smoke or drink, but we do let adults do so. We expect that adults can, in general, say no, and that children are less able to do so.

      But it's not absolute. Some drugs are illegal for adults as well, for example. Why? Because they're too addicting.

      So are Instagram and Youtube just nicotine, or are they heroin?

  • pautasso 2 hours ago
    Everyone now posting on social media about how the sentence "Social Media is Addictive" is going viral.
  • nlarion 1 hour ago
    There is no personal responsibility left in America. I have a child. It's my job to teach him and watch what he watches and does. I guess I am the only one who thinks this way. Good luck having the parental government raise your child. Parody: I let my child have cocaine and now they're addicted!!!!! Hilarious.
    • freshtake 30 minutes ago
      How old is your child? Younger than 6-8 it's easy to monitor what they're watching and enforce limits. By age 9-10 it isn't just about what they access in the home. Many schools in America are giving kids computer and tablet access, and kids are smart or curious enough to access social media there.

      I agree that a big part of this is educating children about these hazards, but that also doesn't mean we should allow these companies to data science the shit out of our attention and will power. Many adults have concerning relationships with social media too -- exposure, pressure, and manipulation are key ingredients that are difficult for anyone to deal with.

    • dj_gitmo 24 minutes ago
      > Parody: I let my child have cocaine and now they're addicted!!!!! Hilarious.

      Cocaine is illegal because it is addictive.

      • IncreasePosts 6 minutes ago
        LSD and hallucinogenic mushrooms aren't addictive and aren't legal. Cigarettes and alcohol are addictive and are legal.
  • baggy_trough 1 hour ago
    Doritos now liable for creating a good tasting chip? This is madness.
    • Ajedi32 1 hour ago
      Yeah, people keep making the comparison to cigarettes but to me this is wildly different.

      Cigarettes directly cause physical harm and even death. Social media can sometimes, under certain circumstances, depending on who exactly you're interacting with on social media, indirectly contribute to emotional harm.

      Cigarettes are also physically addictive. Your body actually becomes dependent on them and will throw a fit if you try to stop using them. Social media is only "addictive" in the loose sense that all fun, mentally engaging activities are.

      I'm not saying social media is fine for kids and we shouldn't do anything to reduce their use of it (TV and video games can be equally unhealthy IMO). I'm not even necessarily against legislation on the subject. But there's a huge difference between fining a company for breaking a law, and fining them for making a perfectly legal product "too fun" because you let your kids spend all their time on it and that turned out to be unhealthy.

      This type of civil litigation where the courts effectively create and enforce ex post facto laws based on their opinion about whether perfectly reasonable, 100% legal actions indirectly contribute to bad outcomes is not a great aspect of our legal system IMO.

      • freshtake 22 minutes ago
        There are different kinds of addiction. The difference is physical vs. mental.

        The best example of this is heroin, which has both a severe physical and mental addiction component, and it's the mental addiction that makes relapse so common.

        Mental addictions rewire the brain's chemistry, causing the user to seek, and find joy only in, the substance. This is a better comparison for social media (albeit not as destructive and instantaneously harmful as narcotics).

        • Ajedi32 6 minutes ago
          Everything you do or even just think about "rewires" your brain to some extent. The difference with addictive drugs is that they do so in a way that bypasses your brain's natural processes. The same cannot be said for "addiction" to games or social media, or other entertainment.

          There can still be social ills associated with these forms of natural "addiction" (e.g. gambling), and I'm okay with regulating those ills, but I'm less okay with the courts doing so unilaterally based on their subjective opinions with no concrete law backing them up.

    • OptionOfT 1 hour ago
      One could argue that the ultra-processed food industry is doing exactly what the tobacco industry did with regard to making their food addictive.

      There is a difference between creating a food that tastes good and creating a food that tastes good but instantly makes you want to eat the whole bag.

    • bknight1983 1 hour ago
      Normally I don't see people walking down the street staring at their Doritos
    • bogdanoff_2 1 hour ago
      addictiveness != enjoyment

      Although to some extent they're correlated, sometimes the things that are most enjoyable you wouldn't describe as "addicting" and vice-versa.

      Eating a nice full meal is more enjoyable than eating doritos on your couch, but you wouldn't describe it as addicting.

      If anything, I find my experience of youtube today to be less enjoyable than in the past

  • gervwyk 1 hour ago
    now do Candy Crush..
  • superkuh 1 hour ago
    It's amazing that a jury of people completely ignorant of what medical addiction is managed to make this discovery despite thousands of scientists around the world being unable to confirm this hypothesis. Which is to say: this is extreme bullshit which has nothing to do with reality or science or empirical study and instead is based entirely on feels and popular memes about "dopamine hits" (no basis in reality).