21 comments

  • jrflo 2 minutes ago
    I heard someone on a podcast call social media algorithms "the modern-day cigarette" and that really resonated with me. These companies know their product is addictive and bad for users, but they keep pushing it anyways. Like cigarettes, it's bad for everyone, not just kids. I made an algorithm blocker (https://festudio.net/scrolless/) for Safari because of that and it's actually crazy how much more pleasant social media is if you don't have recommendation algorithms at all. I think the EU and other jurisdictions should really look beyond just limiting this stuff to kids, but I understand why it's starting there...
  • conception 1 hour ago
    This is pretty easy to solve. If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present. If the user decides what they see, you aren't, à la social media 1.0.
    • grumbel 0 minutes ago
      Alternative suggestion: Force them to open up the service and allow third party clients. Take Art. 20 GDPR "Right to data portability" and extend it to public content.
    • akersten 3 minutes ago
      How does this specific horrible take rank so highly on HN whenever something adjacent to big tech gets posted? "Impartial common carrier" is not even an extant legal concept.

      It's been argued to death already, I just have to express shock that I'm still seeing this non-starter constantly here.

    • schnitzelstoat 37 minutes ago
      So the user opens the app - what is the first video you show them? How does 'the user decide' from the millions upon millions of videos there are?

      If the user can search like in Youtube then how do you rank the results? That's also an algorithm.

      It isn't pretty easy to solve at all.

      • alkonaut 32 minutes ago
        In the case of Instagram: you show the videos from the people you follow on Instagram, and then no more short videos at all. Possibly a search box.

        If you search on YouTube then it can rank the results any way it wants, just not using e.g. anything from your viewing history. No "related videos" column. That's what YouTube used to be. But YouTube (unlike TikTok) worked well before it had rabbit holes.

        For TikTok the situation is worse: their whole app just doesn't exist without the custom feeds. This would make YouTube be 2010 YouTube, Instagram be 2010 Instagram (great!), but it would effectively be a ban on TikTok's whole functionality (again, great!).

        • Aurornis 10 minutes ago
          I think it would be great if all of these apps had an option to function like you propose: Your feed is a simple view of people you’ve chosen to follow. The end.

          Then all of the people who have trouble with self-control on infinite feeds can enable this mode, and everyone who wants the recommendation algorithm can leave it on.

          This is the optimal outcome that actually serves everyone’s personal goals for using these platforms. If we get into a conversation where some are demanding we don’t allow anyone to use a recommendation algorithm because they feel the need to control what other people see, that’s a different conversation. That conversation usually reveals other motives, like when people defend the algorithm sites they use (Hacker News, Reddit, whatever) but target sites they don’t like, like TikTok.

        • HPsquared 16 minutes ago
          There is no going back to the 2010 internet unless you confiscate everyone's phones.
          • alkonaut 1 minute ago
            Not sure what confiscation would accomplish that regulation couldn’t? I mean we’re all aware that if regulators target TikTok then a new app would pop up and take its place.

            But the thing about regulation is that it doesn’t need to be watertight. You can just target a small handful of large players and it will improve the situation in practice. It doesn’t matter if 998/1000 apps use addictive feeds if the largest two apps don’t and they have 90% of users/views.

      • butlike 9 minutes ago
        Do it like a library. When a person walks into a library, they're presented with a short curated list of books suggested by the librarian. All visitors to the library see the same books. From there, the visitor can go about their business searching for what they want.

        If they don't know what they want, perhaps a good use case for the newfangled LLM-search we have now would be "What's an interesting or popular topic I haven't searched for before?" to which the AI will respond with a list of newly searchable terms.

      • denismi 26 minutes ago
        The first unwatched video from the user's followed/subscribed channels. Chronological, reverse chronological, sorted alphabetically, by the user's channel prioritisation, by likes, by views... whatever the user chooses. And then an end of feed.

        For new users? A search bar and a set of (human? AI?) curated seed recommendations that the platform is comfortable with being held liable for.
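        Sketched out, that kind of user-chosen ordering is trivial to implement. A rough sketch (the `Video` type, field names, and sort labels are all hypothetical, just to make the point that none of it needs watch history):

```python
from dataclasses import dataclass

@dataclass
class Video:
    channel: str
    title: str
    uploaded: int  # unix timestamp
    likes: int
    views: int

# User-selectable sort orders; nothing here depends on the viewer's history.
SORT_ORDERS = {
    "chronological": lambda v: v.uploaded,
    "alphabetical":  lambda v: v.title.lower(),
    "likes":         lambda v: -v.likes,
    "views":         lambda v: -v.views,
}

def feed(videos, subscriptions, order="chronological"):
    """Only videos from followed channels, in an order the user picked.
    The list is finite, so there is a natural end of feed."""
    mine = [v for v in videos if v.channel in subscriptions]
    return sorted(mine, key=SORT_ORDERS[order])
```

        Every branch here is an explicit user choice, which is the whole point: the platform makes no editorial selection.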

      • noprocrasted 30 minutes ago
        > what is the first video you show them

        Whatever is latest posted across their followings/subscriptions?

        • vasco 9 minutes ago
          If they just signed up they have no followings or subscriptions. So now what, you need to show accounts to follow first? That's the same problem as deciding what the first video to show is. How do you decide who they should follow? Or is the vision that you can only have friends, as if it's 2005, and you can't discover anything serendipitously?

          I don't consume any content from my friends on something like tiktok where I'm interested in discovering people that have good content under topics I'm interested in. I don't know who those people are and I want to discover new ones that come up not just follow some already popular accounts.

        • hug 13 minutes ago
          That’s also an algorithm. An unsophisticated one, but an algorithm nonetheless.

          You can (and should) argue that such a simple algorithm doesn’t “count”, but fundamentally the exact wording of the grandparent post never works, legislatively.

          Lawyers will lawyer.

      • Computer0 34 minutes ago
        I made a new YouTube account recently and my homepage was blank.

        https://news.ycombinator.com/item?id=37053817

      • pessimizer 30 minutes ago
        This seems to be consciously dishonest. Show them "most recent" or "most upvoted" or "A to Z." Pretending like this is hard is bizarre. People have always selected sort and filter algorithms, until companies started taking them away.
      • wyre 33 minutes ago
        These are multi-billion dollar companies.

      It's okay if they have some hard problems to solve.

        • SecretDreams 12 minutes ago
          Won't anyone think of the multibillion dollar technolords? They are people too!
    • Aurornis 20 minutes ago
      > If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present

      Hacker News is a site that presents data by algorithm. Under your definition, Hacker News goes away, too.

      A more accurate framing would be that they’re going after personalized recommendation algorithms. It’s not obvious that offering a recommendation algorithm would mean that the site is no longer an impartial common carrier.

      • tencentshill 3 minutes ago
        The algorithm is not personalized. It's the same for every user. No issue there.
      • another-dave 12 minutes ago
        Goes away, or is liable for the content promoted to the frontpage under the OP's take?

        But I'd agree that it's personalisation rather than just curation that's the issue.

        I think even requiring sites to have a "bring your own algo" version (and where ads are targeted to the algorithm, rather than the person) would cure a lot of ills.

        As is, even with something like Spotify where you _are_ paying there's no easy way to "reset" your profile to neutral recommendations

        • Aurornis 8 minutes ago
          > Goes away, or is liable for the content promoted to the frontpage under the OP's take?

          Same thing. There is no Hacker News if Y Combinator becomes liable for user submitted content.

          It’s an obvious backdoor play to make sites go away. If a site becomes liable for content posted, you cannot allow users to post content without having the site review and take responsibility for every comment and every post.

          The people proposing it haven’t considered how damaging that would be for the ability of individuals to share ideas and their content. When every site with “an algorithm” is liable for content posted, nobody is going to allow you to post something. It’s back to only reading content produced and curated by companies for us. Total own-goal for the individual internet user.

          • buellerbueller 4 minutes ago
            >It’s an obvious backdoor play to make sites go away.

            Oh no.

      • jackdoe 16 minutes ago
        > Hacker News goes away, too.

        so be it.

        • vasco 6 minutes ago
          This is a strange thing to comment on HN. If you truly believed it why would you be here?
          • buellerbueller 2 minutes ago
            The majority of terminally addicted people I have interacted with at length have both recognized the terminal nature of their addiction and been unable to do anything about it.

            That's the nature of addiction.

    • stingraycharles 1 hour ago
      This is one of those things that don’t translate to legal reality very well, as then you have to define “what is an algorithm”.

      Is adding advertisements an algorithm?

      Is including likes an algorithm?

      Is automatically starting the next video after a previous one has finished an algorithm?

      Is infinite scroll an algorithm?

      Etc

      • andybak 1 hour ago
        This kind of complex legislation already exists in many areas of the law: revenue collection being the most obvious one. We could choose to treat "societal harm" the way we treat "tax collection".

        I'm not saying there aren't infinite edge cases and second-order effects - but we tolerate those already for many things. I'm not pretending this is simple or even desirable - I'm merely stating it's possible if we want to do it.

        My biggest fear is that (like the UK Online safety act) this acts to favour the huge corporations because they are the only ones that can afford a team of lawyers. Any legislation should aim to carve out exceptions to avoid indirectly helping monopolies.

        • stingraycharles 1 hour ago
          Great example. These companies are already experts at circumventing taxes, what makes you think they can’t weasel their way around some arbitrary written law?

          Just look at the malicious compliance that Apple and Google have around the App Store stuff, they’ll find a way to comply with the law and implement different addictive dark patterns.

          I’m not saying that I disagree that these companies need to be regulated, I absolutely do. I just think it’s going to be a complicated process, and not “oh just ban everything that’s an algorithm”.

          And I have absolutely 0 faith in companies like Meta willfully complying.

          • soVeryTired 31 minutes ago
            I have a feeling taxes are possible to circumvent only because a government tends to have one arm that wants to collect taxes, and another that wants to reduce them to encourage certain outcomes (like having a business setting up shop within its borders).

            The US may have this dual incentive structure since it wants to build its tech giants while limiting their control, but the EU doesn't. The arrival of a foreign tech social media giant might make the legislation a bit more palatable to pass.

            It will undoubtedly be complex to regulate all dark patterns away. But there are a few obvious, easy wins. It'd be a shame to make perfect the enemy of good.

          • bootsabota 29 minutes ago
            Yeah it’s a tough situation to figure out.

            But here’s the real problem: people don’t care. And I say that as someone who hasn’t used social media since 2014.

            My observation of people’s behavior indicates that when all is said and done, people don’t care—they would rather get the endorphins from posting, liking, following, etc.

            But the solution is to allow people to control their own algorithm, and to have open source solutions where communities manage their own social network.

            It’s not the algorithm that is the problem; it is that people don’t have the choice to curate their own content.

          • AndrewKemendo 34 minutes ago
            Regulated by who?

            There’s no political organization (yes, Mamdani actually out-raised Cuomo, so let that sink in) that isn’t being actively bribed

      • throwawayffffas 31 minutes ago
        "By algorithm" can be easily defined.

        The benchmark can easily be that any feed displaying the data in a way other than the following is considered an editorial choice, and thus the platform is liable as a publisher:

        1. In a chronological order, and only filtered based on user selected options.

        2. In any other order explicitly selected by the user.

        An exception can be made to allow filtering out content that violates the platform's terms and conditions.

        Alternatively there can be no exception, effectively making these platforms unworkable. This is also a choice. We do not need these platforms, including this one.
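        As a sketch, criterion 1 plus the terms-and-conditions exception fits in a few lines. (Field names and the example filter option are hypothetical; this is just to show the test is mechanical, not to claim it's how any platform works.)

```python
def compliant_feed(posts, user_filters, violates_tos):
    """Chronological feed, filtered only by options the user explicitly set."""
    # Exception: the platform may drop content breaking its own terms.
    allowed = [p for p in posts if not violates_tos(p)]
    # Only user-selected filtering, e.g. {"languages": {"en"}}.
    if "languages" in user_filters:
        allowed = [p for p in allowed
                   if p["lang"] in user_filters["languages"]]
    # Strict reverse-chronological order: no engagement model anywhere.
    return sorted(allowed, key=lambda p: p["posted_at"], reverse=True)
```

        Anything a platform does beyond filters and orderings the user asked for would fall outside this shape, and that's the line where publisher liability would kick in.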

      • kubb 58 minutes ago
        This is some kind of a meme where people believe things can’t be defined in legal terms and therefore can’t be regulated. These people are usually not lawyers.

        Does anyone know where it’s coming from? I can certainly believe that incompetent jurisdictions have a ton of issues with people misapplying the law and using loopholes.

        • biophysboy 43 minutes ago
          Albert Hirschman wrote a great book about the rhetoric people use to stifle policy proposals 35 years ago. “It’s futile; it won’t ever work” is one common argument. It’s not a meme so much as a cynical reflexive intuition
          • AdamN 32 minutes ago
            One that's reinforced by those against whichever legislation or regulation is being proposed.
        • owebmaster 49 minutes ago
          It probably comes from the same pockets that influence legislation
      • baggachipz 12 minutes ago
        Ok so then the "algorithm" must be made available to authorities (or even better, the public at large) and be approved or rejected based on a court or a law. Obviously an algorithm based on "engagement" or "narrative" should be rejected with prejudice every time.
      • orbital-decay 1 hour ago
        "Algorithm" is a method of selecting the content to display. You're listing presentation types, not selection types. Presentation has nothing to do with selection. Selecting the next video in the infinite scroll would be the algorithm, not the infinite scrolling mechanism itself.
      • randunel 1 hour ago
        Everything other than sorting the list of entities by a standard measurement unit (time, length, mass, temperature, amount) needs to be covered by this law.

        The moment you add other entities to the list (e.g. ads inbetween posts), then it's also subject to the same restrictions.

        • stingraycharles 1 hour ago
          This effectively means “every online platform ever” and would also have included MySpace and the OG Yahoo etc, and as such would not really single out the truly bad actors.

          And then we’ll end up with another cookie-banner style law which had good intentions but actually missed the point entirely.

          • bee_rider 1 hour ago
            Maybe MySpace should be covered. I mean, MySpace probably(?) had the technical capacity to act maliciously in the manner that modern social media sites do; the business model just hadn’t evolved to the modern toxic state yet.

            The cookie banner law is fine for the most part. Sites that do the malicious-compliance thing of over-prompting the user for permissions are providing a strong signal that they are bad actors. It’s about as much as we can expect without banning them entirely…

          • randunel 1 hour ago
            I stopped using Facebook around 2015-ish, when they stopped allowing sorting by date. Prior to this, hi5 and the like also allowed sorting by date. So no, not every online platform ever.
          • progval 1 hour ago
            It even includes email providers with a spam filter.
      • 3form 1 hour ago
        This doesn't differ much from the legal reality that I've seen. Terms need to be defined, yes. It will require work to do so. And that work should be done even if it's a bother.
      • pessimizer 22 minutes ago
        I don't see a single difficult example here. The answer is "NO." It's strange that you couldn't even find one.

        I mean "Is including likes an algorithm?" You might as well ask if having a dog in the video is an algorithm. Any question about "likes" would be whether you're manipulating the video selection based on likes, or whether the user is given a control to manipulate the video selection based on likes. If it's you, it's an algorithm. If it's the user, it's a control. If you lie about the likes, it's an algorithm. If you're transparent about the likes, it's a control.

        The other ones aren't even worth discussing. You might as well ask if having a blue logo is an algorithm, or if Comic Sans is an algorithm. "It's all so complicated!"

        -----

        edit: that being said, the EU does not care about this issue at all, and has had plenty of mandate and plenty of time to have done something about it if it did. They are also going to say "it's all so complicated." Because their problem is the unpopularity of center-left neolib governments that are just barely holding on with extreme minority support through bureaucratic means because they wrote the regulations. They want to keep what came for British Labour during the recent council elections from coming for them.

        So I guarantee that content will somehow become an "algorithm." The goal is to keep people who don't like them from speaking to each other.

    • bee_rider 21 minutes ago
      The conversation has iterated a couple times and one point that people (on this site at least) are stuck on is “well however you rank things—latest, most popular—you’ll need to use some kind of algorithm, maybe quicksort.” This isn’t what the general public or politicians mean when they say “an algorithm” but it does make something of a point, what exactly the general public and politicians mean when they say that… it’s a bit ambiguous.

      I think the EU has fully digested this point, and is focusing on the “addictive design” phrase instead, for good reason. It makes it obvious that the problem is a bit fuzzy and related to the behaviors induced, not some cut-and-dry algorithmic thing.

    • an0malous 10 minutes ago
      It’s so elegant that there’s zero chance the EU will do it since this is all performative for them
    • luke5441 33 minutes ago
      This is a bit of a systems difference. Under a French civil law system you would write laws to regulate the harms away. Under English common law, liability court cases about the harm would lead to precedents and then to common law derived from them. Though I'm not an expert on this.
    • rwmj 24 minutes ago
      You'll need to solve the dark pattern where a new account opens on a blank page with a box saying "Would you like us to suggest what you watch here?"
      • PokemonNoGo 20 minutes ago
        Why would anyone go to a new platform if they didn't know anyone to follow there? I don't see a problem there. I download TikTok, search for SexyDancingDinosaur, who I heard was on there, and press follow.
    • shiandow 1 hour ago
      And when does the user decide? Must a platform do nothing to stymie spam, or even illegal content, to qualify as impartial?

      I suppose the answer could be that only platforms that do indeed allow spam or worse are impartial, but that is a tricky position to be in.

      • leogiertz 1 hour ago
        The mechanism would be that if the user has chosen to follow an account, then posts from that account fall under common carrier. If the platform chooses to show you other posts, then it's under their responsibility.
  • anzerarkin 1 hour ago
    I don’t think this is only a kids issue.

    A lot of adults need this too. The addictive apps are very well designed, while most blockers are either too easy to ignore or too annoying to keep using.

    I built a small iOS blocker because I had the same problem. Making it strict enough to actually work without making people hate it is the main challenge.

    • butlike 5 minutes ago
      Toast notifications were the big mistake. Also badges. In my perfect world, the only thing that would retain the ability to alert the user that someone tried to contact them would be voicemail, subject to the same spam laws as everything else.
    • criddell 46 minutes ago
      On the radio I heard a reporter talking about things China does during school exams. Apparently all schools have exams at the same time and during that period, social media shuts down at night. I forget the exact hours (10pm - 6am maybe). I'm starting to think that would be a great policy in general for everybody.

      I think they also said AI companies go offline during exam hours, but I may have got that wrong.

      • Aurornis 1 minute ago
        Absolutely wild that we’re seeing proposals to shut down parts of the internet and regulate when people can talk to each other on social platforms as a real suggestion on HN.

        I feel like we’ve completely lost the plot when we’re starting to invite government partial Internet shutdowns as a good idea. This is a totalitarian government play.

      • dgellow 22 minutes ago
        I can understand regulating dark/abusive patterns, but at the end of the day I should be allowed to doomscroll at night if I want to
        • buellerbueller 7 minutes ago
          >I can understand regulating dark/abusive patterns, but at the end of the day I should be allowed to doomscroll at night if I [am an addict]
    • actionfromafar 51 minutes ago
      If we afford the same protections to adults, we don't need age verification either. Just a thought.
    • kgwxd 52 minutes ago
      As an adult, who despises all those apps, I don't want to grant government the power to make that decision for me.
      • criddell 45 minutes ago
        As an adult, do you also believe seat belt laws are a bad thing?
        • jayGlow 20 minutes ago
          personally yes, that kind of choice should belong to the individual, not the government. besides that, the laws are nonsensical: why is a seatbelt required in a car but not in a bus? why are motorcycles even allowed at all?
        • nekusar 20 minutes ago
          Yes, I do. It's just another way that cops can pull you over for bullshit charges and revenue enhancement.

          I remember in my state, it was initially only a citation that you couldn't be pulled over for. Then they flipped that and started pulling over for it. Why? Pure fucking money grab.

          Me not wearing a seatbelt means I risk getting splattered. Not you, or anyone else.

          • foobarian 8 minutes ago
            > Me not wearing a seatbelt means I risk getting splattered. Not you, or anyone else.

            Except who pays for your million-dollar reconstructive surgery and rehab? I don't suppose you will cover that out of pocket to avoid burdening your fellow insurance payers with your reckless behavior?

          • aeve890 2 minutes ago
            >Me not wearing a seatbelt means I risk getting splattered. Not you, or anyone else.

            Physics says otherwise. In a collision you don't decide where your body is yeeted, and your skull could end up crashing into the skull of a passenger who is wearing his seatbelt. Don't be a moron. https://youtube.com/shorts/n2yLMGA_YSA?si=AlvRgfpb-PJxGCBw

          • chinathrow 16 minutes ago
            Is this satire?
  • Pesthuf 1 hour ago
    Tell me: why are these algorithms suddenly okay when the victim turns 18?

    They are bad for everyone and if you’re willing to regulate them, make them illegal to be used on anyone.

    • Mashimo 1 hour ago
      Just from this article it's not clear whether methods like endless scrolling or "watch next video" are going to be regulated based on user age or not.

      It just says that platforms which use such methods often target kids.

    • palata 1 hour ago
      Same as for the cigarette: it's a lot easier to regulate stuff for kids, because we as a society tend to agree that they need to be protected. Much harder to do with adults, because it is much less of a consensus.
  • FinnKuhn 1 hour ago
    I think especially restricting endless scrolling is a good thing overall to reduce the addictiveness of social media and its harmful effects.

    HN having pages instead of a feed or endless list is one of the things I really like about it.

    • nanapipirara 1 hour ago
      For sure.

      The other thing I really love about HN is that titles are all supposed to be boring and to the point. The guidelines[1] for titles are excellent and I wish more of the web and honestly legacy media too would behave that way. Things that are of no interest to me are not trying to waste my time and attention.

      [1] https://news.ycombinator.com/newsguidelines.html

    • ekjhgkejhgk 1 hour ago
      > I think especially restricting endless scrolling

      The actual point is that they are designed to be addictive. "endless scrolling" is just an implementation detail. If you "ban endless scrolling", they'll still be using every other trick to make it addictive.

  • hnthrowaway0315 1 hour ago
    But they are so profitable, and we need them to track people around and create a police state efficiently. Ah let's keep them but just fine them as well for the show.
    • boringg 26 minutes ago
      What else will fund the AI boom but computationally expensive video AI?
  • yipbub 1 hour ago
    Thanks, I'm an adult and I need it too
    • butlike 3 minutes ago
      FWIW, social media use is mediated by ∆FosB expression, so the less you use social media, the less you want to use social media. Timeline of ~3 months.
    • mrosenbjerg 1 hour ago
      Had the exact same thought
  • nirui 15 minutes ago
    You know, yeah, you can crack down on "addictive design", but then what?

    If you don't provide a better alternative, the "kids" (and please, stop using "kids" as an excuse, because everybody can see through it now) will just stick to these platforms because, believe it or not, these platforms are much MUCH safer than the alternatives.

    How about we look at the real problem here: 24% of EU children are at risk of poverty or social exclusion (2024), see https://ec.europa.eu/eurostat/web/products-eurostat-news/w/d.... That's not just a statistic about children; it's also about their parents.

    Do you know that if you go outside, there's this huge risk of having to PAY for stuff you don't actually need to live? Like transportation to places that don't bring you wealth, like drinks you buy even when you're not that thirsty, like movie tickets just so it won't be too awkward after all the dialogue options are exhausted? Did these politicians somehow forget that all of this costs money, in this economy that they helped to create?

    And that is not to mention the REAL risks, such as drugs (the bad ones), rude or crazy drivers, unpleasant adults whose only life purpose is to earn enough money to keep them going a little bit longer, just to name a few.

    ..... ORRRR, you can just stay in your comfortable home, sit on your soft and warm sofa/couch, and swipe your life away on TikTok or Instagram for free, safely.

    You see the problem here?

    I'm really sick and tired of these politicians putting on this act of pretending to "love children", when in reality what they do is put up easy patches to hide the real problem: poverty and inequality.

  • tolerance 49 minutes ago
    Either what defines an "adult" is going to be raised exponentially or what defines a "kid" is going to be lowered to determine who is allowed access to information in transit and who needs to be "safeguarded" from it.
  • thedetailsguy 18 minutes ago
    Isn’t it more of “emotional” design than “addictive” design?
  • lp4v4n 43 minutes ago
    I don't agree with this. Addictive, unless we're talking about a chemical substance or something like that, is a subjective thing. At some point, books, movies, comics, etc, etc might have been considered addictive.

    Social networks in general should be banned for underage people, that's the thing. And the social network itself should be liable for verifying the age of its users, like a nightclub is liable for the people who enter it. No bullshit operating-system age verification that's, trust me, totally intended to protect kids and not to spy on you.

    • butlike 0 minutes ago
      Addictiveness is measured by ∆FosB gene expression. The 'addictiveness' of a substance or activity is qualified by how much ∆FosB is expressed. It's decidedly not just a completely subjective thing. Books, movies, comics, etc. can all still be measured on this scale. Everything is addictive in some capacity, generally.
    • bogwog 6 minutes ago
      > Addictive, unless we're talking about a chemical substance or something like that, is a subjective thing.

      What makes you say that? It's well known that the addictive patterns in these apps trigger dopamine the same way drugs do. In a sense, dopamine is the "chemical substance" central to the addiction. Heroin and algorithms are just different ways to get it.

      https://med.stanford.edu/news/insights/2021/10/addictive-pot...

    • bootsmann 39 minutes ago
      The reason why it is done this way is that “social media” is much harder to delineate and also not what is generally considered harmful.
    • jampekka 26 minutes ago
      Addiction at least is quite straightforward to differentiate from otherwise engaging things by whether it causes significant harmful effects. E.g. per Wikipedia "Addiction is a neuropsychological disorder characterized by a persistent and intense urge to use a drug or engage in a behavior that produces an immediate psychological reward, despite substantial harm and other negative consequences."

      Addictive would be then something that (for a substantial portion of population) has a tendency to cause addiction.

    • simion314 37 minutes ago
      >At some point, books, movies, comics, etc, etc might have been considered addictive

      The difference compared to a book is that a book is not personalized for each individual reader, so the example is not a good one IMHO.

  • caaqil 6 minutes ago
    In the modern world: any tech proposition that starts with protection of children as a goal can be dismissed out of hand, since it's emotional manipulation masquerading as tech policy. When I hear "protect kids", all I see is a sleazy politician bowing to their respective Security State apparatus.
  • seydor 37 minutes ago
    They are going to put kids on a drip. The addiction is still there, just a limited amount per session. Intermittent rewards are actually the perfect schedule for an advertising company; you don't want people to be making unmonetizable page views.
  • garrettjoecox 1 hour ago
    At what point should the responsibility fall on the parent to protect their children from harm?

    Don’t get me wrong, if I had my way TikTok wouldn’t exist for anyone, adults included. It’s just so strange to me that so many parents hand their 7 year olds unrestricted access to TikTok and expect someone else to keep their kid safe.

    • perarneng 1 hour ago
      It's not so easy: they need phones and social media to communicate with their friends. They also need to fit in and find an identity. The algorithms, which are basically all engagement engines, are harmful for humanity as a whole. They are marketed as recommendation engines, but it's 100% about engagement, and that is why the content you see mostly creates dopamine, either by being fun or by being provocative enough to cause rage. It's built to serve one purpose: to keep people using the platform as much as possible. Not because the platform is good, but because it serves content that maximizes engagement.

      I read a post about someone whose wife worked for a snack company. They used MRI scans to see how much salt (or sugar) they should put in the snacks to maximize the response in the brain. Sounds disturbing, right?

      Well, engagement engines are the same thing. It's artificial intelligence optimized to get people to react and stay addicted. Basically AI doing harm. It's not what is best for the individual's health; it's what generates the most money for the owner of the platform.

      It should not be allowed to build a business around something that exploits human brains. Basically biohacking our brains for profit.

    • kioleanu 1 hour ago
      I am from Eastern Europe and I've been living for many years in Western Europe. Where I come from, kids get their first phones when they start school at 6 (there's a pre-school year), simply because every other kid has one. Two examples from my birth country keep coming back to me. The first: a friend's kid carrying an 8-inch smartphone in his hand everywhere, because the phone was as big as half his thigh and he would otherwise have had to carry a bag for it. The second was on a visit to the zoo: I was on a bench next to a family with two young children in a stroller. And both children, who couldn't have been older than 4 or 5, were scrolling TikTok, which was showing them children's content!

      In contrast, in Western Europe, my son is now in the sixth grade. More than half his class doesn't have phones, phones are absolutely forbidden on school grounds and at school activities, and for their upcoming class trip they were told that there's a pay phone at the hotel in case they want to call their parents. Our son promptly informed us that he'd rather buy a pack of Pokémon cards than call us, and 3 days is not so much anyway.

      And it is not only at school: he travels for tournaments with his team every other week, and mobile phones are absolutely forbidden on the team bus. The children read, play games (including chess on a magnetic board), sing and exchange stories for hours at a time.

    • tolerance 51 minutes ago
      Apparently parents are spending more time with their children than ever, dads especially. Paradoxically, the problem you're addressing persists anyway.

      Personally, I think some parents are afraid of their children growing to resent them for infringing upon their "freedom" by keeping them away from the dangers that social media and other technologies present.

    • Mashimo 1 hour ago
      > the responsibility of a parent to protect their children from harm

      I agree with you, but only in theory, because that's where we are now and it does not seem to work that well.

      Maybe through more education? But then again, I think reducing addictive tactics like endless scrolling could be part of a two-pronged attack.

      With alcohol we have education on what happens, but we also have laws that regulate it.

    • kubb 51 minutes ago
      When it works.
  • bschwarz 1 hour ago
    Imagine the pressure on Instagram and Tiktok to serve better content if they were forced to pick out, say, 100 short videos per person per day. And not just for kids, adults need a break from this addiction machine as well.
  • epolanski 24 minutes ago
    Never understood the focus on kids; the 50+ crowd looks by far the most addicted to me.

    Which makes it also a matter of parents and grandparents setting good examples.

  • nalekberov 55 minutes ago
    Why, is it always okay to harm adults?

    Like adults spending their hours scrolling through an infinite feed is somehow beneficial to society?

  • thiago_fm 1 hour ago
    Why should only kids be protected from addiction?

    I have a hard time understanding this.

    We have plenty of adults with terrible social media addiction that is destroying their lives, and nothing being done about it.

    • indymike 16 minutes ago
      This is the best question of all. Why are we allowing this?
    • gib444 1 hour ago
      Makes it an easier sell politically. If you position it as dangerous to kids in particular, your opposition then looks like they're encouraging child harm.
      • palata 59 minutes ago
        Well and if you tell adults that they need to be regulated, they get pissed very, very quickly.
    • Mashimo 1 hour ago
      [dead]
  • LuckyBuddy 1 hour ago
    [dead]
  • sylware 2 hours ago
    Yeah yeah, virtue signaling. Most EU online services are now gated by the use of one of the WHATWG cartel web engines (in practice, Google's Blink); namely, EU web sites are broken in favor of web apps.

    They have to restore interop with noscript/basic html web engines (past/present/and future).

    Then, they have to be careful with their file formats. For instance, you never give "carte blanche" to such a disgusting format as PDF; you carefully define as simple a subset of it as possible (with some internal software for validation).

    • Mashimo 1 hour ago
      Is ending endless scrolling really virtue signaling? Don't you think it will have a measurable effect?
    • nanapipirara 2 hours ago
      Yeah yeah, whataboutism.

      I'm very happy they're taking a stance. I've seen too many messed up kids and there's no doubt the addictive design plays a big role in the problem.

      • soco 1 hour ago
        I must notice that every time, but really every time, EU moves a pinky finger against tech industry, a sizeable chunk of comments here will be like the one above. I wonder, is it about a general sentiment against EU? Or a general sentiment against restricting technology? Or a general sentiment against humans? Or what?
        • palata 49 minutes ago
          I think it's easier and safer to complain about everything than to actually have a nuanced and informed stance.

          Look at age verification: it's very easy and very safe to say "nobody sane would think it's a good idea to force people to show their ID to every website they want to access; it will obviously leak the IDs, and that is very bad!". While not wrong, that framing is manipulative: it is not the only way to implement age verification. In fact, technology already exists that would allow age verification in a privacy-preserving manner: a service that already has access to your ID can give you a token that proves your age, and you can then use this token to access a website. The service cannot know where you use the token, the website cannot learn your ID, and they cannot link the two even by colluding.
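          A minimal sketch of how such a token could work, assuming a Chaum-style RSA blind signature (real deployments use standardized, full-size schemes such as the blind RSA signatures of RFC 9474; the tiny textbook key below is purely illustrative):

```python
# Toy Chaum-style RSA blind signature: the issuer attests to a token it
# never sees, so it cannot later link the token to the user.
import secrets

# Issuer key pair (a service that already knows your age, e.g. a bank).
# Tiny textbook primes for illustration only.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def blind(token: int) -> tuple[int, int]:
    """User hides the token with a random factor r before sending it."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        try:
            r_inv = pow(r, -1, n)   # r must be invertible mod n
        except ValueError:
            continue                # rare: r shared a factor with n
        return (token * pow(r, e, n)) % n, r_inv

def issue(blinded: int) -> int:
    """Issuer signs the *blinded* value; it learns nothing about the token."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r_inv: int) -> int:
    return (blind_sig * r_inv) % n

def verify(token: int, sig: int) -> bool:
    """Website checks the age proof; it sees the token, never the identity."""
    return pow(sig, e, n) == token

token = secrets.randbelow(n - 1) + 1     # user's random age token
blinded, r_inv = blind(token)
sig = unblind(issue(blinded), r_inv)
assert verify(token, sig)
```

          The issuer only ever sees `blinded`, and the website only ever sees `token` and `sig`, so neither party alone, nor both together, can connect the website visit back to the ID check.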

          So the constructive debate around age verification is this: assuming we implement it properly (i.e. in a privacy-preserving manner), is that something that we want or not? Does it solve a problem, or at least does it help?

          But we can never reach that level of debate, because nobody can be arsed to get informed about it.

        • ToucanLoucan 1 hour ago
          Boiling kid's (and adult's) brains probably makes them a decent chunk of money, either directly via salary or indirectly via stocks. Ensuring kids remain healthy makes no money. An unfortunately large slice of the tech sector doesn't give the tiniest shit about the health of our broader society or any group in it if it means their lines stop going up, or even go up slightly less fast.
        • eowln 1 hour ago
          The sentiment is that having to present our ID to use TikTok gives us the heebie-jeebies, and for good reason.

          Also, nobody voted for the Commission.

          • palata 58 minutes ago
            > The sentiment that having to present our ID to use tiktok gives us the heebie-jeebies, and for good reason.

            So push for privacy-preserving age verification, such that you don't need to leak your ID to anyone, but TikTok can still prevent kids from accessing it?

            • eowln 50 minutes ago
              >privacy-preserving age verification

              No such thing.

              • palata 46 minutes ago
                That's my problem with the debate: people like you seem very proud to be uninformed. It exists just as much as end-to-end encryption exists. It's cryptography; it's not up for debate.

                But people who have no clue are very vocal about their belief that it does not exist.

                • eowln 27 minutes ago
                  There are no active implementations that do not suffer from severe issues.
                  • soco 1 minute ago
                    There are two ongoing implementations: a weaker one in the EU (boo) and a good one in Switzerland. Neither has severe issues. Questions?
        • watwut 1 hour ago
          Imo, both. The more right wing people started to have aggressively anti-EU stance once Vance openly stood on the side of Orban and against EU and democracies in general.

          And some people see tech companies as worship worthy and trying to restrict them is kind of a blasphemy.

          • modo_mario 40 minutes ago
            The Vance thing is far too recent, and inconsequential across Europe.

            The sentiment precedes all that and mostly stems from the EU having originally been in some ways lib-left dominated, and still being seen as facilitating non-EU migration.

            Regular right wing people (aka not one of the many parties potentially receiving donations) don't tend to love giant webtech companies. Especially since they feel like they're often used as a tool against them and aren't a local thing that draws nationalists either.

            A focus on privacy also isn't a very left-right defined thing, though I have noticed that the most far-reaching expressions of it come a bit more from the further ends of that spectrum (you'll see some very left-leaning people at FOSDEM's privacy-focused stands, for example).

            • dgellow 17 minutes ago
              > don't tend to love giant webtech companies

              That's a bit outdated since Musk bought Twitter.

  • evanjrowley 1 hour ago
    The most on-brand solution for the EU would be to require mobile phone users to upload brain scans in real-time so the state can check for neural activity associated with addiction.
    • buellerbueller 5 minutes ago
      The most on brand solution for a kneejerk reactionary American would be to satirize the EU for its consumer protections.