• 0 Posts
  • 15 Comments
Joined 2 years ago
Cake day: December 15th, 2023


  • I’d love to be wrong, but I feel like we are wired in certain ways by the evolutionary process we are the product of. I think nurture comes into play with regards to overcoming some of those baser instincts and drives. Anyone who has raised boys can tell you that most boys go through phases of being overly aggressive and/or violent, which can often be redirected into better ways of getting that out. I can’t speak for girls or people in between due to lack of firsthand experience, and I want to reiterate that I am fully aware my anecdotes are not universal and everyone falls into a range of behaviours. I feel like what we lack is an elder species we can look up to and emulate, so we are going to need to figure it out for ourselves. I like to think we have the ability; here’s hoping.


  • You’re looking at this in a fundamentally different way: you seem to think it’s like electricity or indoor plumbing, where it’s primarily a benefit and an enabler of further growth in society.

    I see it like asbestos, or to borrow another poster’s example, radium. A technology that has super narrow ETHICAL applications, but since we have elected to make it the only economic force driving large swathes of the world’s markets, we are in the “jam it into everything and see how it works out” phase. Humanity keeps making this one fundamental mistake, and because we haven’t completely collapsed society and killed ourselves en masse yet, we keep doing it thinking “this time it will turn out differently”.

    I am trying to convey that this is a poison whose LD50 is microscopic. Why do we as a society all have to experiment with dosing ourselves to find out how much we can take before it corrodes us to death?

    It’s already taking a bite out of the computing landscape, it’s damaging the environment, it’s increasing wealth disparity, it’s causing actual fatalities, and it’s destroying the ability of people at large to think and retain information. Software development is probably one of the strongest cases for LLM usage, so please tell me: how many untrustworthy browsers do we need to offset the above-mentioned costs?

    If we had focussed a similar level of effort, and money, on transitioning away from fossil-fuel-based energy grids as we have on this nonsense, the world would be in a better place. But that doesn’t allow for the malignant growth of wealth to the 0.01%, so it could never happen. Please make me understand why this is a good thing.


  • But they aren’t distinct things; they are both heads of the same capitalism hydra. How much of the training data for these LLMs has been harvested directly from social media? I sure as shit don’t know, and I would argue nor do many other people.

    Radium is probably a good analogy actually. Thank you. It’s toxic in almost every application we can imagine, it’s got a legacy that extends to the current day, it formed a massive economic bloc, and it turns out it should only ever have been used under the strictest controls. We should never have had “entrepreneurs” being the driving force behind it.

    It should have ALWAYS been a controlled substance that required people who understood and respected how fucking dangerous it is. Instead we are intent on jamming LLMs into every aspect of life regardless of how badly we suspect and/or know it will fuck everything up.


  • The problems are human nature, capitalism and greed. That doesn’t mean we have to give in, and frankly all the appeasers out there who keep saying “You have to use it or you will be left behind” are effectively the drug pusher in the locker room telling the insecure young man, “Oh yeah, everyone else is juicing; if you don’t do it you won’t be able to compete.”

    Nobody believes the drug dealers are handing out drugs because they are humanitarians; they have a financial interest in destroying that kid’s life while he tries to justify it to himself.

    We know LLMs are harmful on SO many different levels, but the US economy would literally collapse if people acknowledged that and stopped supporting them. So we race headlong towards societal collapse to keep the plates spinning. Sam Altman, Jensen Huang, Elon Musk, and so many others should all be tried for genocide and crimes against humanity once the collapse occurs. The sooner our societies start stringing these monsters up rather than celebrating them the more hope we have as a species.


  • For one, what’s her name? Perhaps I just forgot meeting her, otherwise my point still stands.

    For two, I’m Australian, it would be extremely insensitive for you to make negative statements about my dialect.

    I will admit I am being inflammatory with that language, but honestly the only person I have met who I would call an honestly good person and who was super positive about LLMs was a giant math nerd who talked over my head for 10 minutes about the implementation of convolutional networks. He might have just been super excited that I was taking an interest and doing my best to ask questions that didn’t make me sound too dumb.



  • How many browsers would you like me to list? Yes, a lot of them are spins on some of the big incumbents, but there is a much wider variety than you might credit. Rendering engines, on the other hand: yeah, there’s not much variety there.

    Mobile operating systems are something of a special case, I’m afraid; the telcos and incumbents have got way too heavy a thumb on the scale, and if any newcomer looks like breaking the duopoly it will be treated as an existential threat. It will be associated with paedophilic terrorists faster than you can blink.

    Both, incidentally, are categories where I will never be happy with slopcode. But hey, if anyone wants to use a slop-coded browser, I just heavily suggest you never enter any passwords or personal information while using it.

    We are actively building a history of cases where LLM usage correlates heavily with that slope you mentioned. But hey, that’s OK; we aren’t allowed to call things out before they happen, and judgement may only be passed once the damage is done, right?

    Out of curiosity: we know that LLM usage increases cognitive deficit and in some cases leads to psychosis. How many fatalities would you say is an acceptable number before governments act? How degraded do we let our societies get before we rein it in?

    At some point the bubble is going to burst and we will see a number of countries bankrupted in the name of “AI”. I’m really curious to see if we learn our lessons at that point. Should be interesting.


  • So LLMs are going to achieve what Microsoft has been unable to: destroy open source and upend the world of coding. Nice. We really are living in the dumbest timeline. Can’t wait for Nintendo’s lawyers to decide they found a fragment of Nintendo code in the output of an LLM and start the lawfare to destroy the pesky breeding ground of emulator writers.

    Said it in another thread, I have yet to meet a strong advocate for LLMs that isn’t a cunt.


  • How do people gain the ability to take on these major projects if not by cutting their teeth on the small ones, though? We cut the apprentice and journeyman stages of mastering an art out, replace them with slop, and then ten years from now we wonder why kids these days are so incapable of actually creating anything.

    I have talked to kids who have told me that the assignments they got at school were so trivial they just ran them through ChatGPT rather than waste their time. When I pointed out that the reason the assignments were “trivial” was to give them the skills and confidence to do the big projects when the time came, I got, at best, blank looks.

    I said it somewhere else: if you are using an LLM to generate unit tests, I find it hard to be terribly mad at that. If it’s scaffolding documentation, meh, whatever. If it’s generating the main body of your project, I have concerns. Plus I circle back to: how can you open source code that may have been stolen from a copyrighted work?


  • And it’s so noisy. We are already losing bug bounties, it’s swamping open source projects on GitHub with poor-quality or even counterproductive “work” submitted for recognition, it’s drowning out the work of creatives, it’s invading so many aspects of life (education, communication, research, public policy), and it’s fundamentally a bad tool for so many of those areas.

    I recently applied for a job and got some advice from a friend who works in HR in a different industry. His advice: see if you can find out which LLM they use and run your application through it. A lot of positions are getting huge numbers of applicants, so companies are using LLMs to generate the shortlist for interview; you could have the absolute perfect application, but because the LLM doesn’t like the way you wrote it, you are thrown out of the pool without a human being ever seeing you. It’s so insidious; by being “helpful” it reinforces its own necessity.


  • I still don’t think quantity is lacking, and when quality is there it’s amazing how often open source becomes a de facto standard. How many video tools are just a shim over FFmpeg, for example?

    Yet again, the problem I see is that LLMs are a seductive form of software cancer: it starts as a little help, and before you know it we have Booklore-like projects. If open source can’t be better, it will be subsumed in slop.

    I’m not disagreeing about LLMs as a weapon. In a functional society, the person who pulls the trigger on any weapon is responsible for the consequences of that action. I wonder how eager the CEOs of these “AI” companies would be to weaponise their creations if they were held personally accountable for every injury caused by their product. By a jury. Preferably with explicit laws stating they could not indemnify themselves or gain immunity.


  • I think I can provide you a great equivalent: firearms. They have utility, but there are people who make them a lifestyle choice, and there are people who make them their whole personality. There are also a lot of people just desperate for an excuse to use one. I grew up with a couple of farmers in the extended family, and I would never argue guns should be entirely banned, but I am so glad I live somewhere with sane laws around gun ownership. It would be so nice if we had similar consideration around regulating LLMs.

    The danger to open source as I see it is that LLMs degrade the quality and ability of developers while increasing their throughput. I have never once heard someone complain that open source lacks quantity, but I hear a lot of people complaining about the quality.



  • And every time the use of LLMs for open source development comes up, we get the same tired spiel from people about how it’s just a tool, and implications that anyone who doesn’t embrace it with joy in their heart is just a Luddite.

    It seems to me that it’s less a tool and more like intentionally infecting your project with cancer. Sure, it shows all the signs of rapid growth, but metastasis isn’t sustainable or desirable. Plus I am yet to encounter a strong advocate for LLMs who isn’t a cunt.