• he/him

Coder, pun perpetrator
Grumpiness elemental
Hyperbole abuser


Tools programmer
Writer-wannabe
Did translations once upon a time
I contain multitudes


(TurfsterNTE off Twitter)


Trans rights
Black lives matter


Be excellent to each other


UE4/5 Plugins on Itch
nte.itch.io/

LotteMakesStuff
@LotteMakesStuff

This is all somehow (as always) Peter Molyneux's fault[1]

All this deep learning large language model bullshit AI SUCKS and I hate it. Remember Black & White? Black & White is a BEAUTIFUL game and was the first title put out by Molyneux's Lionhead Studios (rip) and it was an INCREDIBLE piece of work. One of its main features was that you got a huge creature as a pet - and that pet used neural networks & reinforcement learning to learn by watching the player. If you picked villagers up and threw them at rocks, your creature could see you do that and copy you.

Wow, good job! Give your weird dog a pet! That head scritch feeds into the reinforcement learning system and helps teach your creature "wow, picking dudes up and throwing them at rocks is a GOOD THING TO DO". Nice. That game does stuff with neural networks that literally no other game has ever gotten close to doing - and it's WAY more interesting than ANYTHING OpenAI is doing now.
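
(Nobody outside Lionhead has the actual creature code, so this is just a toy Python sketch of the general idea - a "desire" as a weighted sum of whatever the creature noticed, with your pets and slaps nudging the weights up or down. Every name and number in here is made up for illustration, not from the game:)

```python
# Toy sketch of perceptron-style desire learning with player feedback.
# Positive feedback (a head scritch) strengthens whatever the creature just
# did; negative feedback (a slap) would weaken it.

def desire(weights, features):
    # How much the creature wants to perform this action right now.
    return sum(w * f for w, f in zip(weights, features))

def reinforce(weights, features, reward, learning_rate=0.1):
    # Nudge each weight in proportion to how strongly that feature was
    # present when the action happened. reward > 0 for a pet, < 0 for a slap.
    return [w + learning_rate * reward * f for w, f in zip(weights, features)]

# Hypothetical features the creature saw when it copied you:
# [villager nearby, rock nearby, player just did this]
features = [1.0, 1.0, 1.0]
throw_villager_weights = [0.0, 0.0, 0.5]

print(desire(throw_villager_weights, features))   # 0.5 - mild urge to copy you

# You give it a head scritch right after -> positive reward.
throw_villager_weights = reinforce(throw_villager_weights, features, reward=+1.0)
print(desire(throw_villager_weights, features))   # 0.8 - now it REALLY wants to do it again
```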

Anyways - fast forward 10 years and Lionhead's lead AI programmer on Black & White had LONG since left and cofounded his own company that just focused hard on this neural networks + deep learning + reinforcement idea, called DeepMind. That would go on to be bought by Google and ended up doing all that AlphaGo stuff. I'm pretty sure DeepMind's early work on convolutional neural networks inspired a LOT of the research community to go in that direction, so I blame them for things like Google DeepDream popping up too (I blame DeepDream for inspiring a lot of the 'image generation' bullshit we see these days). I blame Google's massive investment into DeepMind for making AI research companies an attractive place for weirdo billionaires to dump cash. So, I guess I also blame DeepMind and Google for making OpenAI seem like a good bet. OpenAI popularized this whole stupid 'large language model' meta that we're suffering under today. So yeah, trace it all back and it's all stupid fucking Peter Molyneux's fucking fault. Fucker.

And you know what sucks the most? If you go to google dot com RIGHT NOW and search for "black and white neural network" you won't find a single link about the game that caused all this bullshit to be popular now, and that's just tragic.


  1. I don't really think this is his fault, but it is funny to blame him for everything, always


LotteMakesStuff
@LotteMakesStuff

At high school (in the UK, about 16 years old) we had to do a two-week work experience placement - I knew Lionhead's office was near where I lived and I was a BIG Black & White fan - so I was HYPED when they agreed to let me do a placement there with their internal QA team.

First day in the office, one of the testers had seriously hurt their hand over the weekend by accidentally getting it caught in a door. Peter Molyneux came into the testing room to say hello, saw the guy's injured hand and was like "you slammed it in a door? Looks painful… at least it wasn't your penis" and left the room. Never saw him again for the next 2 weeks I was there. Wild times.

Anyways, I got to play Fable, The Movies, Black & White 2 and B.C. (which later got cancelled) and learned a lot about game bugs.


Turfster
@Turfster

Yep, that sure sounds like ✨The Industry✨ to me



in reply to @LotteMakesStuff's post:

I'm sorry, but are you sure it used neural nets? That seems way overkill for creature behavioral AI in a game, especially in 2001. Might be the Mandela effect tbh

edit: maybe it is, maybe it isn't, but this paper from 2002 definitely does mention 'neural nets' so there we go I guess, though no explanation is provided beyond "Decision trees represent agents’ beliefs about general types of objects. Finally, neural networks of perceptrons represent desires [5]." (with the [5] referring to a dead source link)

It deffo did. It's not the deep learning stuff they do nowadays, so there's no massive computational bullshittery going on there - but a lot of the core ideas are the same. The point is the guy who built that went on to start DeepMind and do even more of that, at scale.

It's not even the only game series of the era to use neural nets. The Norns in the Creatures series of games famously did as well.
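
(If that "decision trees for beliefs, perceptrons for desires" line from the paper sounds abstract, here's a toy Python sketch of the split - symbolic classification on one side, a simple weighted sum on the other. All the names and numbers are invented for illustration, not from the game:)

```python
# Toy illustration of the split the paper describes: a decision-tree-ish
# structure ("beliefs") about what an object is, and a separate
# perceptron-style weighted sum that turns that belief into a "desire".

def believe(obj):
    # Stand-in for the decision-tree side: classify what kind of thing
    # this object is, from a few observed attributes.
    if obj["alive"]:
        return "villager" if obj["small"] else "creature"
    return "rock" if obj["heavy"] else "tree"

def desire_to_throw(belief, weights):
    # The perceptron side: one input per belief category, one weight each.
    inputs = {b: 1.0 if b == belief else 0.0 for b in weights}
    return sum(weights[b] * inputs[b] for b in weights)

weights = {"villager": 0.8, "creature": -0.5, "rock": 0.1, "tree": 0.0}
obj = {"alive": True, "small": True, "heavy": False}
print(desire_to_throw(believe(obj), weights))  # 0.8 -> really wants to yeet the villager
```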

in reply to @LotteMakesStuff's post:

I had a CD key for some sort of HL1 deluxe edition at the time of HL2's release, and registering it in Steam just gave me their entire back catalog, which was great. This was how I learned about Ricochet, a true classic.

It was in a small room of like… 6-8 people in the main office in Surrey Research Park. It was right near the entrance. That office was mostly working on Fable I think; there was a small team of like 2 people right at the back of the office working on Project Dimitri. They had two other offices in the research park I think, one building The Movies and one where Black & White 2 was in production. I can't remember where B.C. was being developed… somewhere else?!?