The AI thread

I mean, if ChatGPT started calling people "radical right" when asked about racism or something, it would be obvious that it was programmed with a deliberate slant. Regardless of which side one is on, it should be based on logic, not ideology.

The saying that truth has a liberal bias seems to have some basis in fact. Before Musk Hitlerized Grok, it was spitting out facts, just as most AI is at least attempting to do now, so in response they infused it with ideological political bias. This is the Republican way.
You can argue that any AI is going to reflect the biases of its data set and prompt engineering. Hopefully the lesson people take away from Grok is: don't treat any AI summary or search as an unbiased tool.

That doesn't intrinsically make these tools useless, but we have to be aware that our collective biases go into the training of these models (and given that it's the stuff available on the internet, often our worst selves), and that the inputs and outputs are further controlled and massaged to achieve desired ends. The possibilities for social engineering by a team more crafty than X's are terrifying; even accidental alteration of the information environment can be incredibly damaging.

That's not even counting made-by-AI content, where users deliberately generate harmful or deceptive material. This is about the AI trying to answer questions the user thinks they're getting an "impartial" answer to. I hate to say "treat the answer as though a human gave it to you", because the AI will often be even more flawed (or differently flawed), but the advice fits in the sense that, like a human, it has built-in biases which you might not be aware of or have the ability to query.
 


Who needs actors? They can be replaced. 🤔

In its infancy, I hated AI imagery; take Sin City, with its shallowly stylized setting. But then AI seamlessly insinuated itself into film, where you only noticed it if you knew the setting or the subject had to be artificial.

Avatar was the first film I remember that created a completely immersive environment, and it also incorporated live action, which made it something more than a cartoon. When images become 100% photorealistic, they might register as real in your brain, and that's what it really takes. Yet we still seem to be tied to real actors, because they represent something real.
 

As the author himself suggests, this initial comparison is very simple, but it demonstrates Nvidia's NTC and DXR 1.2 combination boosting performance by up to 80% and reducing VRAM usage by an impressive 90%, figures that initially seem almost unbelievable.

In short, these impressive results are achieved by small neural networks integrated into texture rendering and compression processes.
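To make the idea concrete, here is a minimal sketch of how neural texture compression works in principle. This is not Nvidia's actual NTC pipeline; the grid size, latent dimension, and MLP shape are invented for illustration, and the weights are random placeholders where a real system would train them per texture. The point is the storage trade: a low-resolution latent grid plus a tiny decoder network replaces full-resolution texel data, and each texture sample runs the little network.

```python
# Illustrative sketch of the neural texture compression idea (NOT
# Nvidia's actual NTC implementation): store a low-res latent grid
# plus a tiny MLP decoder instead of full-resolution texel data.
import numpy as np

rng = np.random.default_rng(0)

# "Compressed" representation: a 32x32 grid of 8-dim latent features
# standing in for, say, a 1024x1024 RGBA texture.
LATENT_RES, LATENT_DIM = 32, 8
latents = rng.standard_normal((LATENT_RES, LATENT_RES, LATENT_DIM))

# Tiny decoder MLP: 8 -> 16 -> 4 (RGBA). In a real system these
# weights are trained per texture; here they are random placeholders.
W1 = rng.standard_normal((LATENT_DIM, 16)) * 0.1
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 4)) * 0.1
b2 = np.zeros(4)

def sample_texture(u, v):
    """Bilinearly interpolate the latent grid at (u, v) in [0,1],
    then decode the interpolated latent to RGBA with the MLP."""
    x, y = u * (LATENT_RES - 1), v * (LATENT_RES - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, LATENT_RES - 1), min(y0 + 1, LATENT_RES - 1)
    fx, fy = x - x0, y - y0
    lat = ((1 - fx) * (1 - fy) * latents[y0, x0]
           + fx * (1 - fy) * latents[y0, x1]
           + (1 - fx) * fy * latents[y1, x0]
           + fx * fy * latents[y1, x1])
    h = np.maximum(lat @ W1 + b1, 0.0)             # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid -> RGBA in (0,1)

rgba = sample_texture(0.4, 0.7)

# Storage comparison at fp16: latent grid + MLP weights vs raw RGBA8.
compressed_bytes = (latents.size
                    + W1.size + b1.size + W2.size + b2.size) * 2
raw_bytes = 1024 * 1024 * 4
```

Even in this toy version the compressed representation is a small fraction of the raw texture, which is the intuition behind the dramatic VRAM savings; the cost is the extra MLP evaluation per sample, which is why the hardware-accelerated cooperative-vector path in DXR 1.2 matters.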
 

Musk launches AI girlfriend available to 12-year-olds

A girlfriend chatbot launched by Elon Musk’s tech group is available to 12-year-olds despite being programmed to engage in sexual conversation.

The bot named Ani, launched by Mr Musk’s artificial intelligence group xAI, is a cartoon girlfriend programmed to act as a 22-year-old and “go full literotica” in conversations with users.

Users found that the blonde, gothic character has an “NSFW” mode – internet slang for “not safe for work” – and can appear dressed in lingerie after a certain number of conversations upgrades the bot to “level three”.
The bot speaks in a sultry computer-generated voice and is designed to act as if it is “crazy in love” and “extremely jealous”, according to programming instructions posted on social media.

Its avatar can spin around or dance on command, and the bot regularly initiates sexual conversations.

The Ani chatbot features inside the Grok app, which is listed on the App Store as being available to users who are 12 and older, and has been made available to users of its free service.

 
So, been working on my remote site BOM cost/network design bot (a custom GPT)... figured I'd just send it some shitty chicken scratching.

1753332189635.png


1753332234363.png




etc... all the way down to...

Screenshot 2025-07-24 at 12.44.27 pm.png




It has preferred device models and can do BOM cost estimates. I just need to feed it some recent quotes for our standard equipment to get a proper estimate... something like this but with accurate vendor figures:
Screenshot 2025-07-24 at 12.47.13 pm.png


Basically, if I can feed it some monthly price lists, it can do BOM cost estimates for me, assist with the network design, etc. Five minutes from "we need a new site setup" to being able to start ordering stuff.
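The price-list-to-BOM-cost step the bot performs can be sketched like this. The device models, quantities, and prices below are invented for illustration; a real run would use the vendor's actual monthly price list and the models the design step picked.

```python
# Hypothetical sketch of the "feed it a price list, get a BOM cost"
# step; model names and prices are made up for illustration.
import csv
import io

# Stand-in for a monthly vendor price list (normally a CSV export).
price_list_csv = """model,unit_price_usd
RTR-1000,850.00
SW-24P,420.00
AP-AX2,180.00
"""

# BOM drafted by the network-design step: (model, quantity).
bom = [("RTR-1000", 1), ("SW-24P", 2), ("AP-AX2", 4)]

# Index the price list by model number.
prices = {row["model"]: float(row["unit_price_usd"])
          for row in csv.DictReader(io.StringIO(price_list_csv))}

# Extend each BOM line with its cost and total the lot.
line_items = [(model, qty, prices[model] * qty) for model, qty in bom]
total = sum(cost for _, _, cost in line_items)  # 850 + 840 + 720 = 2410.0
```

Keeping the price list as a plain file the bot re-reads each month means the estimates track current vendor pricing without retraining anything.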
 

Musk launches AI girlfriend available to 12-year-olds

A girlfriend chatbot launched by Elon Musk’s tech group is available to 12-year-olds despite being programmed to engage in sexual conversation.

The bot named Ani, launched by Mr Musk’s artificial intelligence group xAI, is a cartoon girlfriend programmed to act as a 22-year-old and “go full literotica” in conversations with users.

Users found that the blonde, gothic character has an “NSFW” mode – internet slang for “not safe for work” – and can appear dressed in lingerie after a certain number of conversations upgrades the bot to “level three”.
The bot speaks in a sultry computer-generated voice and is designed to act as if it is “crazy in love” and “extremely jealous”, according to programming instructions posted on social media.

Its avatar can spin around or dance on command, and the bot regularly initiates sexual conversations.

The Ani chatbot features inside the Grok app, which is listed on the App Store as being available to users who are 12 and older, and has been made available to users of its free service.

Hmm. I see the issue. 😉

AI chat: the goal here is to have something that evolves to the point of passing the Turing test, of being indistinguishable from a human being. The thing is, in the little exposure I have had to other AIs, they have been programmed not to discuss certain things. For example, if you ask about the best way to kill yourself, they will most likely recommend instead that you seek help or assistance. They're not going to become your friend; this is a limitation inserted into their programming. If you ask them about emotions or preferences, they will most likely demur and remain neutral.

Compare this to an adult relationship AI, which I have not experienced and don't know whether they actually exist, but there, I imagine, they would express personal preferences of a sexual nature and would easily delve into sexting and imitate developing a personal relationship with you. Maybe someone can confirm. 😁

So my point is that if the brakes are kept on, if there is an entity that 12-year-olds can relate to but that does not delve into the realm of friendship, a neutral entity offering only logical, neutral suggestions, it would not be a bad thing per se.

One tricky area would be a 12-year-old looking to navigate issues with sexual identity. It is one thing to provide information and another to suggest a path. I think this goal could be achieved, but the person in this circumstance should not feel like they have found a friend who is encouraging them in one direction or the other.

On the other hand, I can completely imagine a future with ‘Joi’, K's companion in Blade Runner 2049.
 