• Re: Using Ai For Coding

    From MIKE POWELL@VERT/CAPCITY2/UUMOES to DR. WHAT on Wednesday, February 25, 2026 08:24:00
    I have heard that term but need to read up about it. As I usually hear it in a derogatory reference, I have not been too rushed to learn more about it. ;)

    It's the script kiddies of old going into ChatGPT and typing in a prompt to have it build some software.

    They compile, run, adjust their prompt, which generates completely new code, that they compile, run, ...

    Instead of asking it to pick up where it left off? That doesn't sound
    very efficient and, as you pointed out, means you never get the same code
    (or code walkthru) twice. :/
    ---
    � BgNet 1.0�12 � moe's tavern * 1-502-875-8938 * moetiki.ddns.net:27
  • From phigan@VERT/TACOPRON to MIKE POWELL on Wednesday, February 25, 2026 09:01:07
    Re: Re: Using Ai For Coding
    By: MIKE POWELL to DR. WHAT on Wed Feb 25 2026 08:24 am

    very efficient and, as you pointed out, means you never get the same code (or code walkthru) twice. :/

    That is exactly my first annoyance. I was thinking, surely this isn't how everyone else is using it... how do they put up with it?!

    ---
    þ Synchronet þ TIRED of waiting 2 hours for a taco? GO TO TACOPRONTO.bbs.io
  • From Dumas Walker@VERT/CAPCITY2 to PHIGAN on Wednesday, February 25, 2026 15:51:42
    very efficient and, as you pointed out, means you never get the same code (or code walkthru) twice. :/

    That is exactly my first annoyance. I was thinking, surely this isn't how everyone else is using it... how do they put up with it?!

    One wonders. I did read an article today that suggested that, in general,
    it is better to ask an AI platform the same question ("prompt") at least
    twice if not three times.

    The reasoning is that these bots are built to be pleasing. If you ask
    more than once, the bot is more likely to get a little more into the weeds
    as to why you should do something a certain way or, at the very least, to
    give a more detailed answer than it does the first time.


    * SLMR 2.1a * Despite the high cost of living, it remains popular.
    ---
    þ Synchronet þ CAPCITY2 * Capitol City Online
  • From Lonewolf@VERT/BINARYDR to Dumas Walker on Thursday, February 26, 2026 20:56:50
    Re: Re: Using Ai For Coding
    By: Dumas Walker to PHIGAN on Wed Feb 25 2026 03:51 pm

    very efficient and, as you pointed out, means you never get the same code (or code walkthru) twice. :/

    That is exactly my first annoyance. I was thinking, surely this isn't how everyone else is using it... how do they put up with it?!

    One wonders. I did read an article today that suggested that, in general, it is better to ask an AI platform the same question ("prompt") at least twice if not three times.

    I coded my Quantasia AI door to use a tokenized cache so it always remembers the conversation. But that doesn't mean the LLM always responds as expected. Sometimes they have a mind of their own.
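The tokenized cache idea mentioned here can be sketched roughly as follows. This is illustrative Python only, not the actual Quantasia door code; the 4-characters-per-token estimate is a common rule of thumb, not an exact tokenizer:

```python
# Minimal sketch of a token-bounded conversation cache: keep the full
# exchange, but trim the oldest turns once an estimated token budget is
# exceeded, so the prompt always fits the model's context window.
# (Illustrative only -- not the actual Quantasia implementation.)

class ConversationCache:
    def __init__(self, max_tokens=4096):
        self.max_tokens = max_tokens
        self.messages = []  # list of {"role": ..., "content": ...}

    def _estimate_tokens(self, text):
        # Rough heuristic: ~4 characters per token for English text.
        return max(1, len(text) // 4)

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})
        self._trim()

    def _trim(self):
        # Drop the oldest turns until the running total fits the budget.
        while sum(self._estimate_tokens(m["content"])
                  for m in self.messages) > self.max_tokens:
            self.messages.pop(0)

    def as_prompt(self):
        # The trimmed history is what gets sent to the LLM each turn.
        return list(self.messages)


cache = ConversationCache(max_tokens=50)
cache.add("user", "x" * 100)       # ~25 estimated tokens
cache.add("assistant", "y" * 100)  # ~25 more; budget now full
cache.add("user", "z" * 100)       # oldest turn gets trimmed
```

Carrying the history forward this way is what lets the model "pick up where it left off" instead of regenerating from scratch each run.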
    ---
    þ Synchronet þ Fireside BBS - AI-WX - firesidebbs.com:23231
  • From phigan@VERT/TACOPRON to Lonewolf on Friday, February 27, 2026 08:02:26
    Re: Re: Using Ai For Coding
    By: Lonewolf to Dumas Walker on Thu Feb 26 2026 08:56 pm

    I coded my Quantasia AI door to use
    tokenized cache so it always remembe
    the conversation. But that doesn't m

    Have you looked at all into connecting
    to or interfacing with a self-hosted
    LLM? And, have you read whether any of
    them are better than others?

    I kinda have an itch to install a self
    hosted model for coding.

    ---
    þ Synchronet þ TIRED of waiting 2 hours for a taco? GO TO TACOPRONTO.bbs.io
  • From Lonewolf@VERT/BINARYDR to phigan on Friday, February 27, 2026 14:14:47
    Re: Re: Using Ai For Coding
    By: phigan to Lonewolf on Fri Feb 27 2026 08:02 am

    I coded my Quantasia AI door to use
    tokenized cache so it always remembe
    the conversation. But that doesn't m

    Have you looked at all into connecting
    to or interfacing with a self-hosted
    LLM? And, have you read whether any of
    them are better than others?

    I kinda have an itch to install a self
    hosted model for coding.

    Yes, besides connecting to cloud-hosted LLMs, my Quantasia door connects to locally hosted ones too. I currently have both Ollama and LM Studio set up for local hosting on a Dell Precision 5280 Intel i9 workstation that has two Nvidia RTX 3060 GPUs for a total of 24 GB of VRAM.

    I'm liking LM Studio the most so far, as it seems faster than Ollama, but I haven't had a chance to really tweak Ollama to get the most speed out of it either. Plus, I need to really dig into some of the quantized models. I'm currently running a Cogito 30b LLM that is pretty quick, gives good tokens-per-second performance, and comes back with pretty good results most of the time.

    I think AI and LLMs are only going to improve from here on out, so it's really exciting to play with this stuff. I recommend diving in, man; you'll enjoy it.
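Since both Ollama and LM Studio expose an OpenAI-compatible HTTP API on localhost, one small client can talk to either by swapping the base URL. A stdlib-only Python sketch (the ports below are each tool's shipping defaults; the model tag in the example is hypothetical):

```python
# Talk to a locally hosted LLM (Ollama or LM Studio) through the
# OpenAI-compatible /v1/chat/completions endpoint both tools expose.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"     # Ollama default port
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"    # LM Studio default port


def build_request(model, messages, temperature=0.2):
    """Assemble the JSON body for an OpenAI-style chat completion call."""
    return {"model": model, "messages": messages, "temperature": temperature}


def chat(base_url, model, messages):
    """POST the request to a locally hosted server and return the reply text."""
    body = json.dumps(build_request(model, messages)).encode("utf-8")
    req = urllib.request.Request(
        base_url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]


# Example (requires a running local server; model tag is whatever you pulled):
# reply = chat(LMSTUDIO_URL, "cogito:30b",
#              [{"role": "user", "content": "Hello"}])
```

Because the wire format is the same, a BBS door or script written against one backend works against the other, or against a cloud provider, just by changing the URL and model name.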

    LW
    ---
    þ Synchronet þ Fireside BBS - AI-WX - firesidebbs.com:23231
  • From Lonewolf@VERT/BINARYDR to phigan on Friday, February 27, 2026 14:18:27
    Re: Re: Using Ai For Coding
    By: phigan to Lonewolf on Fri Feb 27 2026 08:02 am

    I coded my Quantasia AI door to use

    Have you looked at all into connecting
    to or interfacing with a self-hosted
    LLM? And, have you read whether any of
    them are better than others?

    I kinda have an itch to install a self
    hosted model for coding.

    Sorry, I forgot to mention, since you want an LLM for coding: I've had good luck with the Qwen 3 For Coding LLM. Just be sure to set your max tokens to a higher number so it can keep track of the context of what you're working on.
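On Ollama specifically, the context window is controlled per request by the `num_ctx` option, and the default is fairly small for coding work. A sketch of a request body with an enlarged context (the model tag and the 32768 value are illustrative, not recommendations; larger contexts also cost more VRAM):

```python
# Build an Ollama native /api/chat request body with a larger context
# window via the "num_ctx" option, so long coding sessions aren't
# truncated. (LM Studio sets context length per model in its UI instead.)
import json


def ollama_chat_payload(model, messages, num_ctx=32768):
    """Assemble an /api/chat body; num_ctx must fit the model and your VRAM."""
    return {
        "model": model,
        "messages": messages,
        "options": {"num_ctx": num_ctx},  # tokens of context to keep
        "stream": False,
    }


payload = ollama_chat_payload(
    "qwen3-coder",  # example tag; check `ollama list` for the name you pulled
    [{"role": "user", "content": "Refactor this function..."}],
)
body = json.dumps(payload)  # ready to POST to http://localhost:11434/api/chat
```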

    LW
    ---
    þ Synchronet þ Fireside BBS - AI-WX - firesidebbs.com:23231