DeepSeek AI

Here you can talk about anything related to BBC BASIC, not covered in another category
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

DeepSeek AI

Post by Richard Russell »

Has anybody tried asking the new DeepSeek AI to write a BBC BASIC program? It is claimed to be better than some others at coding, and I've seen a mind-boggling example of it creating near-perfect Arduino code. Watching it 'reasoning' and 'debating' with itself what approach to use, before actually writing the code, was quite spooky. :shock:

My past experience has been that AIs aren't very competent at writing BBC BASIC code. I suspect there are two main reasons for this: firstly the quantity of BBC BASIC code in their training data is far less than more popular languages; secondly BBC BASIC isn't formally standardised and they don't understand the nuances of the different versions and platforms.

Nevertheless a good AI may well be helpful, even if producing a fully-working program is too much to expect. I'd be interested to learn of people's experience with DeepSeek if they've tried it. The recommendation seems to be to explain the requirements in as much detail as you possibly can, even if that makes the prompt quite lengthy.

And if the code doesn't work, or you can see without even trying it that it can't work, explain to the AI as best you can what's wrong, because that will improve the chance of it managing to correct its mistakes. Personally I find it disconcerting to hold a 'conversation' with an AI, but it is what they are good at!
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

On 30/01/2025 09:40, terryswanb via groups.io wrote (cross-posted from the Discussion Group):
you must be psychic 🙂 as your message came in I was trying DeepSeek AI with BBC BASIC ?
Perhaps DeepSeek projected the idea into my mind. You can't trust the Chinese! :evil:
I have tried
CHATGPT (free version)
GROK

both of these made silly mistakes, like trying to return more than one value from a function using the RETURN command
instead of a single equals sign. Also the code seemed to be in random places, functions etc.
There's one situation when I would consider it legitimate to use RETURN with a DEF FN, and that's when you want to return multiple values plus a Boolean to indicate success or failure. The output values would be returned in the 'pass by reference' parameters and the true/false result in the 'equals' statement. Not that I expect that's what the AIs were actually trying to do. ;)
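For example, something along these lines (a made-up sketch purely to illustrate the idiom; FN_divmod is my own invention, not anything the AIs wrote):

Code: Select all

      DEF FN_divmod(n%, d%, RETURN q%, RETURN r%)
      REM Returns TRUE on success, FALSE if the divisor is zero; the
      REM quotient and remainder come back via the RETURN parameters.
      IF d% = 0 THEN = FALSE
      q% = n% DIV d%
      r% = n% MOD d%
      = TRUE
It would then be called as, for example, IF FN_divmod(17, 5, quot%, rmdr%) THEN PRINT quot%, rmdr%.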
I wanted to try DeepSeek, and I had been looking at the Separating Axis Theorem (SAT). This was taking me quite a while to get my head around, so I thought it would be the perfect test. I first asked the AI just to create a program to show if 2 triangles were colliding using SAT. It created a program that worked first time!

I then asked if it could show the collision points and it warned me that it was much more complicated but it could do it.

I have uploaded the programs at the address below if anyone is interested.

The only changes I have made to the code are:
added new start positions for the 2 triangles, and added rotation so I could see the collisions in real time.
All the comments and the rest of the code come from DeepSeek.
Impressive. It does seem to have made a mis-step in the FNMAX() and FNMIN() functions, however, because it has assumed that a Boolean TRUE is +1 (which it is, for example, in C) when in fact it's -1 in BBC BASIC:

Code: Select all

DEF FNMAX(a,b) = a*(a>=b) + b*(b>a)
DEF FNMIN(a,b) = a*(a<=b) + b*(b<a)
If I'm not mistaken it will negate the wanted return value! I'm also not at all sure that its decision not to use IF...THEN here is actually justified; in my judgement that would be easier to read and probably faster.
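For what it's worth, a corrected sketch (mine, not DeepSeek's, and not tested against the rest of the program) would negate the comparisons, since TRUE is -1:

Code: Select all

DEF FNMAX(a,b) = -a*(a>=b) - b*(b>a)
DEF FNMIN(a,b) = -a*(a<=b) - b*(b<a)
Or just use IF...THEN, which to my mind reads better anyway.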
In the zipfile I have added thinking.txt. This is the output from DeepSeek when it's thinking through a problem (quite spooky to watch).

https://www.mbelect.co.uk/temp/BBC4W.zip
Fascinating. I would note that this comment is entirely wrong for BBC BASIC: "I should also make sure to define these functions before they are used in the code. So placing the DEF FNMAX and DEF FNMIN at the beginning of the program, perhaps right after setting up the screen mode". In BBC BASIC, function and procedure definitions are found at run time, so conventionally they go after the END of the main program and certainly don't need to precede their first use.
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

Richard Russell wrote: ↑Thu 30 Jan 2025, 12:53 On 30/01/2025 09:40, terryswanb via groups.io wrote (cross-posted from the Discussion Group):
I then asked if it could show the collision points and it warned me that it was much more complicated but it could do it.
You can run the program in your browser here. The only changes I have made are to reduce the window size (which was too big for my laptop) and to use *REFRESH to eliminate the flicker (I don't blame DeepSeek for not including that because it's not generic BBC BASIC).
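For anyone who hasn't used it, the *REFRESH idiom looks roughly like this (just a sketch, not the code from the linked program):

Code: Select all

      *REFRESH OFF
      REPEAT
        CLS
        REM ... move and draw the two triangles here ...
        REM Update the screen in one go, eliminating flicker:
        *REFRESH
        WAIT 2
      UNTIL FALSE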
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

One problem with AIs is that they tend to 'overthink' things, and if you point out a bug or a weakness in the generated code their 'instinct' is to add more complexity rather than to simplify.

For example, in the code generated by DeepSeek there is a completely unnecessary, and relatively expensive, range reduction to bring the angle into the range 0-360° (0-2π); I don't know of any common language which requires that:

Code: Select all

 1950 DEFFNcosrad(centre,angle,radius)
 1960 IF angle<0:angle=angle+360
 1970 IF angle>360:angle=angle-360
 1980 REM PRINT "cos ";angle
 1990 = centre+(radius*COSRAD angle)
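Since COS and RAD (which I assume is what 'COSRAD' is: the two keywords run together in the listing) accept any angle, the range reduction can simply be dropped and the whole function becomes a one-liner, something like this (my simplification, not DeepSeek's):

Code: Select all

      DEF FNcosrad(centre,angle,radius) = centre + radius * COS(RAD(angle))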
Assuming it was in the original program I'm quite surprised (but pleased) that DeepSeek used an EXIT FOR, because this is an extension present only in my BASICs and Matrix Brandy:

Code: Select all

  730     IF max1 < min2 OR max2 < min1 THEN
  740       collision% = FALSE
  750       EXIT FOR
  760     ENDIF
Perhaps not surprisingly, DeepSeek doesn't seem to know which versions of BBC BASIC have which extensions, because the presence of EXIT implies acceptance of compound assignment operators, but it still codes:

Code: Select all

angle=angle+360
rather than:

Code: Select all

angle+=360
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

On 30/01/2025 09:40, terryswanb via groups.io wrote:
In the zipfile I have added thinking.txt. This is the output from DeepSeek when it's thinking through a problem (quite spooky to watch).
I think the spookiness arises in part from the 'conversational' nature of the responses, such as saying "I remember". I wondered whether this behaviour was programmed in by the AI developers or 'emerges' from the training data, so I asked an AI (Copilot in this case):

"Great question! Phrases like "I remember" and "Let me think" are typically designed to make AI interactions feel more natural and human-like. These phrases can be programmed in by developers to enhance the conversational experience. However, the ability to use such phrases effectively can also emerge from the AI's training data, as it learns from vast amounts of text how humans typically converse.

So, it's a combination of both: intentional design by developers and the AI's learning from its training data. This helps create a more engaging and relatable interaction for users."

So hedging its bets, but interesting nevertheless (and, of course, a beautifully written piece of prose).
Edja
Posts: 64
Joined: Tue 03 Apr 2018, 12:07
Location: Belgium

Re: DeepSeek AI

Post by Edja »

Fiddling around with DeepSeek to create BB4W code

The exchange about DeepSeek generating BB4W (or BBC BASIC for SDL) code is very interesting. The underlying topics, mostly very mathematical, were “out of my comfort zone”, and maybe others have the same feeling. But they intrigued me enough to make some simple attempts at putting DeepSeek to work for me.
At first I hesitated to post this on the forum because what I was doing is so trivial. Then I decided to share this attempt anyway, for the benefit of those who would appreciate a simple down-to-earth example of what can be accomplished with DeepSeek and BB4W.

Some years ago I programmed a procedure to compare two arrays of structures. The procedure works fine, but I have to admit I always felt I had done a clumsy job. On the other hand: if it works, don't fix it.
But now, with DeepSeek available, I wanted to explore whether AI could do a better job in a fraction of the time. So this is what I asked DeepSeek to do:

Make a procedure in BBC Basic for Windows that compares two arrays. The first array is defined as DIM Tyd{(AantalEffTyd)Tick$,Curr$,Xrate,KoersCurr,SlotCurr,Risk$,Adv$,Date$,Hour$,Markt$}. The second array is defined as DIM All{(AantalEffAll)Tick$,Name$,Isin$,Curr$,Qty,Xrate,Adv$}
Each element of Tyd{} must also be present in All{} and also the other way around. The criterium to check this is the concatenation of Tick$+Curr$. Give a message if an element is missing.


The result came within seconds. Here is the listing (and it works as intended):

Code: Select all

      REM Procedure to compare two global arrays Tyd{} and All{}
      REM Ensure the arrays are already defined and populated
      REM Tyd{} structure: DIM Tyd{(AantalEffTyd)Tick$,Curr$,Xrate,KoersCurr,SlotCurr,Risk$,Adv$,Date$,Hour$,Markt$}
      REM All{} structure: DIM All{(AantalEffAll)Tick$,Name$,Isin$,Curr$,Qty,Xrate,Adv$}


      DEF PROC_CompareArrays
      LOCAL I%, J%, Found%, TydKey$, AllKey$
      PRINT "AantalEffTyd   =   "  AantalEffTyd
      REM Check if each element in Tyd{} exists in All{}
      FOR I% = 1 TO AantalEffTyd
        TydKey$ = Tyd{(I%)}.Tick$ + Tyd{(I%)}.Curr$
        Found% = FALSE
  
        FOR J% = 1 TO AantalEffAll
          AllKey$ = All{(J%)}.Tick$ + All{(J%)}.Curr$
          IF TydKey$ = AllKey$ THEN Found% = TRUE : EXIT FOR
        NEXT J%
  
        IF  Found%=FALSE THEN
          PRINT "Element in Tyd{} missing in All{}: " + Tyd{(I%)}.Tick$ + ", " + Tyd{(I%)}.Curr$ + ", " + "N/A"
        ENDIF
      NEXT I%

      REM Check if each element in All{} exists in Tyd{}
      FOR I% = 1 TO AantalEffAll
        AllKey$ = All{(I%)}.Tick$ + All{(I%)}.Curr$
        Found% = FALSE
  
        FOR J% = 1 TO AantalEffTyd
          TydKey$ = Tyd{(J%)}.Tick$ + Tyd{(J%)}.Curr$
          IF AllKey$ = TydKey$ THEN Found% = TRUE : EXIT FOR
        NEXT J%
  
        IF  NOT Found% THEN
          PRINT "Element in All{} missing in Tyd{}: " + All{(I%)}.Tick$ + ", " + All{(I%)}.Curr$ + ", " + All{(I%)}.Name$
        ENDIF
      NEXT I%

      ENDPROC
In the first attempt DeepSeek produced code assuming the first array element to be at index 0. Once I specified "Array element 0 is not used" DeepSeek made the required modification, as shown above.
So iterations may be necessary to arrive at a working result.
In a later iteration I asked it to use lowercase letters for LOCAL variables and to avoid % at the end of a variable name unless strictly required. That too produced functioning code (not shown here, as I wanted to illustrate what can be expected from a minimal set of directives).

Admittedly this is simple code but it shows the potential and I’m quite enthusiastic about the other ideas I want to try out.

So I tried to make DeepSeek produce a BB4W program that reads from and writes to Excel files using COMLIB. It was a learning process both for me and for DeepSeek. Again, I had to correct DeepSeek on a few things: for example, in the first attempt it didn't add the PROC_cominit needed to initialize COMLIB. But further on DeepSeek made improvements, and it produced the code to activate Excel, open an Excel file, write a number in cell A3 and format it in bold characters, then close and save the file.
Then I started to experiment further and tried to format cell A3 with red characters and a yellow background. No success yet.
For now I have no need to modify my existing programs but purely out of curiosity I may do some more experimenting. If something interesting comes out of this and if there is an interest I'll post it here.
Edja
Posts: 64
Joined: Tue 03 Apr 2018, 12:07
Location: Belgium

Re: DeepSeek AI

Post by Edja »

My past experience has been that AIs aren't very competent at writing BBC BASIC code. I suspect there are two main reasons for this: firstly the quantity of BBC BASIC code in their training data is far less than more popular languages; secondly BBC BASIC isn't formally standardised and they don't understand the nuances of the different versions and platforms.
Wouldn’t it be a good idea to feed the BB4W and BBC SDL documentation to DeepSeek: the manuals as well as the documentation of the libraries, the programming examples and so on available on the Wiki pages?
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

Edja wrote: ↑Mon 10 Feb 2025, 11:55 The result came within seconds. Here is the listing (and it works as intended):
I'm surprised that DeepSeek was able to understand structure arrays at all, to be honest, since these are an extension unique to my 'modern' versions and not what most people would call part of the 'BBC BASIC language' as such. It makes me wonder where it has trawled that knowledge from. :o
So I tried to make DeepSeek produce a BB4W-program that reads from and writes to Excel files using COMLIB.
Again, a BB4W-specific library is so esoteric that I wouldn't have expected DeepSeek to have a clue, and the fact that you were able to 'educate' it to produce something useful is astonishing.
It was a learning process both for me and for DeepSeek....
It's a shame that you can't tell it to 'save' such newly-acquired 'knowledge' to use on a future occasion, but I understand why that capability is disabled.
If something interesting comes out of this and if there is an interest I'll post it here.
Please do. You tried things that I never would, because of their specificity to my versions of BBC BASIC and to Windows. It shows that using an AI as a kind of 'collaborator', with which to exchange ideas and cooperatively come up with a solution (as people have found useful in other areas), can work in the esoteric world of BBC BASIC programming too.

Maybe it's all a fake, and there are tens of thousands of Chinese people pretending to be an AI to impress the rest of the world!! :lol:
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

Edja wrote: ↑Mon 10 Feb 2025, 12:11 Wouldn’t it be a good idea to feed the BB4W and BBC SDL documentation to DeepSeek
One can understand why AI companies don't allow any old contributor to supply training data! As it is, I don't know what safeguards are in place to prevent the corruption of the AI with false, misleading or politically-biased information. Therein lie the risks of AI, as scarily portrayed in sci-fi novels and films, which we don't want to come true any time soon!
Richard Russell
Posts: 272
Joined: Tue 18 Jun 2024, 09:32

Re: DeepSeek AI

Post by Richard Russell »

Another interesting fact about modern AIs, including LLMs like DeepSeek, is that the main internal functional block is the Perceptron, a simulated neuron invented back in the 1950s (originally implemented using analogue electronics) which performs a weighted sum of its inputs followed by some non-linear transfer function to clip the output to the range 0.0-1.0.
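To put that into BBC BASIC terms, a single Perceptron is only a few lines; here is a minimal sketch of my own, with arbitrary weights chosen purely for illustration:

Code: Select all

      REM One simulated neuron with two inputs: weighted sum plus bias,
      REM then a sigmoid transfer function to squash the result into 0..1
      w1 = 0.8 : w2 = -0.4 : bias = 0.1
      x1 = 1 : x2 = 0
      total = x1*w1 + x2*w2 + bias
      output = 1 / (1 + EXP(-total))
      PRINT "Neuron output = "; output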

Although Perceptrons are now digital, and modern AIs contain huge numbers of them, the basic concept hasn't changed. Apart from sheer scale, the only major ways in which a modern AI differs from the original Perceptron are that Multi-Layer Perceptrons are now used (the original single-layer Perceptron could only solve linearly-separable problems) and that an extra stage called Attention is added to the process.

The complete AI works by alternately applying Attention and Perceptron steps in the hope that the result will iteratively converge to the required answer (in the case of an LLM that will be a single token or word, which is then output). By some 'magic' the scaling-up process imbues the AI with more capabilities than you might expect from its construction. :shock:

The simplest possible Multi-Layer Perceptron is one with two inputs and two layers. Famously, this can learn to carry out an exclusive-or operation on the inputs, a problem which is not linearly-separable and cannot be solved by a single-layer Perceptron. I asked DeepSeek to write BBC BASIC code for a two-input, two-layer Perceptron and this is what it produced. It worked first time:

Code: Select all

   10 DATA 0,0,0
   20 DATA 0,1,1
   30 DATA 1,0,1
   40 DATA 1,1,0
   50 DIM inputs(4,2), targets(4)
   60 FOR i% = 1 TO 4
   70   READ inputs(i%,1), inputs(i%,2), targets(i%)
   80 NEXT
   90 REM Initialize weights
  100 DIM W1(3,2), W2(3,1)
  110 FOR i% = 1 TO 3
  120   FOR j% = 1 TO 2
  130     W1(i%,j%) = RND(1) - 0.5
  140   NEXT
  150 NEXT
  160 FOR i% = 1 TO 3
  170   W2(i%,1) = RND(1) - 0.5
  180 NEXT
  190 eta = 0.5
  200 epochs% = 10000
  210 FOR epoch% = 1 TO epochs%
  220   FOR ex% = 1 TO 4
  230     x1 = inputs(ex%,1)
  240     x2 = inputs(ex%,2)
  250     t = targets(ex%)
  260     REM Forward pass to hidden layer
  270     h1_sum = x1*W1(1,1) + x2*W1(2,1) + W1(3,1)
  280     h1_act = 1 / (1 + EXP(-h1_sum))
  290     h2_sum = x1*W1(1,2) + x2*W1(2,2) + W1(3,2)
  300     h2_act = 1 / (1 + EXP(-h2_sum))
  310     REM Forward pass to output
  320     o_sum = h1_act*W2(1,1) + h2_act*W2(2,1) + W2(3,1)
  330     o_act = 1 / (1 + EXP(-o_sum))
  340     REM Compute output delta
  350     error = t - o_act
  360     delta_output = error * o_act * (1 - o_act)
  370     REM Compute hidden deltas
  380     delta_h1 = delta_output * W2(1,1) * h1_act * (1 - h1_act)
  390     delta_h2 = delta_output * W2(2,1) * h2_act * (1 - h2_act)
  400     REM Update output weights
  410     W2(1,1) = W2(1,1) + eta * delta_output * h1_act
  420     W2(2,1) = W2(2,1) + eta * delta_output * h2_act
  430     W2(3,1) = W2(3,1) + eta * delta_output
  440     REM Update hidden weights (h1)
  450     W1(1,1) = W1(1,1) + eta * delta_h1 * x1
  460     W1(2,1) = W1(2,1) + eta * delta_h1 * x2
  470     W1(3,1) = W1(3,1) + eta * delta_h1
  480     REM Update hidden weights (h2)
  490     W1(1,2) = W1(1,2) + eta * delta_h2 * x1
  500     W1(2,2) = W1(2,2) + eta * delta_h2 * x2
  510     W1(3,2) = W1(3,2) + eta * delta_h2
  520   NEXT ex%
  530 NEXT epoch%
  540 REM Test the network
  550 PRINT "Trained results:"
  560 FOR ex% = 1 TO 4
  570   x1 = inputs(ex%,1)
  580   x2 = inputs(ex%,2)
  590   h1_sum = x1*W1(1,1) + x2*W1(2,1) + W1(3,1)
  600   h1_act = 1 / (1 + EXP(-h1_sum))
  610   h2_sum = x1*W1(1,2) + x2*W1(2,2) + W1(3,2)
  620   h2_act = 1 / (1 + EXP(-h2_sum))
  630   o_sum = h1_act*W2(1,1) + h2_act*W2(2,1) + W2(3,1)
  640   o_act = 1 / (1 + EXP(-o_sum))
  650   PRINT "Input ";x1;" ";x2;" Output ";o_act
  660 NEXT
  670 END
This Perceptron has nine weights (6 in the first layer and 3 in the second layer), which are iteratively adjusted over 10,000 repetitions to closely approximate the optimum solution for the exclusive-or problem.

I also asked DeepSeek to code a two-layer Perceptron with more than two inputs, but that seemed to give it indigestion!