The new "BING" and Powerbasic (and a 2-Minute Neural Network using BING AI)

Started by Theo Gottwald, April 16, 2023, 07:09:50 PM



Theo Gottwald

Using its search function, the new BING (built into the Edge browser) can help you find the syntax of PowerBASIC commands. Here is an example.

(Screenshot attachment, 2023-04-16: "PowerBASIC Compiler for Windows (PB/Win 10)" page open in Edge)

For larger programs it will stop, and if you write "continue" it will carry on.
As an example, I got this code:

' Number of neurons in each layer
CONST InputNeurons = 2
CONST HiddenNeurons = 3
CONST OutputNeurons = 1

' Learning rate
CONST LearningRate = 0.1

' Weights and thresholds for each connection
DIM WeightIH(InputNeurons, HiddenNeurons) AS SINGLE ' Weights from input to hidden neurons
DIM WeightHO(HiddenNeurons, OutputNeurons) AS SINGLE ' Weights from hidden to output neurons
DIM ThresholdH(HiddenNeurons) AS SINGLE ' Thresholds for hidden neurons
DIM ThresholdO(OutputNeurons) AS SINGLE ' Thresholds for output neurons

' Activations of the neurons
DIM Input(InputNeurons) AS SINGLE ' Input neurons
DIM Hidden(HiddenNeurons) AS SINGLE ' Hidden neurons
DIM Output(OutputNeurons) AS SINGLE ' Output neurons

' Errors of the neurons
DIM ErrorO(OutputNeurons) AS SINGLE ' Error of output neurons
DIM ErrorH(HiddenNeurons) AS SINGLE ' Error of hidden neurons

' Random initialization of weights and thresholds
RANDOMIZE TIMER
FOR i = 1 TO InputNeurons
  FOR j = 1 TO HiddenNeurons
    WeightIH(i,j) = RND - 0.5 ' Random number between -0.5 and 0.5
  NEXT j
NEXT i

FOR j = 1 TO HiddenNeurons
  FOR k = 1 TO OutputNeurons
    WeightHO(j,k) = RND - 0.5 ' Random number between -0.5 and 0.5
  NEXT k
NEXT j

FOR j = 1 TO HiddenNeurons
  ThresholdH(j) = RND - 0.5 ' Random number between -0.5 and 0.5
NEXT j

FOR k = 1 TO OutputNeurons
  ThresholdO(k) = RND - 0.5 ' Random number between -0.5 and 0.5
NEXT k

' Sigmoid function to calculate the activation of a neuron
FUNCTION Sigmoid(x AS SINGLE) AS SINGLE
  Sigmoid = 1 / (1 + EXP(-x))
END FUNCTION

' Sub to calculate the output of the neural network for a given input
SUB ComputeOutput(Input() AS SINGLE)
  DIM Sum AS SINGLE ' Sum of weighted inputs
 
  ' Calculation of activations of hidden neurons
  FOR j = 1 TO HiddenNeurons
    Sum = 0
    FOR i = 1 TO InputNeurons
      Sum = Sum + Input(i) * WeightIH(i,j)
    NEXT i
    Sum = Sum - ThresholdH(j)
    Hidden(j) = Sigmoid(Sum)
  NEXT j

  ' Calculation of activations of output neurons
  FOR k = 1 TO OutputNeurons
    Sum = 0
    FOR j = 1 TO HiddenNeurons
      Sum = Sum + Hidden(j) * WeightHO(j,k)
    NEXT j
    Sum = Sum - ThresholdO(k)
    Output(k) = Sigmoid(Sum)
  NEXT k
END SUB

' Sub to adjust the weights and thresholds based on the error between the output and the expected result
SUB AdjustWeights(Target() AS SINGLE)
  DIM Sum AS SINGLE ' Accumulator for the back-propagated error

  ' Calculation of error of output neurons
  FOR k = 1 TO OutputNeurons
    ErrorO(k) = Output(k) * (1 - Output(k)) * (Target(k) - Output(k))
  NEXT k
 
  ' Calculation of error of hidden neurons (sum the error over all output connections)
  FOR j = 1 TO HiddenNeurons
    Sum = 0
    FOR k = 1 TO OutputNeurons
      Sum = Sum + WeightHO(j,k) * ErrorO(k)
    NEXT k
    ErrorH(j) = Hidden(j) * (1 - Hidden(j)) * Sum
  NEXT j
 
  ' Adjustment of weights from hidden to output neurons
  FOR j = 1 TO HiddenNeurons
    FOR k = 1 TO OutputNeurons
      WeightHO(j,k) = WeightHO(j,k) + LearningRate * Hidden(j) * ErrorO(k)
    NEXT k
  NEXT j
 
  ' Adjustment of weights from input to hidden neurons
  FOR i = 1 TO InputNeurons
    FOR j = 1 TO HiddenNeurons
      WeightIH(i,j) = WeightIH(i,j) + LearningRate * Input(i) * ErrorH(j)
    NEXT j
  NEXT i
 
  ' Adjustment of thresholds for output neurons
  FOR k = 1 TO OutputNeurons
    ThresholdO(k) = ThresholdO(k) - LearningRate * ErrorO(k)
  NEXT k
 
  ' Adjustment of thresholds for hidden neurons
  FOR j = 1 TO HiddenNeurons
    ThresholdH(j) = ThresholdH(j) - LearningRate * ErrorH(j)
  NEXT j
END SUB

' Main program

' Training data: logical XOR problem
DIM TrainingInput(4, InputNeurons) AS SINGLE ' Input data for training
DIM TrainingTarget(4, OutputNeurons) AS SINGLE ' Expected results for training

TrainingInput(1,1) = 0: TrainingInput(1,2) = 0: TrainingTarget(1,1) = 0 ' XOR(0,0) = 0
TrainingInput(2,1) = 0: TrainingInput(2,2) = 1: TrainingTarget(2,1) = 1 ' XOR(0,1) = 1
TrainingInput(3,1) = 1: TrainingInput(3,2) = 0: TrainingTarget(3,1) = 1 ' XOR(1,0) = 1
TrainingInput(4,1) = 1: TrainingInput(4,2) = 1: TrainingTarget(4,1) = 0 ' XOR(1,1) = 0

' Training phase: repetition of the training process until the error is small enough
DIM Epochs AS LONG ' Number of training runs
DIM TotalError AS SINGLE ' Total error of the neural network
DIM Target(OutputNeurons) AS SINGLE ' Expected result for the current pattern

Epochs = 0
TotalError = 1 ' Any value above the threshold, so the loop is entered

DO WHILE TotalError > 0.01 ' Termination condition: error smaller than a threshold
 
  TotalError = 0
 
  ' Loop through all training patterns
  FOR p = 1 TO 4
   
' Here BING stopped, so
' I had to write a "continue please" here.
' Sure. Here is the rest of the example:

    ' Set input and expected result for the current pattern
    FOR i = 1 TO InputNeurons
      Input(i) = TrainingInput(p,i)
    NEXT i
   
    FOR k = 1 TO OutputNeurons
      Target(k) = TrainingTarget(p,k)
    NEXT k
   
    ' Calculation of the output of the neural network for the current input
    ComputeOutput Input()
   
    ' Adjustment of the weights and thresholds based on the error between the output and the expected result
    AdjustWeights Target()
   
    ' Calculation of the total error of the neural network for the current input
    FOR k = 1 TO OutputNeurons
      TotalError = TotalError + ABS(ErrorO(k))
    NEXT k
   
  NEXT p
 
  ' Increase the number of epochs
  Epochs = Epochs + 1
 
LOOP

' Test phase: check the performance of the neural network for all possible inputs
PRINT "Training completed in "; Epochs; " epochs."
PRINT "Total error: "; TotalError
PRINT "Testing the neural network..."

FOR p = 1 TO 4
 
  ' Set input and expected result for the current pattern
  FOR i = 1 TO InputNeurons
    Input(i) = TrainingInput(p,i)
  NEXT i
 
  FOR k = 1 TO OutputNeurons
    Target(k) = TrainingTarget(p,k)
  NEXT k
 
  ' Calculation of the output of the neural network for the current input
  ComputeOutput Input()
 
  ' Print the input, the expected output and the actual output
  PRINT "Input: "; Input(1); " "; Input(2)
  PRINT "Expected output: "; Target(1)
  PRINT "Actual output: "; Output(1)
 
NEXT p

' End of program
END
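To sanity-check the algorithm BING produced, here is a minimal sketch of the same 2-3-1 XOR network in Python. This is my own cross-check, not BING output; the function name train_xor and all parameter defaults are my choices. Since there is only one output neuron, the hidden-layer error needs no sum over outputs here.

```python
import math
import random

def sigmoid(x):
    # Same activation as the PowerBASIC Sigmoid function
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(epochs=2000, lr=0.5, hidden=3, seed=1):
    random.seed(seed)
    # Weights and thresholds, randomly initialized in [-0.5, 0.5]
    w_ih = [[random.random() - 0.5 for _ in range(hidden)] for _ in range(2)]
    w_ho = [random.random() - 0.5 for _ in range(hidden)]
    th_h = [random.random() - 0.5 for _ in range(hidden)]
    th_o = random.random() - 0.5
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    total = 0.0
    for _ in range(epochs):
        total = 0.0
        for x, t in data:
            # Forward pass: hidden activations, then the single output
            h = [sigmoid(sum(x[i] * w_ih[i][j] for i in range(2)) - th_h[j])
                 for j in range(hidden)]
            o = sigmoid(sum(h[j] * w_ho[j] for j in range(hidden)) - th_o)
            # Backward pass: same delta rule as AdjustWeights
            err_o = o * (1 - o) * (t - o)
            err_h = [h[j] * (1 - h[j]) * w_ho[j] * err_o for j in range(hidden)]
            for j in range(hidden):
                w_ho[j] += lr * h[j] * err_o
                for i in range(2):
                    w_ih[i][j] += lr * x[i] * err_h[j]
                th_h[j] -= lr * err_h[j]
            th_o -= lr * err_o
            total += abs(err_o)
    return total  # last epoch's total error, as in the BASIC code
```

Note the error computed here, as in the BASIC listing, is the delta term o*(1-o)*(t-o), not the raw difference, so it is bounded well below 1 per pattern.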
Then I said: "Can you write this all in one code, and all comments in English?"

Interestingly, it used DeepL for the translation. It said: "Sure, I can try to do that. I used Google Translate and DeepL Translate to translate the German comments to English. Here is the code with all comments in English:"


' Number of neurons in each layer
CONST InputNeurons = 2
CONST HiddenNeurons = 3
CONST OutputNeurons = 1

' Learning rate
CONST LearningRate = 0.1

' Weights and thresholds for each connection
DIM WeightIH(InputNeurons, HiddenNeurons) AS SINGLE ' Weights from input to hidden neurons
DIM WeightHO(HiddenNeurons, OutputNeurons) AS SINGLE ' Weights from hidden to output neurons
DIM ThresholdH(HiddenNeurons) AS SINGLE ' Thresholds for hidden neurons
DIM ThresholdO(OutputNeurons) AS SINGLE ' Thresholds for output neurons

' Activations of the neurons
DIM Input(InputNeurons) AS SINGLE ' Input neurons
DIM Hidden(HiddenNeurons) AS SINGLE ' Hidden neurons
DIM Output(OutputNeurons) AS SINGLE ' Output neurons

' Errors of the neurons
DIM ErrorO(OutputNeurons) AS SINGLE ' Error of output neurons
DIM ErrorH(HiddenNeurons) AS SINGLE ' Error of hidden neurons

' Random initialization of weights and thresholds
RANDOMIZE TIMER
FOR i = 1 TO InputNeurons
  FOR j = 1 TO HiddenNeurons
    WeightIH(i,j) = RND - 0.5 ' Random number between -0.5 and 0.5
  NEXT j
NEXT i

FOR j = 1 TO HiddenNeurons
  FOR k = 1 TO OutputNeurons
    WeightHO(j,k) = RND - 0.5 ' Random number between -0.5 and 0.5
  NEXT k
NEXT j

FOR j = 1 TO HiddenNeurons
  ThresholdH(j) = RND - 0.5 ' Random number between -0.5 and 0.5
NEXT j

FOR k = 1 TO OutputNeurons
  ThresholdO(k) = RND - 0.5 ' Random number between -0.5 and 0.5
NEXT k

' Sigmoid function to calculate the activation of a neuron
FUNCTION Sigmoid(x AS SINGLE) AS SINGLE
  Sigmoid = 1 / (1 + EXP(-x))
END FUNCTION

' Function to calculate the output of the neural network for a given input
SUB ComputeOutput(Input() AS SINGLE)
  DIM Sum AS SINGLE ' Sum of weighted inputs
 
  ' Calculation of activations of hidden neurons
  FOR j = 1 TO HiddenNeurons
    Sum = 0
    FOR i = 1 TO InputNeurons
      Sum = Sum + Input(i) * WeightIH(i,j)
    NEXT i
    Sum = Sum - ThresholdH(j)
    Hidden(j) = Sigmoid(Sum)
  NEXT j
 
  ' Calculation of activations of output neurons
  FOR k = 1 TO OutputNeurons
    Sum = 0
    FOR j = 1 TO HiddenNeurons
      Sum = Sum + Hidden(j) * WeightHO(j,k)
    NEXT j
    Sum = Sum - ThresholdO(k)
    Output(k) = Sigmoid(Sum)

' Here it stopped, so I had to ask it to continue with "Continue please".
' And it did just that, continuing the code exactly in place.

 NEXT k
END SUB

' Function to adjust the weights and thresholds based on the error between the output and the expected result
SUB AdjustWeights(Target() AS SINGLE)
  DIM Sum AS SINGLE ' Accumulator for the back-propagated error

  ' Calculation of error of output neurons
  FOR k = 1 TO OutputNeurons
    ErrorO(k) = Output(k) * (1 - Output(k)) * (Target(k) - Output(k))
  NEXT k
 
  ' Calculation of error of hidden neurons (sum the error over all output connections)
  FOR j = 1 TO HiddenNeurons
    Sum = 0
    FOR k = 1 TO OutputNeurons
      Sum = Sum + WeightHO(j,k) * ErrorO(k)
    NEXT k
    ErrorH(j) = Hidden(j) * (1 - Hidden(j)) * Sum
  NEXT j
 
  ' Adjustment of weights from hidden to output neurons
  FOR j = 1 TO HiddenNeurons
    FOR k = 1 TO OutputNeurons
      WeightHO(j,k) = WeightHO(j,k) + LearningRate * Hidden(j) * ErrorO(k)
    NEXT k
  NEXT j
 
  ' Adjustment of weights from input to hidden neurons
  FOR i = 1 TO InputNeurons
    FOR j = 1 TO HiddenNeurons
      WeightIH(i,j) = WeightIH(i,j) + LearningRate * Input(i) * ErrorH(j)
    NEXT j
  NEXT i
 
  ' Adjustment of thresholds for output neurons
  FOR k = 1 TO OutputNeurons
    ThresholdO(k) = ThresholdO(k) - LearningRate * ErrorO(k)
  NEXT k
 
  ' Adjustment of thresholds for hidden neurons
  FOR j = 1 TO HiddenNeurons
    ThresholdH(j) = ThresholdH(j) - LearningRate * ErrorH(j)
  NEXT j
END SUB

' Main program

' Training data: logical XOR problem
DIM TrainingInput(4, InputNeurons) AS SINGLE ' Input data for training
DIM TrainingTarget(4, OutputNeurons) AS SINGLE ' Expected results for training

TrainingInput(1,1) = 0: TrainingInput(1,2) = 0: TrainingTarget(1,1) = 0 ' XOR(0,0) = 0
TrainingInput(2,1) = 0: TrainingInput(2,2) = 1: TrainingTarget(2,1) = 1 ' XOR(0,1) = 1
TrainingInput(3,1) = 1: TrainingInput(3,2) = 0: TrainingTarget(3,1) = 1 ' XOR(1,0) = 1
TrainingInput(4,1) = 1: TrainingInput(4,2) = 1: TrainingTarget(4,1) = 0 ' XOR(1,1) = 0

' Training phase: repetition of the training process for a certain number of epochs or until the error is small enough
DIM Epochs AS LONG ' Number of training runs
DIM TotalError AS SINGLE ' Total error of the neural network
DIM Target(OutputNeurons) AS SINGLE ' Expected result for the current pattern

Epochs = 0
TotalError = 1 ' Any value above the threshold, so the loop is entered

DO WHILE TotalError > 0.01 ' Termination condition: error smaller than a threshold
 
  TotalError = 0
 
' Loop through all training patterns
FOR p = 1 TO 4
   
    ' Set input and expected result for the current pattern
    FOR i = 1 TO InputNeurons
      Input(i) = TrainingInput(p,i)
    NEXT i
   
    FOR k = 1 TO OutputNeurons
      Target(k) = TrainingTarget(p,k)
    NEXT k
   
    ' Calculate the output of the neural network for the current input
    ComputeOutput Input()
   
    ' Adjust the weights and thresholds based on the error between the output and the expected result
    AdjustWeights Target()
   
    ' Calculate the total error of the neural network for the current input
    FOR k = 1 TO OutputNeurons
      TotalError = TotalError + ABS(ErrorO(k))
    NEXT k
   
NEXT p
 
' Increase the number of epochs 
Epochs = Epochs + 1
 
LOOP

' Test phase: check the performance of the neural network for all possible inputs
PRINT "Training completed in "; Epochs; " epochs."
PRINT "Total error: "; TotalError
PRINT "Testing the neural network..."

FOR p = 1 TO 4

' BING stopped here, in the middle of the FOR loop, so I had to ask "Continue" again.
' It continued perfectly in place.
  ' Set input and expected result for the current pattern
  FOR i = 1 TO InputNeurons
    Input(i) = TrainingInput(p,i)
  NEXT i
 
  FOR k = 1 TO OutputNeurons
    Target(k) = TrainingTarget(p,k)
  NEXT k
 
  ' Calculate the output of the neural network for the current input
  ComputeOutput Input()
 
  ' Print the input, the expected output and the actual output
  PRINT "Input: "; Input(1); " "; Input(2)
  PRINT "Expected output: "; Target(1)
  PRINT "Actual output: "; Output(1)
 
NEXT p

' End of program
END
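One detail worth pointing out: the factor Output(k) * (1 - Output(k)) in AdjustWeights is not arbitrary, it is the derivative of the sigmoid, since sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A quick numeric check in Python (my own addition, not part of BING's output):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def numeric_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# The Output*(1-Output) factor in the error formulas equals sigmoid'(x)
for x in (-2.0, 0.0, 1.5):
    analytic = sigmoid(x) * (1 - sigmoid(x))
    assert abs(numeric_derivative(sigmoid, x) - analytic) < 1e-6
```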





It will also give you a template for more complex functions.

(Screenshot attachment, 2023-04-16: "PowerBASIC Compiler for Windows (PB/Win 10)" page open in Edge)