Integration of GPT4All into Oxygen?

Started by Theo Gottwald, April 14, 2023, 04:53:27 PM



Theo Gottwald

Actually, the installation of GPT4All is very simple, and it's free, even for commercial use.
That means it can be integrated into any other application.
What if we could train it for O2 and then integrate it into the editor?
This looks like a next-gen concept to me.
Another plus is that it runs completely locally and does not transmit data anywhere.
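For illustration, a minimal sketch of what local inference could look like through the GPT4All Python bindings (the model filename is an assumption; substitute any locally downloaded model):

# minimal local-inference sketch via the GPT4All Python bindings
# (pip install gpt4all); the model filename below is illustrative
from gpt4all import GPT4All

# loads a model from the local model folder; nothing leaves the machine
model = GPT4All("ggml-gpt4all-j-v1.3-groovy")

# an O2 editor integration would wrap a call like this
print(model.generate("Explain what OxygenBasic is in one sentence."))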


LINKS:
GPT4All Github: https://github.com/nomic-ai/gpt4all
GPT4All V2 Downloads - https://gpt4all.io/index.html
GPT4All Technical Report: https://static.nomic.ai/gpt4all
LangChain: https://docs.langchain.com/docs/

Charles Pegge

It won't be me, but anyone else is welcome to create such a package. Is there any AI below 1 MB? :)

Theo Gottwald

Quote from: Charles Pegge on April 14, 2023, 05:25:19 PM
It won't be me, but anyone else is welcome to create such a package. Is there any AI below 1 MB? :)

The current models are between 4 GB and 11 GB. :-)
But if it's not you, I think we can put it aside.

Zlatko Vid

<< Another plus is that it runs completely locally and does not transmit data anywhere >>

And why don't I trust it?

Theo Gottwald

Because you generally trust no one that has anything to do with the USA?
Me neither ... :-)

Charles Pegge

I have some ideas for home-grown AI in the pipeline. They require some thought about providing good inputs/sensors and training/feedback. The neural/learning part should be relatively simple.

Theo Gottwald

Maybe you can add it in a way that lets users easily construct and train networks with a variable width and depth of hidden layers; input and output should also be scalable, whether 0-255 or -1 to 1.
Then I could imagine integrated backpropagation.
Finally, once the network has been trained and holds its internal formula, it should be possible to save the "result" directly as ready-to-run ASM code.

This way, O2 could then be used as an "AI Construction Set".
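As a rough illustration of that construction-set idea, a minimal sketch in Python/NumPy (the layer widths, the tanh activation, and the XOR test are all invented for illustration) of a network with a variable number and width of hidden layers, plus plain backpropagation:

import numpy as np

class MLP:
    # fully connected net with a configurable list of layer widths,
    # e.g. MLP([2, 8, 8, 1]) = 2 inputs, two hidden layers of 8, 1 output
    def __init__(self, sizes, lr=0.1):
        self.lr = lr
        self.W = [np.random.randn(m, n) * 0.5 for m, n in zip(sizes, sizes[1:])]
        self.b = [np.zeros(n) for n in sizes[1:]]

    def forward(self, x):
        self.a = [x]                     # keep activations for backprop
        for W, b in zip(self.W, self.b):
            x = np.tanh(x @ W + b)       # tanh keeps every value in -1..1
            self.a.append(x)
        return x

    def backward(self, target):
        # output error: outcome vs. target, through the tanh derivative
        delta = (self.a[-1] - target) * (1 - self.a[-1] ** 2)
        for i in reversed(range(len(self.W))):
            grad_W = np.outer(self.a[i], delta)
            grad_b = delta
            if i:                        # propagate the error one layer back
                delta = (delta @ self.W[i].T) * (1 - self.a[i] ** 2)
            self.W[i] -= self.lr * grad_W
            self.b[i] -= self.lr * grad_b

# quick check: XOR in the -1..1 encoding
net = MLP([2, 8, 1])
X = [np.array(v, float) for v in ([-1, -1], [-1, 1], [1, -1], [1, 1])]
Y = [np.array([t], float) for t in (-1, 1, 1, -1)]
for _ in range(2000):
    for x, y in zip(X, Y):
        net.forward(x)
        net.backward(y)
print([round(float(net.forward(x)[0]), 2) for x in X])  # ~ -1, 1, 1, -1

Compiling the trained weights to ASM would then just mean emitting this fixed forward pass as straight-line code.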

Charles Pegge

My seed concept is to generate mutating 3D objects with a like button. The metrics of 'liked' objects are stored and form the basis of further mutating objects. In this initial concept, the metrics do not interact, so there are no neurons, just a simple Darwinian evolutionary process.

If this proves successful, the next step is to compute statistical correlations between the metrics, biasing the random mutation process so that a likeable object becomes more likely. This is where the neural network idea comes in.
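A toy sketch of that first Darwinian stage, assuming each object is reduced to a small vector of shape metrics (the vector size, mutation rate, and the random stand-in for the like button are all invented):

import random

def mutate(metrics, rate=0.1):
    # add small Gaussian noise to every shape metric
    return [m + random.gauss(0, rate) for m in metrics]

def evolve(liked, generations=10, brood=5):
    # every liked object spawns mutated offspring; survivors are picked
    # by the 'like' judgment and seed the next generation
    population = list(liked)
    for _ in range(generations):
        offspring = [mutate(p) for p in population for _ in range(brood)]
        # stand-in for the human like button: keep a random subset;
        # a real UI would present each object and record the clicks
        population = random.sample(offspring, k=len(liked))
    return population

seeds = [[random.random() for _ in range(4)] for _ in range(3)]
print(evolve(seeds)[0])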

Theo Gottwald

Each of these formulas needs to have a target so it can compare outcome vs. target, calculate an error, and adjust the inputs.
In the end, any of these algorithms will build an inherent formula that can solve the task with a low error.

If, after all, the results are satisfying, it should be possible to compile this inherent formula to AVX-512 ASM code. That would then be something that could be used generally for all sorts of tasks.
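That outcome-vs-target loop is essentially iterative error minimisation; here is a tiny sketch fitting the factors of a linear formula (the data and step size are made up), where the final fixed formula is exactly the kind of thing that could later be emitted as straight-line ASM:

# fit y = a*x + b by comparing outcome vs. target and nudging the factors
data = [(x, 3.0 * x + 1.0) for x in range(-5, 6)]  # invented target: a=3, b=1
a, b, lr = 0.0, 0.0, 0.01

for _ in range(500):
    for x, target in data:
        outcome = a * x + b
        error = outcome - target      # outcome vs. target
        a -= lr * error * x           # adjust each factor in proportion
        b -= lr * error               # to its contribution to the error

print(round(a, 2), round(b, 2))       # approaches 3.0 and 1.0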

Charles Pegge

In this case, the inputs are various shapes, the feedback is the 'likes', and the eventual target would be an understanding of human preferences. But the mechanism can be abstracted and applied to other domains with different goals, such as structural engineering, where the feedback would involve an analysis of materials and stability.

I think that working through concrete examples will naturally yield the necessary algorithms. It's intriguing that practical AI emerges from the advertising industry: Google, Facebook, etc.


Theo Gottwald

In the end, it will always boil down to having an input, having a formula, and learning by changing the factors.
That's because that is how the world works.