I'm planning to put together a dataset to train an LLM; the structure isn't set in stone. The idea is to use Unsloth to fine-tune either gpt-oss-20b or qwen3-coder-30b, since both fit on my 24 GB 3090.
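For reference, a minimal Unsloth QLoRA setup for that kind of card might look like the sketch below. This is just a config sketch, not a tested recipe: the model repo id, sequence length, and LoRA hyperparameters are all assumptions you'd tune for your data.

```python
from unsloth import FastLanguageModel

# Load the base model in 4-bit so a ~20-30B model fits in 24 GB of VRAM.
# Repo id is an assumption; swap in whichever checkpoint you actually use.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gpt-oss-20b",
    max_seq_length=4096,   # assumption; size to your longest file + prompt
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of weights are trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,                  # adapter rank; assumption
    lora_alpha=16,
    lora_dropout=0,
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
    use_gradient_checkpointing="unsloth",  # trades compute for VRAM
)
```

From there you'd hand `model`/`tokenizer` plus your JSONL dataset to a normal `trl` `SFTTrainer` run.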
Overall it will be a local model, run in LM Studio or Ollama: you converse with the LLM and it can logically generate code or perform actions in SIE and Utinni.
I converted the Core3 files to an `input : output` format of `filepath\filename.ext : file contents`. I'm going to train on just that first to see how it does.
Assuming that turns out okay, I'll need to make an API plugin for SIE and then an MCP server, and maybe the same for Utinni?
I have to look into the legality of training on their data, but in a perfect world the TREs get unpacked and the same `filepath\filename.ext : file contents` pairs get trained on as well.
Training an image or 3D model will have to be a different nerd's project, if it's even possible or worth it. I'm already reaching with this.
thoughts?