Compare commits


No commits in common. "main" and "project-thousand" have entirely different histories.

5 changed files with 136 additions and 183 deletions

LICENSE

@@ -1,6 +1,6 @@
The Sammy Public License Revision 5 Sub-Revision 1 (SPL-R5 SR1)
The Samuel Public License Revision 5 (SPL-R5)
Copyright (c) 2024 Sneed Group
Copyright (c) 2024 Samuel Lord
This document grants permission, without charge, to any individual acquiring a copy of the software and its associated documentation files (hereinafter referred to as the "Software"). Such individuals are authorized to engage in activities related to the Software with certain restrictions (listed below), including, but not limited to, the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software. These permissions extend to persons to whom the Software is furnished, subject to compliance with the specified conditions outlined below.
@@ -10,7 +10,7 @@ In making contributions to the Software, contributors irrevocably assign, transf
Furthermore, this document permits the reuse and redistribution of both executable binaries and source code, contingent upon the inclusion of the previously mentioned copyright notice and permission notice in all copies or substantial portions of the Software. It is imperative that you explicitly acknowledge and agree that the owner(s) retain ownership rights over the aforementioned source code.
Moreover, companies using the Software are encouraged to contribute upstream. Fortune 500 companies are required to make an annual contribution of at least 20,000 USD or an equivalent amount per project used to support the said projects' sustainability unless no donation option is provided. Also, all Fortune 500 companies using said projects are required to contribute their changes upstream as well.
Moreover, companies using the Software are encouraged to contribute upstream. Fortune 500 companies are required to make an annual contribution of at least 20,000 USD or an equivalent amount to support the project's sustainability unless no donation option is provided.
Additionally, note that the use of AI-assisted tools, including but not limited to GitHub Copilot and ChatGPT, is expressly permitted in conjunction with this software. Users are encouraged to leverage these AI tools to enhance their experience in developing, modifying, and interacting with the Software. The permission granted herein extends to the integration and utilization of AI-generated content for coding and communication purposes. The owner(s) of the Software acknowledge and embrace the collaborative nature of AI-assisted development.

README.md

@@ -1,39 +1,10 @@
# Kode
Your AI ~~script kiddie~~ coder.
## Overview
Kode is your go-to AI ~~script kiddie~~ coder for coding adventures. Whether you're exploring new algorithms, experimenting with cutting-edge technologies, or simply need a coding buddy, Kode has got you covered. With its powerful AI capabilities, Kode can assist you in generating, executing, and refining code with ease.
## Features
- Seamless integration with Ollama and codellama for an enhanced ~~skidding~~ coding experience.
- Built on modern Node.js LTS for stability and performance.
- Easy setup with just a few dependencies and simple installation steps.
Your AI script kiddie.
## Dependencies
* Ollama
* codellama (will be auto installed)
* Modern NodeJS LTS (as of writing it's v20)
* all code dependencies for this project (will be auto installed)
## Installation
* Install NodeJS LTS.
* Install Ollama.
* Clone the repo.
* Change directory into the cloned repo.
* Run it.
  * ```npm run windows``` if you're a Windows user.
  * ```npm run nix``` if you're on a Linux or Mac machine.
## Usage
Once you have Kode set up, you can start coding with confidence. Simply launch Kode and follow the prompts to define your project, select your preferred programming language, and dive into the world of ~~being a script kiddie~~ coding!
## License
Kode is licensed under The Samuel Public License Revision 5 (SPL-R5). Please see the [LICENSE](LICENSE) file for details.
* codellama
* nodejs
* all dependencies for this project (hint: ```npm i``` in the directory of Kode.)
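
The dependency list above comes down to one client call. As a minimal sketch, based on the ollama-js-client usage in the index.js/index.mjs code further down in this diff: it assumes Ollama is running on its default local port with the codellama model pulled (the package.json scripts run `ollama pull codellama`); the prompt string here is only an example.

```js
// Minimal sketch based on the client usage in index.js / index.mjs below.
// Assumes a local Ollama server with the codellama model already pulled.
import Ollama from 'ollama-js-client';

const instance = new Ollama({
  model: "codellama",
  url: "http://127.0.0.1:11434/api/", // Ollama's default local API endpoint
});

// Example prompt; Kode wraps calls like this in its generate-execute-retry loop.
const answer = await instance.prompt("Write a hello world in pure js. Code only, no markdown.");
console.log(answer.response);
```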

index.js (new file, 108 lines)

@@ -0,0 +1,108 @@
import readline from 'readline';
import Ollama from 'ollama-js-client';
import { spawn } from 'child_process';
let generation = 1;
let potentialAnswers = [];
function prompt(q) {
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  });
  return new Promise((resolve) => {
    rl.question(`${q}`, function (a) {
      rl.close();
      resolve(a);
    });
  });
}
let problem = await prompt("What's the project? (no external libs or reqs): ");
let lang = await prompt("What's the lang? (js, python, ppython [panda3d python]): ");
console.log("coding!");
function langExec(langCode) {
  if (lang == "js") {
    return eval(langCode);
  } else if (lang == "python") {
    const pythonProcess = spawn('python', ['-c', langCode]);
    // Handle stderr data from the Python process
    return pythonProcess.stderr.on('data', (data) => {
      return Error(`${data}`);
    });
  } else if (lang == "ppython") {
    const ppythonProcess = spawn('ppython', ['-c', langCode]);
    // Handle stderr data from the Python process
    return ppythonProcess.stderr.on('data', (data) => {
      return Error(`${data}`);
    });
  } else {
    console.error("Language command not found!")
  }
}
const instance = new Ollama({
  model: "codellama",
  url: "http://127.0.0.1:11434/api/",
});
function getLangID() {
  if (lang == "ppython") {
    return "panda3d python"
  } else {
    return lang;
  }
}
let answer = await instance.prompt(`${problem} - This must be coded in pure ${getLangID()}, no external libraries or requirements. Please provide the code, the full code, and nothing but the code. No chit-chat, no markdown, just code.`);
async function main() {
  let answerParsed = ""
  let problemSolved = false;
  while (problemSolved == false) {
    try {
      console.log(`Generation ${generation}`)
      console.log(answer.response)
      answerParsed = answer.response.replaceAll("```javascript","").replaceAll("```","");
      langExec(answerParsed);
      problemSolved = true;
      generation = generation + 1;
    } catch (error) {
      answer = await instance.prompt(`There was an error: ${error.message}. Please only provide the code, the full code, and nothing but the code. No chit-chat, no markdown, just code. Also, make sure it's written in ${getLangID()} without any libraries besides included.`)
    }
  }
  return answerParsed;
}
async function aThousand() {
  let potentialAnswersQuestion = `Which answer is best suited for ${problem}?
If there are two or more answers that are about as equal, but one has lower quality code, choose the one with higher quality code.
Pick ONLY ONE ANSWER.
Answers:
`
  for (let i = 0; i < 1000; i++) {
    let potentialAnswer = await main();
    potentialAnswers.push(potentialAnswer)
  }
  potentialAnswers.forEach((answer, index) => {
    potentialAnswersQuestion += `
----
Answer ${index + 1}:
${answer}
----
`;
  });
  let finalAnswer = await instance.prompt(`${potentialAnswersQuestion}`)
  let finalAnswerParsed = finalAnswer.response;
  return finalAnswerParsed;
}
let a = await aThousand();
console.log(a)
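
One behavioural difference between the two entry points is worth noting: in index.js above, the python and ppython branches of langExec return the stderr listener itself, and the Error objects created inside the 'data' callback are never thrown, so a failing script never reaches the try/catch in main(). A hedged sketch of a Promise-based variant, essentially what the deleted index.mjs below does, where langExecAsync is a hypothetical name and lang/spawn are the bindings already defined in index.js:

```js
// Hypothetical helper, not part of this diff: wraps the child process in a Promise
// so failures reject and can be caught with `await langExecAsync(code)` in main().
function langExecAsync(langCode) {
  if (lang == "js") {
    return Promise.resolve(eval(langCode));
  }
  // 'python' and 'ppython' are both invoked as a command of the same name.
  const proc = spawn(lang, ['-c', langCode]);
  return new Promise((resolve, reject) => {
    let output = '';
    proc.stdout.on('data', (data) => { output += data.toString(); });
    proc.stderr.on('data', (data) => reject(new Error(data.toString())));
    proc.on('close', (code) => {
      if (code === 0) resolve(output);
      else reject(new Error(`Process exited with code ${code}`));
    });
  });
}
```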

index.mjs (deleted, 125 lines)

@@ -1,125 +0,0 @@
import readline from 'readline';
import Ollama from 'ollama-js-client';
import { spawn } from 'child_process';
let potentialAnswers = [];
function prompt(q) {
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  });
  return new Promise((resolve) => {
    rl.question(`${q}`, function (a) {
      rl.close();
      resolve(a);
    });
  });
}
let problem = await prompt("<What's the project? (no external libs or reqs)>: ");
let lang = await prompt("<What's the lang? (js, python, ppython [panda3d python])>: ");
let generations = await prompt("<How many answers? (more answers = more memory used and more time!)>: ");
generations = Math.ceil(Number(generations))
console.log("coding, this will take a bit of time!");
function langExec(langCode) {
  if (lang == "js") {
    return eval(langCode);
  } else if (lang == "python" || lang == "ppython") {
    const pythonProcess = spawn(lang, ['-c', langCode]);
    return new Promise((resolve, reject) => {
      let output = '';
      pythonProcess.stdout.on('data', (data) => {
        output += data.toString();
      });
      pythonProcess.stderr.on('data', (data) => {
        reject(data.toString());
      });
      pythonProcess.on('close', (code) => {
        if (code === 0) {
          resolve(output);
        } else {
          reject(`Process exited with code ${code}`);
        }
      });
    });
  } else {
    console.error("Language command not found!")
  }
}
function getLangID() {
  if (lang == "ppython") {
    return "panda3d python"
  } else {
    return lang;
  }
}
function replaceAll(str, find, replace) {
  return str.replace(new RegExp(find, 'g'), replace);
}
async function main() {
  const instance = new Ollama({
    model: "codellama",
    url: "http://127.0.0.1:11434/api/",
  });
  let answer = await instance.prompt(`${problem} - This must be coded in pure ${getLangID()}, no external libraries or requirements. Please provide the code, the full code, and nothing but the code. No chit-chat, no markdown, just code.`);
  let problemSolved = false;
  while (!problemSolved) {
    try {
      let answerParsed = replaceAll(answer.response, "```javascript", "")
      answerParsed = replaceAll(answerParsed, "```", "")
      await langExec(answerParsed);
      problemSolved = true;
      console.log(answerParsed)
      return answerParsed;
    } catch (error) {
      answer = await instance.prompt(`There was an error: ${error.message}. Please only provide the code, the full code, and nothing but the code. No chit-chat, no markdown, just code. Also, make sure it's written in ${getLangID()} without any libraries besides included.`)
    }
  }
  return 1;
}
async function aThousand() {
  const instance = new Ollama({
    model: "codellama",
    url: "http://127.0.0.1:11434/api/",
  });
  let potentialAnswersQuestion = `Which answer is best suited for ${problem}?
If there are two or more answers that are about as equal, but one has lower quality code, choose the one with higher quality code.
Pick ONLY ONE ANSWER. MUST BE PROGRAMMED IN THE LANGUAGE ${getLangID()}!
INCLUDE THE COMPLETE CODE FOR THE CHOSEN ANSWER, AS WELL AS A SHORT DESCRIPTION ON WHY YOU CHOSE IT AND HOW IT WORKS.
Answers:
`
  for (let i = 0; i < generations; i++) {
    console.log(`Answer ${i + 1}`);
    let potentialAnswer = await main();
    potentialAnswers.push(potentialAnswer);
  }
  potentialAnswers.forEach((answer, index) => {
    potentialAnswersQuestion += `
----
Answer ${index + 1}:
${answer}
----
`;
  });
  let finalAnswer = await instance.prompt(`${potentialAnswersQuestion}`);
  let finalAnswerParsed = finalAnswer.response;
  return finalAnswerParsed;
}
let a = await aThousand();
console.log(a)

package.json

@@ -1,22 +1,21 @@
{
  "dependencies": {
    "node-fetch": "^3.3.2",
    "ollama-js-client": "^1.0.1",
    "prompt": "^1.3.0"
  },
  "name": "kode",
  "version": "1.0.0",
  "description": "Feedback loop to help AI code",
  "main": "index.mjs",
  "type": "module",
  "scripts": {
    "windows": "ollama pull codellama && npm i && cls && node index.mjs",
    "nix": "ollama pull codellama; npm i; clear; node index.mjs"
  },
  "repository": {
    "type": "git",
    "url": "https://nodemixaholic.com:3002/nodemixaholic/kode"
  },
  "author": "Samuel Lord",
  "license": "SEE LICENSE IN LICENSE"
}
{
  "dependencies": {
    "node-fetch": "^3.3.2",
    "ollama-js-client": "^1.0.1",
    "prompt": "^1.3.0"
  },
  "name": "kode",
  "version": "1.0.0",
  "description": "Feedback loop to help AI code",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "type": "git",
    "url": "https://nodemixaholic.com:3002/nodemixaholic/kode"
  },
  "author": "Samuel Lord",
  "license": "SEE LICENSE IN LICENSE"
}