const { DebugBuilder } = require("../utilities/debugBuilder");
const log = new DebugBuilder("server", "chatGptController");
const { createTransaction } = require("./transactionController");

const { Configuration, OpenAIApi } = require('openai');

const configuration = new Configuration({
    organization: process.env.OPENAI_ORG,
    apiKey: process.env.OPENAI_KEY
});
const openai = new OpenAIApi(configuration);

async function getGeneration(_prompt, callback, { _model = "text-davinci-003", _temperature = 0, _max_tokens = 100 } = {}) {
    // Fall back to defaults when the caller explicitly passes null
    _temperature = _temperature ?? 0;
    _max_tokens = _max_tokens ?? 100;

    // TODO - Get the tokens in the message and subtract that from the max tokens to be sent to the AI
    log.DEBUG("Getting chat with these properties: ", _prompt, _model, _temperature, _max_tokens);

    try {
        const response = await openai.createCompletion({
            model: _model,
            prompt: _prompt,
            temperature: _temperature,
            max_tokens: _max_tokens
        });

        if (!response?.data) {
            log.ERROR("Missing response data: ", response);
            return callback(new Error("Error in response data"), undefined);
        }
        return callback(undefined, response.data);
    } catch (err) {
        log.ERROR(err);
        log.ERROR("Error when handling model request");
        return callback(err, undefined);
    }
}

/**
 * Use ChatGPT to generate a response and record the token cost as a transaction.
 *
 * @param {string} prompt The user-submitted text prompt
 * @param {number} temperature Sampling temperature (falls back to 0 when null)
 * @param {number} max_tokens Maximum completion tokens (falls back to 100 when null)
 * @param {string} discord_account_id Discord account to charge for the tokens used
 * @param {Function} callback Called with (err, { promptResult, totalTokens })
 */
exports.submitPromptTransaction = async (prompt, temperature, max_tokens, discord_account_id, callback) => {
    getGeneration(prompt, (err, gptResult) => {
        if (err) return callback(err, undefined);

        // TODO - Use the pricing table to calculate discord tokens
        log.DEBUG("GPT Response", gptResult);

        if (gptResult) {
            const discordTokensUsed = gptResult.usage.total_tokens;
            createTransaction(gptResult.id, discord_account_id, discordTokensUsed, gptResult.usage.total_tokens, 1, async (err, transactionResult) => {
                if (err) return callback(err, undefined);
                if (transactionResult) {
                    log.DEBUG("Transaction Created: ", transactionResult);
                    callback(undefined, {
                        promptResult: gptResult.choices[0].text,
                        totalTokens: discordTokensUsed
                    });
                }
            });
        }
    }, { _temperature: temperature, _max_tokens: max_tokens });
};
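
/*
 * Example usage (illustrative sketch, not part of the module):
 * the prompt text, temperature, token limit, and Discord account ID below
 * are placeholder assumptions and should be adapted to the calling bot code.
 *
 * const { submitPromptTransaction } = require("./chatGptController");
 *
 * submitPromptTransaction("Write a haiku about autumn", 0.7, 100, "123456789012345678", (err, result) => {
 *     if (err) return console.error("Prompt failed:", err);
 *     console.log(result.promptResult); // generated completion text
 *     console.log(result.totalTokens);  // tokens charged to the Discord account
 * });
 */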