
3.5-turbo API Call structure #16

Closed
appatalks opened this issue Mar 2, 2023 · 7 comments

Comments

@appatalks
Owner

appatalks commented Mar 2, 2023

With "gpt-3.5-turbo", I need to take another look at the message structure for a proper implementation.

Ex:
[
  {"role": "system", "content": "You are a helpful assistant that translates English to French."},
  {"role": "user", "content": 'Translate the following English text to French: "{text}"'}
]

Currently just passing a single user/content line. (Easiest way to get it working after the release of gpt-3.5-turbo.)

https://platform.openai.com/docs/guides/chat/chat-vs-completions
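As a reference point, here is a minimal sketch of what the chat-format request looks like (illustrative only, not code from this repo): the chat endpoint takes a `messages` array of role/content pairs instead of the single `prompt` string the completions endpoint uses. The actual request would POST this payload to `/v1/chat/completions` with an API key.

```javascript
// Minimal chat-completions payload sketch (names illustrative).
const payload = {
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: "You are a helpful assistant that translates English to French." },
    { role: "user", content: 'Translate the following English text to French: "Hello"' }
  ]
};

// The request itself would look roughly like this (OPENAI_API_KEY is a placeholder):
// fetch("https://api.openai.com/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     "Authorization": "Bearer " + OPENAI_API_KEY
//   },
//   body: JSON.stringify(payload)
// });
```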

@appatalks
Owner Author

"In general, gpt-3.5-turbo-0301 does not pay strong attention to the system message, and therefore important instructions are often better placed in a user message."

This seems pretty obvious now :/

@appatalks
Owner Author

appatalks commented Mar 6, 2023

I couldn't (yet) figure out how to pass an actively updated array to messages as user/bot interaction happens.
So, at the moment I am passing all the history back, with the assistant content noting that it is all part of the previous chat. Seems to work, ish.

messages: [
      { role: 'system', content: "You are Eva. You have access to previous chats and responses. You will keep conversation to a minimum and answer to the best of your abilities." },  // Doesn't seem to stick well.
      { role: 'user', content: selPers.value + "My next question is: " + sQuestion.replace(/\n/g, '') }, 
      { role: 'assistant', content: "Here are all my previous responses for you to remember: " + userMasterResponse.replace(/\n/g, ' ') }, 
],

@appatalks
Owner Author

Pretty happy with this or a variation of it:

messages: [
      { role: 'system', content: "You are Eva. You have access to previous chats and responses. You will keep conversation to a minimum and answer to the best of your abilities." }, 
      { role: 'user', content: selPers.value }, 
      { role: 'assistant', content: "Here are all my previous responses for you to remember: " + userMasterResponse.replace(/\n/g, ' ') }, 
      { role: 'user', content: "My next response is: " + sQuestion.replace(/\n/g, '') }, 
],

@appatalks
Owner Author

Reopening. I feel I am close to getting this into an active array:

///
// Set initial Messages payload
let iMessages = [
{ role: 'system', content: "You are Eva. You have access to previous chats and responses. You will keep conversation to a minimum and answer to the best of your abilities." },
{ role: 'user', content: selPers.value },
// { role: 'assistant', content: "Here are all my previous responses for you to remember: " + userMasterResponse.replace(/\n/g, ' ') },
// { role: 'user', content: "My next response is: " + sQuestion.replace(/\n/g, '') },
];

// Store messages in local storage
// localStorage.setItem("messages", JSON.stringify(iMessages));

// Retrieve messages from local storage
var cStoredMessages = localStorage.getItem("messages");
// Use stored messages if present, otherwise fall back to the initial payload.
// (Note: `iMessages + cStoredMessages ? ... : ...` doesn't work here — `+`
// binds tighter than `?:` and concatenates the array into a string.)
kMessages = cStoredMessages ? JSON.parse(cStoredMessages) : iMessages;

// Append last responses to next payload
// Placer
iMessages.push({ role: 'assistant', content: "Here are all my previous responses for you to remember: " + userMasterResponse.replace(/\n/g, ' ') });
iMessages.push({ role: 'user', content: "My next response is: " + sQuestion.replace(/\n/g, '') });

// Store the updated messages array back to local storage
localStorage.setItem("messages", JSON.stringify(iMessages));
///

// API Payload
var data = {
    model: sModel,
    messages: kMessages,
    max_tokens: iMaxTokens,
    temperature: dTemperature,
    frequency_penalty: eFrequency_penalty,
    presence_penalty: cPresence_penalty,
    stop: hStop
}
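The load/append/save cycle above can be condensed into a self-contained sketch that runs outside the browser — a plain object stands in for `localStorage`, and the helper names (`loadMessages`, `appendTurn`) are my own shorthand, not code from the repo:

```javascript
// Plain object standing in for the browser's localStorage API.
const store = {
  data: {},
  getItem(k) { return k in this.data ? this.data[k] : null; },
  setItem(k, v) { this.data[k] = String(v); }
};

// Initial payload, as in the snippet above.
const initialMessages = [
  { role: "system", content: "You are Eva." }
];

// Load stored history if present, otherwise start from the initial payload.
function loadMessages() {
  const raw = store.getItem("messages");
  return raw ? JSON.parse(raw) : initialMessages.slice();
}

// Append one user/assistant turn and persist the updated array.
function appendTurn(question, answer) {
  const messages = loadMessages();
  messages.push({ role: "user", content: question });
  messages.push({ role: "assistant", content: answer });
  store.setItem("messages", JSON.stringify(messages));
  return messages;
}

appendTurn("Hi", "Hello!");
const second = appendTurn("How are you?", "Fine, thanks.");
// second now holds the system message plus two user/assistant turns
```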

@appatalks appatalks reopened this Mar 19, 2023
@appatalks
Owner Author

Woot woot, got memory down. Per the community, next goal:

ruby_coder (Regular) on the OpenAI forum
(https://community.openai.com/t/gpt-3-5-turbo-vs-text-davinci-003-chatbot/82806/73?u=sbennettkorea):

Very happy atm.

"Good for you, @sbennettkorea
It's great to hear you are enjoying coding.
Next, you can work on code to prune the messages array when it gets too large."
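One way that pruning could look (an assumption sketched by me, not code from this repo): keep every system message and only the most recent N non-system messages, so the personality survives while old turns fall off.

```javascript
// Keep system messages plus the last `maxNonSystem` user/assistant messages.
// (Hypothetical helper, sketching the pruning idea from the quote above.)
function pruneMessages(messages, maxNonSystem) {
  const system = messages.filter(m => m.role === "system");
  const rest = messages.filter(m => m.role !== "system");
  return system.concat(rest.slice(-maxNonSystem));
}

const history = [
  { role: "system", content: "You are Eva." },
  { role: "user", content: "q1" },
  { role: "assistant", content: "a1" },
  { role: "user", content: "q2" },
  { role: "assistant", content: "a2" }
];

const pruned = pruneMessages(history, 2);
// pruned keeps the system message plus the last user/assistant pair
```

A token-count-based cutoff would be more precise than a message count, but this keeps the sketch simple.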

@appatalks
Owner Author

Added a Clear Memory button for the time being. I'll revisit this.

@appatalks
Owner Author

You know what, I am happy with this. I'll probably reopen this if I ever run into a limits issue.
