Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

preInstructions get Cody promps flagged #63293

Open
vdavid opened this issue Jun 17, 2024 · 2 comments
Open

preInstructions get Cody promps flagged #63293

vdavid opened this issue Jun 17, 2024 · 2 comments

Comments

@vdavid
Copy link
Contributor

vdavid commented Jun 17, 2024

Context:
We have an option to add a preInstruction text to Cody in the VS Code extension. Cody adds the preInstruction to the prompt
here.

Problem:
It seems like we are banning users who are using the preInstruction config. It seems quite widespread anecdotally: we’ve seen 3 in the last 24h.

Solution ideas:

  1. Disable preinstructions and ship a stable patch release. If this is an experimental feature (need to check!) then this is OK, otherwise a bad idea.
  2. Move the preInstruction into a separate message.
  3. Check our logic in Cody Gateway to confirm why prompts with the preInstruction are being flagged.
@chrsmith
Copy link
Contributor

Are you sure this is the problem? Cody Gateway just checks that the prompt contains a specific string. So adding anything in addition to the prompt prefix won't cause any problems. (It doesn't even need to be a prefix.)

@chrsmith
Copy link
Contributor

Yes, this seems highly unlikely to cause any problems.

  1. We only append the preInstruction to the intro sent in the LLM prefix.
    const intro = ps`You are Cody, an AI coding assistant from Sourcegraph. ${
        preInstruction ?? ''
    }`.trim()
  1. We then look for existing prompt strings here:

    if hasValidPattern, _ := containsAny(prompt, cfg.AllowedPromptPatterns); len(cfg.AllowedPromptPatterns) > 0 && !hasValidPattern {

  2. With the actual configuration data here:
    https://github.com/sourcegraph/infrastructure/blob/main/cody-gateway/envs/prod/cloudrun/main.tf#L54

So using this wouldn't cause the unknown_prompt flag to trigger. However, the request could get flagged if there were "forbidden phrases" in the preInstruction. Or if it were large enough to influence the "token size of the request".

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

No branches or pull requests

2 participants