
[CON-144] OpenAIConnectionError probably caused by failed SSL certificates #218

Closed
sestinj opened this issue Jul 6, 2023 · 16 comments
Labels: kind:bug (Indicates an unexpected problem or unintended behavior)

@sestinj
Contributor

sestinj commented Jul 6, 2023

microsoft/semantic-kernel#627
https://community.openai.com/t/ssl-certificate-verify-failed/32442/39

Seems like there's something weird going on with OpenAI. Want to make sure that this is the actual issue, then can consider directly making the request, though I don't know if I expect this to work. Other alternative is to download the OpenAI certificates, but people are claiming it doesn't work, so don't really want to spend time on this.
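For reference, a minimal sketch of what the "point the client at downloaded certificates" alternative could look like, assuming the legacy openai 0.x Python client (which makes async calls through aiohttp) and the certifi bundle; the names here are illustrative, not actual Continue code:

import ssl
import aiohttp
import certifi
import openai

# Build an SSL context that trusts certifi's CA bundle rather than relying
# on the (possibly missing or broken) system certificate store.
ssl_context = ssl.create_default_context(cafile=certifi.where())

async def chat_with_explicit_ca_bundle():
    # openai 0.x lets you supply your own aiohttp session for async requests;
    # give it a connector that uses the explicit SSL context.
    connector = aiohttp.TCPConnector(ssl=ssl_context)
    session = aiohttp.ClientSession(connector=connector)
    openai.aiosession.set(session)
    try:
        return await openai.ChatCompletion.acreate(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "ping"}],
        )
    finally:
        await session.close()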

From SyncLinear.com | CON-144

@sestinj sestinj self-assigned this Jul 6, 2023
@sestinj sestinj changed the title OpenAIConnectionError probably caused by failed SSL certificates [CON-144] OpenAIConnectionError probably caused by failed SSL certificates Jul 6, 2023
@sestinj sestinj added the bug label Jul 9, 2023
@mysticaltech

Yes, getting the same error when following the tutorial to use our own OpenAI API key, as described here: https://continue.dev/docs/customization#change-the-default-llm.

Traceback (most recent call last):

  File "aiohttp/connector.py", line 980, in _wrap_create_connection

  File "asyncio/base_events.py", line 1103, in create_connection

  File "asyncio/base_events.py", line 1133, in _create_connection_transport

  File "asyncio/futures.py", line 285, in __await__

  File "asyncio/tasks.py", line 304, in __wakeup

  File "asyncio/futures.py", line 201, in result

  File "asyncio/sslproto.py", line 534, in data_received

  File "asyncio/sslproto.py", line 188, in feed_ssldata

  File "ssl.py", line 975, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)


The above exception was the direct cause of the following exception:


Traceback (most recent call last):

  File "openai/api_requestor.py", line 588, in arequest_raw

  File "aiohttp/client.py", line 536, in _request

  File "aiohttp/connector.py", line 540, in connect

  File "aiohttp/connector.py", line 901, in _create_connection

  File "aiohttp/connector.py", line 1209, in _create_direct_connection

  File "aiohttp/connector.py", line 1178, in _create_direct_connection

  File "aiohttp/connector.py", line 982, in _wrap_create_connection

aiohttp.client_exceptions.ClientConnectorCertificateError: Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')]


The above exception was the direct cause of the following exception:


Traceback (most recent call last):

  File "continuedev/src/continuedev/core/autopilot.py", line 260, in _run_singular_step
    observation = await step(self.continue_sdk)

  File "continuedev/src/continuedev/core/main.py", line 326, in __call__
    return await self.run(sdk)

  File "continuedev/src/continuedev/plugins/steps/chat.py", line 62, in run
    async for chunk in generator:

  File "continuedev/src/continuedev/libs/llm/openai.py", line 122, in stream_chat
    async for chunk in await openai.ChatCompletion.acreate(

  File "openai/api_resources/chat_completion.py", line 45, in acreate

  File "openai/api_resources/abstract/engine_api_resource.py", line 217, in acreate

  File "openai/api_requestor.py", line 300, in arequest

  File "openai/api_requestor.py", line 605, in arequest_raw

openai.error.APIConnectionError: Error communicating with OpenAI

@Geczy

Geczy commented Aug 8, 2023

same issue, makes this extension unusable :(

also unreadable on macOS

[screenshot: CleanShot 2023-08-08 at 10 45 15]
Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 703, in urlopen

  File "urllib3/connectionpool.py", line 386, in _make_request

  File "urllib3/connectionpool.py", line 1042, in _validate_conn

  File "urllib3/connection.py", line 419, in connect

  File "urllib3/util/ssl_.py", line 449, in ssl_wrap_socket

  File "urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 787, in urlopen

  File "urllib3/util/retry.py", line 592, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 24, in encoding_for_model
    return tiktoken.encoding_for_model(aliases.get(model_name, model_name))

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 703, in urlopen

  File "urllib3/connectionpool.py", line 386, in _make_request

  File "urllib3/connectionpool.py", line 1042, in _validate_conn

  File "urllib3/connection.py", line 419, in connect

  File "urllib3/util/ssl_.py", line 449, in ssl_wrap_socket

  File "urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 787, in urlopen

  File "urllib3/util/retry.py", line 592, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/core/autopilot.py", line 260, in _run_singular_step
    observation = await step(self.continue_sdk)

  File "continuedev/src/continuedev/core/main.py", line 326, in __call__
    return await self.run(sdk)

  File "continuedev/src/continuedev/plugins/steps/chat.py", line 62, in run
    async for chunk in generator:

  File "continuedev/src/continuedev/libs/llm/maybe_proxy_openai.py", line 49, in stream_chat
    async for item in resp:

  File "continuedev/src/continuedev/libs/llm/proxy_server.py", line 87, in stream_chat
    messages = compile_chat_messages(

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 127, in compile_chat_messages
    msgs_copy = prune_chat_history(

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 56, in prune_chat_history
    sum(count_chat_message_tokens(model_name, message)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 56, in <genexpr>
    sum(count_chat_message_tokens(model_name, message)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 41, in count_chat_message_tokens
    return count_tokens(model_name, chat_message.content) + TOKENS_PER_MESSAGE

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 32, in count_tokens
    encoding = encoding_for_model(model_name)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 26, in encoding_for_model
    return tiktoken.encoding_for_model("gpt-3.5-turbo")

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))

@mysticaltech

mysticaltech commented Aug 8, 2023

@sestinj @TyDunn This seems to be more widespread now.

My ~/.continue/config.py is as follows:

"""
This is the Continue configuration file.

See https://continue.dev/docs/customization to learn more.
"""

import subprocess

from continuedev.src.continuedev.core.main import Step
from continuedev.src.continuedev.core.sdk import ContinueSDK
from continuedev.src.continuedev.core.models import Models
from continuedev.src.continuedev.core.config import SlashCommand, ContinueConfig
from continuedev.src.continuedev.plugins.context_providers.google import (
    GoogleContextProvider,
)
from continuedev.src.continuedev.libs.llm.openai import OpenAI
from continuedev.src.continuedev.plugins.policies.default import DefaultPolicy

from continuedev.src.continuedev.plugins.steps.open_config import OpenConfigStep
from continuedev.src.continuedev.plugins.steps.clear_history import ClearHistoryStep
from continuedev.src.continuedev.plugins.steps.feedback import FeedbackStep
from continuedev.src.continuedev.plugins.steps.comment_code import CommentCodeStep
from continuedev.src.continuedev.plugins.steps.main import EditHighlightedCodeStep
from continuedev.src.continuedev.plugins.context_providers.search import (
    SearchContextProvider,
)
from continuedev.src.continuedev.plugins.context_providers.diff import (
    DiffContextProvider,
)


class CommitMessageStep(Step):
    """
    This is a Step, the building block of Continue.
    It can be used below as a slash command, so that
    run will be called when you type '/commit'.
    """

    async def run(self, sdk: ContinueSDK):
        # Get the root directory of the workspace
        dir = sdk.ide.workspace_directory

        # Run git diff in that directory
        diff = subprocess.check_output(["git", "diff"], cwd=dir).decode("utf-8")

        # Ask the LLM to write a commit message,
        # and set it as the description of this step
        self.description = await sdk.models.default.complete(
            f"{diff}\n\nWrite a short, specific (less than 50 chars) commit message about the above changes:"
        )

OPENAI_API_KEY = "sk-xxx"
config = ContinueConfig(
    # If set to False, we will not collect any usage data
    # See here to learn what anonymous data we collect: https://continue.dev/docs/telemetry
    allow_anonymous_telemetry=True,
    models=Models(
        default=OpenAI(model="gpt-4", api_key=OPENAI_API_KEY),
        medium=OpenAI(model="gpt-3.5-turbo", api_key=OPENAI_API_KEY)
    ),
    # Set a system message with information that the LLM should always keep in mind
    # E.g. "Please give concise answers. Always respond in Spanish."
    system_message=None,
    # Set temperature to any value between 0 and 1. Higher values will make the LLM
    # more creative, while lower values will make it more predictable.
    temperature=0.0,
    # Custom commands let you map a prompt to a shortened slash command
    # They are like slash commands, but more easily defined - write just a prompt instead of a Step class
    # Their output will always be in chat form
    custom_commands=[
        # CustomCommand(
        #     name="test",
        #     description="Write unit tests for the higlighted code",
        #     prompt="Write a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
        # )
    ],
    # Slash commands let you run a Step from a slash command
    slash_commands=[
        # SlashCommand(
        #     name="commit",
        #     description="This is an example slash command. Use /config to edit it and create more",
        #     step=CommitMessageStep,
        # )
        SlashCommand(
            name="edit",
            description="Edit code in the current file or the highlighted code",
            step=EditHighlightedCodeStep,
        ),
        SlashCommand(
            name="config",
            description="Customize Continue - slash commands, LLMs, system message, etc.",
            step=OpenConfigStep,
        ),
        SlashCommand(
            name="comment",
            description="Write comments for the current file or highlighted code",
            step=CommentCodeStep,
        ),
        SlashCommand(
            name="feedback",
            description="Send feedback to improve Continue",
            step=FeedbackStep,
        ),
        SlashCommand(
            name="clear",
            description="Clear step history",
            step=ClearHistoryStep,
        ),
    ],
    # Context providers let you quickly select context by typing '@'
    # Uncomment the following to
    # - quickly reference GitHub issues
    # - show Google search results to the LLM
    context_providers=[
        # GitHubIssuesContextProvider(
        #     repo_name="<your github username or organization>/<your repo name>",
        #     auth_token="<your github auth token>"
        # ),
        GoogleContextProvider(
            serper_api_key="yyy"
        ),
        SearchContextProvider(),
        DiffContextProvider(),
    ],
    # Policies hold the main logic that decides which Step to take next
    # You can use them to design agents, or deeply customize Continue
    policy=DefaultPolicy(),
)

@mysticaltech

I also have continue.OPENAI_API_KEY set to the same API key, but to no avail. And the key works; I just tested it.

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

@mysticaltech @Geczy I've been able to reproduce this; there's definitely something going on here. Will update you once I figure it out.

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

@mysticaltech @Geczy I've just published a new version and tested it myself (v0.0.278), and it should solve the problem. Let me know if for some reason it doesn't.

The issue was that certificate bundles don't automatically get packaged when using pyinstaller, so I manually added them to the bundle and then set the appropriate environment variable. (Relevant commit here)
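For illustration, a rough sketch of how that kind of fix could be wired up (an assumption on my part, not the exact commit): ship certifi's cacert.pem as a PyInstaller data file, then point the SSL machinery at it at runtime:

# In the PyInstaller .spec file (or via --add-data), something along the lines of:
#   datas=[(certifi.where(), "certifi")]

import os
import sys

def set_ca_bundle_env():
    # When frozen by PyInstaller, bundled data files are unpacked under sys._MEIPASS.
    if getattr(sys, "frozen", False):
        bundle_dir = getattr(sys, "_MEIPASS", os.path.dirname(sys.executable))
        ca_path = os.path.join(bundle_dir, "certifi", "cacert.pem")
        if os.path.exists(ca_path):
            # The stdlib ssl module reads SSL_CERT_FILE; requests reads REQUESTS_CA_BUNDLE.
            os.environ.setdefault("SSL_CERT_FILE", ca_path)
            os.environ.setdefault("REQUESTS_CA_BUNDLE", ca_path)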

Also, will be fixing the readability issues in just a minute. We try to match the VS Code theme, but missed a spot there :)

@mysticaltech

mysticaltech commented Aug 8, 2023

@sestinj Thanks for the quick fix, very much appreciated, will try it now.

Edit: Works like a charm! 🥳 🙏

@Geczy

Geczy commented Aug 8, 2023

I'm on 0.0.279 but still have all the same issues

theme

[screenshot: CleanShot 2023-08-08 at 14 08 46]
HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))
Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 703, in urlopen

  File "urllib3/connectionpool.py", line 386, in _make_request

  File "urllib3/connectionpool.py", line 1042, in _validate_conn

  File "urllib3/connection.py", line 419, in connect

  File "urllib3/util/ssl_.py", line 449, in ssl_wrap_socket

  File "urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 787, in urlopen

  File "urllib3/util/retry.py", line 592, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 24, in encoding_for_model
    return tiktoken.encoding_for_model(aliases.get(model_name, model_name))

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 703, in urlopen

  File "urllib3/connectionpool.py", line 386, in _make_request

  File "urllib3/connectionpool.py", line 1042, in _validate_conn

  File "urllib3/connection.py", line 419, in connect

  File "urllib3/util/ssl_.py", line 449, in ssl_wrap_socket

  File "urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 787, in urlopen

  File "urllib3/util/retry.py", line 592, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/core/autopilot.py", line 260, in _run_singular_step
    observation = await step(self.continue_sdk)

  File "continuedev/src/continuedev/core/main.py", line 326, in __call__
    return await self.run(sdk)

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 646, in run
    await self.stream_rif(rif, sdk)

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 340, in stream_rif
    file_prefix, contents, file_suffix, model_to_use, max_tokens = await self.get_prompt_parts(

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 210, in get_prompt_parts
    if model_to_use.count_tokens(rif.contents) > TOKENS_TO_BE_CONSIDERED_LARGE_RANGE:

  File "continuedev/src/continuedev/libs/llm/maybe_proxy_openai.py", line 53, in count_tokens
    return self.llm.count_tokens(text)

  File "continuedev/src/continuedev/libs/llm/proxy_server.py", line 62, in count_tokens
    return count_tokens(self.model, text)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 32, in count_tokens
    encoding = encoding_for_model(model_name)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 26, in encoding_for_model
    return tiktoken.encoding_for_model("gpt-3.5-turbo")

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

Awesome! I'll leave this issue open until @Geczy also confirms that everything is okay

Edit: See below, oops

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

@Geczy so sorry! I missed your earlier message when I sent this. It looks like tiktoken might look elsewhere for the SSL certificates. Back to working on this; I'll give an update soon.
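If it helps to picture the distinction, a small hypothetical example of why a single environment variable isn't always enough (the path below is a placeholder): the stdlib ssl module, which aiohttp uses for the OpenAI request, honors SSL_CERT_FILE, while requests, which tiktoken uses to download cl100k_base.tiktoken, consults REQUESTS_CA_BUNDLE (or its own certifi copy):

import os

ca_bundle = "/path/to/cacert.pem"  # placeholder: wherever the bundled certificates live
os.environ["SSL_CERT_FILE"] = ca_bundle       # picked up by the stdlib ssl module / aiohttp
os.environ["REQUESTS_CA_BUNDLE"] = ca_bundle  # picked up by requests, and therefore tiktoken's download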

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

@Geczy I found an instance of this same error here and made the update. I've unfortunately not experienced the error yet, so haven't been able to verify, but there is a chance this will work. Newest version is v0.0.282, and also fixes the color theme problem.

@Geczy

Geczy commented Aug 8, 2023

theme looks better!

[screenshot: CleanShot 2023-08-08 at 17 29 41]

still this error though

HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))
Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 670, in urlopen

  File "urllib3/connectionpool.py", line 381, in _make_request

  File "urllib3/connectionpool.py", line 978, in _validate_conn

  File "urllib3/connection.py", line 362, in connect

  File "urllib3/util/ssl_.py", line 386, in ssl_wrap_socket

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 726, in urlopen

  File "urllib3/util/retry.py", line 446, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 24, in encoding_for_model
    return tiktoken.encoding_for_model(aliases.get(model_name, model_name))

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "urllib3/connectionpool.py", line 670, in urlopen

  File "urllib3/connectionpool.py", line 381, in _make_request

  File "urllib3/connectionpool.py", line 978, in _validate_conn

  File "urllib3/connection.py", line 362, in connect

  File "urllib3/util/ssl_.py", line 386, in ssl_wrap_socket

  File "ssl.py", line 513, in wrap_socket

  File "ssl.py", line 1071, in _create

  File "ssl.py", line 1342, in do_handshake

ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "requests/adapters.py", line 486, in send

  File "urllib3/connectionpool.py", line 726, in urlopen

  File "urllib3/util/retry.py", line 446, in increment

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))


During handling of the above exception, another exception occurred:


Traceback (most recent call last):

  File "continuedev/src/continuedev/core/autopilot.py", line 260, in _run_singular_step
    observation = await step(self.continue_sdk)

  File "continuedev/src/continuedev/core/main.py", line 326, in __call__
    return await self.run(sdk)

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 646, in run
    await self.stream_rif(rif, sdk)

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 340, in stream_rif
    file_prefix, contents, file_suffix, model_to_use, max_tokens = await self.get_prompt_parts(

  File "continuedev/src/continuedev/plugins/steps/core/core.py", line 210, in get_prompt_parts
    if model_to_use.count_tokens(rif.contents) > TOKENS_TO_BE_CONSIDERED_LARGE_RANGE:

  File "continuedev/src/continuedev/libs/llm/maybe_proxy_openai.py", line 53, in count_tokens
    return self.llm.count_tokens(text)

  File "continuedev/src/continuedev/libs/llm/proxy_server.py", line 62, in count_tokens
    return count_tokens(self.model, text)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 32, in count_tokens
    encoding = encoding_for_model(model_name)

  File "continuedev/src/continuedev/libs/util/count_tokens.py", line 26, in encoding_for_model
    return tiktoken.encoding_for_model("gpt-3.5-turbo")

  File "tiktoken/model.py", line 75, in encoding_for_model

  File "tiktoken/registry.py", line 63, in get_encoding

  File "tiktoken_ext/openai_public.py", line 64, in cl100k_base

  File "tiktoken/load.py", line 116, in load_tiktoken_bpe

  File "tiktoken/load.py", line 48, in read_file_cached

  File "tiktoken/load.py", line 24, in read_file

  File "requests/api.py", line 73, in get

  File "requests/api.py", line 59, in request

  File "requests/sessions.py", line 589, in request

  File "requests/sessions.py", line 703, in send

  File "requests/adapters.py", line 517, in send

requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))

@sestinj
Contributor Author

sestinj commented Aug 8, 2023

Alright, thanks for the quick update. Can I ask a few questions to help pin this down?

I'm curious whether you're on any sort of Wi-Fi network or VPN that could be relevant to SSL certificates, for example work Wi-Fi behind a firewall.

The other thing I wonder is whether somehow we failed to start the newest server version. You might attempt to kill the server and reload VS Code to see if it works (though this is an unlikely solution).

@Geczy

Geczy commented Aug 8, 2023

Yeah, I do have a work VPN on; turning it off yields the same issue. My work laptop also has something called Netskope installed, which probably proxies all requests through it.

@mysticaltech

@Geczy You do need to close VS Code before trying again. Also, try rebooting just in case the Continue server was not restarted with the latest version; that should work. And make sure your config.py looks more or less like mine.

@sestinj
Contributor Author

sestinj commented Aug 28, 2023

@Geczy I finally got around to this, sorry it took so long. For a while it wasn't clear to me that this was simply many firewalls explicitly blocking the tiktoken download URL. I ended up updating token counting to fall back to counting characters, which is a bit imprecise, but better than the alternative, which was complete failure. This is ready in v0.0.341, and should solve your problem.
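A minimal sketch of that fallback idea (illustrative names, not the actual Continue implementation):

import tiktoken

CHARS_PER_TOKEN = 4  # rough heuristic: about four characters per token for English text

def count_tokens_with_fallback(model_name: str, text: str) -> int:
    try:
        # Normal path: use the real BPE encoding.
        encoding = tiktoken.encoding_for_model(model_name)
        return len(encoding.encode(text))
    except Exception:
        # Fetching the encoding failed (e.g. the tiktoken URL is blocked by a
        # firewall), so estimate from the character count instead of crashing.
        return len(text) // CHARS_PER_TOKEN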

I'll close the issue in about 2 days or after I hear back from you that this works, whichever comes first.

@sestinj sestinj closed this as completed Sep 25, 2023
@dosubot dosubot bot added the kind:bug label Jul 8, 2024