
Comments (7)

erlebach commented on June 9, 2024

The code works. I had a spelling error. Thanks for getting me over the hump. Now I must explore the various models. Great job with the development of continue!


sestinj commented on June 9, 2024

@erlebach The first thing I notice is the URL http://localhost:8080:65432. Are you purposefully doing this for port forwarding? If not, your problem would be solved by replacing it with http://localhost:65432.


erlebach commented on June 9, 2024

@sestinj: That solved the problem. Right now I am using the ChatGPT model with my API key. What I really want is to run llama.cpp with Code Llama on a remote machine with a different IP address than my machine. I have tried this before, but without success. Is .continue/config.py the only file that requires modification, assuming I still start the Continue server manually? Thanks.


sestinj commented on June 9, 2024

Yes, that is the only file you should have to edit. If you haven't already seen it, this is our reference page for using Llama.cpp: https://continue.dev/docs/reference/Models/llamacpp

This assumes you're running the llama.cpp server; is that what you're doing?

I can also give more detailed instructions on setting up that server and its options if that would help.


erlebach commented on June 9, 2024

Thanks. I read the docs you referred to again. You write that config.py should contain the following:

config = ContinueConfig(
    ...
    models=Models(
        default=LlamaCpp(
            max_context_length=4096,
            server_url="http://localhost:8080")
    )
)

I assume that localhost should be replaced by the IP address of the machine that llama.cpp is running on? Visual Studio Code is running on localhost. Let me try again. If I run into issues, I'll write another message.
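For example, assuming the llama.cpp server listens on port 8080 on that machine, the snippet from the docs would become something like the following sketch (the address is only a placeholder for the remote machine's IP):

config = ContinueConfig(
    ...
    models=Models(
        default=LlamaCpp(
            max_context_length=4096,
            # placeholder: IP of the machine running the llama.cpp server
            server_url="http://<remote-ip>:8080")
    )
)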


erlebach commented on June 9, 2024

Here is my "other message" :-) with more details. llama.cpp runs on a remote machine with IP address x.y.z.w. I start the llama.cpp server with the following command:

./server -m models/codellama-7b-instruct.Q4_K_M.gguf --host 0.0.0.0

I modify the config.py file and then execute ./dist/run. Here is the relevant section of the config.py file:

"""
This is the Continue configuration file.

See https://continue.dev/docs/customization for documentation of the available options.
"""

from continuedev.src.continuedev.core.models import Models
from continuedev.src.continuedev.core.config import CustomCommand, SlashCommand, ContinueConfig
from continuedev.src.continuedev.plugins.context_providers.github import GitHubIssuesContextProvider
from continuedev.src.continuedev.libs.llm.maybe_proxy_openai import MaybeProxyOpenAI

from continuedev.src.continuedev.libs.llm.llamacpp import LlamaCpp

from continuedev.src.continuedev.plugins.steps.open_config import OpenConfigStep
from continuedev.src.continuedev.plugins.steps.clear_history import ClearHistoryStep
from continuedev.src.continuedev.plugins.steps.feedback import FeedbackStep
from continuedev.src.continuedev.plugins.steps.comment_code import CommentCodeStep
from continuedev.src.continuedev.plugins.steps.share_session import ShareSessionStep
from continuedev.src.continuedev.plugins.steps.main import EditHighlightedCodeStep
from continuedev.src.continuedev.plugins.steps.cmd import GenerateShellCommandStep
from continuedev.src.continuedev.plugins.context_providers.search import SearchContextProvider
from continuedev.src.continuedev.plugins.context_providers.diff import DiffContextProvider
from continuedev.src.continuedev.plugins.context_providers.url import URLContextProvider
from continuedev.src.continuedev.plugins.context_providers.terminal import TerminalContextProvider

config = ContinueConfig(
    allow_anonymous_telemetry=True,
    models=Models(
        default=LLamaCpp(
            max_context_length = 1024,
            server_url="http://x.y.z.w:8080"
        )
    ),
    system_message="",
    temperature=0.5,

The URL of the Continue server (in the VS Code settings) remains https://localhost:65432.

Finally, here is the stack trace I get when I click on the Continue extension in VS Code:

Invalid Continue Config File

Falling back to default config settings due to the following error in ~/.continue/config.py.


Traceback (most recent call last):
  File "continuedev/src/continuedev/core/sdk.py", line 75, in create
    sdk.config = config or sdk._load_config_dot_py()
  File "continuedev/src/continuedev/core/sdk.py", line 242, in _load_config_dot_py
    config = ContinueConfig.from_filepath(path)
  File "continuedev/src/continuedev/core/config.py", line 112, in from_filepath
    spec.loader.exec_module(config)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/erlebach/.continue/config.py", line 30, in <module>
    default=LLamaCpp(
NameError: name 'LLamaCpp' is not defined

I guess I have to do a `pip install LlamaCpp`? I did not see that in the instructions.
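Looking at the traceback again, it may just be a spelling mismatch rather than a missing package: the import at the top of my config.py brings in LlamaCpp, but the models block says LLamaCpp. A minimal corrected sketch of that block, keeping the same values:

    models=Models(
        # class name must match the import: LlamaCpp, not LLamaCpp
        default=LlamaCpp(
            max_context_length=1024,
            server_url="http://x.y.z.w:8080"
        )
    ),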

Thanks!


sestinj commented on June 9, 2024

Glad to hear it! If you have any more questions, don't hesitate to ask in our Discord: https://discord.gg/RKcHQgxy5N

