
freegpt35's Introduction


An unlimited, free GPT-3.5-Turbo API service built on the login-free ChatGPT web interface.

Because OpenAI updates frequently, I have created yet another new version, this one based on DuckDuckGo and serving GPT-3.5-Turbo-0125.

Repo: https://github.com/missuo/FreeDuckDuckGo

Deploy

Node

npm install
node app.js

Docker

docker run -p 3040:3040 ghcr.io/missuo/freegpt35
docker run -p 3040:3040 missuo/freegpt35
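
To run the container in the background with automatic restarts, a detached variant (the container name free_gpt is just an example) works as well:

docker run -d --name free_gpt --restart=always -p 3040:3040 ghcr.io/missuo/freegpt35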

Docker Compose

FreeGPT35 Service only:

mkdir freegpt35 && cd freegpt35
wget -O compose.yaml https://raw.githubusercontent.com/missuo/FreeGPT35/main/compose/compose.yaml
docker compose up -d
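
If the container does not come up (see the issues below), checking its status and logs is a good first step:

docker compose ps
docker compose logs -f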

FreeGPT35 Service with ChatGPT-Next-Web:

mkdir freegpt35 && cd freegpt35
wget -O compose.yaml https://raw.githubusercontent.com/missuo/FreeGPT35/main/compose/compose_with_next_chat.yaml
docker compose up -d

After deployment, the API is available at http://[IP]:3040/v1/chat/completions, and ChatGPT-Next-Web is available at http://[IP]:3000.

FreeGPT35 Service with lobe-chat:

mkdir freegpt35 && cd freegpt35
wget -O compose.yaml https://raw.githubusercontent.com/missuo/FreeGPT35/main/compose/compose_with_lobe_chat.yaml
docker compose up -d

After deployment, the API is available at http://[IP]:3040/v1/chat/completions, and lobe-chat is available at http://[IP]:3210.

Nginx Reverse Proxy

location ^~ / {
        proxy_pass http://127.0.0.1:3040; 
        proxy_set_header Host $host; 
        proxy_set_header X-Real-IP $remote_addr; 
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; 
        proxy_set_header REMOTE-HOST $remote_addr; 
        proxy_set_header Upgrade $http_upgrade; 
        proxy_set_header Connection "upgrade"; 
        proxy_http_version 1.1; 
        add_header Cache-Control no-cache; 
        proxy_cache off;
        proxy_buffering off;
        chunked_transfer_encoding on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 300;
    }
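
Note that this location block has to live inside a server block of your Nginx site configuration. After editing, validate and reload Nginx:

nginx -t
nginx -s reload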

Nginx Reverse Proxy with Load Balancer

upstream freegpt35 {
        server 1.1.1.1:3040;
        server 2.2.2.2:3040;
}

location ^~ / {
        proxy_pass http://freegpt35; 
        proxy_set_header Host $host; 
        proxy_set_header X-Real-IP $remote_addr; 
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; 
        proxy_set_header REMOTE-HOST $remote_addr; 
        proxy_set_header Upgrade $http_upgrade; 
        proxy_set_header Connection "upgrade"; 
        proxy_http_version 1.1; 
        add_header Cache-Control no-cache; 
        proxy_cache off;
        proxy_buffering off;
        chunked_transfer_encoding on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 300;
    }

Request Example

You do not have to pass an Authorization header; if you do, any string works.

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
    }'
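
For a single, non-streamed response, set "stream" to false (assuming the service honors the flag; its logs refer to "stream-disabled" requests):

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
    }'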

Compatibility

You can use it in any OpenAI-compatible app, such as OpenCat, Next-Chat, Lobe-Chat, Bob, etc. Fill in any string as the API key, for example gptyyds.
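
For example, a ChatGPT-Next-Web container can be pointed at this service through its BASE_URL and OPENAI_API_KEY environment variables (a sketch; replace [IP] with the address of your FreeGPT35 deployment):

docker run -d --name gpt_next_web -p 3000:3000 \
  -e BASE_URL=http://[IP]:3040 \
  -e OPENAI_API_KEY=gptyyds \
  yidadaa/chatgpt-next-web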

Bob

[screenshot: Bob configuration]

Credits

License

AGPL 3.0 License

freegpt35's People

Contributors

cl1107, cliouo, hustcoderhu, missuo


freegpt35's Issues

Docker container won't start

I previously installed and ran it successfully. After deleting it and reinstalling, the freegpt35-freegpt35-1 container won't start; it has been over an hour, and the port should not be occupied. The CMD output shows:

✔ 981253dc047b Download complete 27.0s
✔ chatgpt-next-web 9 layers [⣿⣿⣿⣿⣿⣿⣿⣿⣿] 0B/0B Pulled 42.4s
✔ e7ced292c644 Pull complete 23.7s
✔ b32c0114bba5 Pull complete 17.9s
✔ f3748d9674b0 Pull complete 17.1s
✔ e4e3baea97d1 Pull complete 18.8s
✔ dc47e87a8622 Pull complete 22.8s
✔ 7f3fde39cb0d Pull complete 21.4s
✔ a78e3f3f84b7 Pull complete 25.7s
✔ b9d45a982e50 Pull complete 26.6s
✔ 981253dc047b Pull complete 27.0s
[+] Running 0/3

  • Network freegpt35_default Created 6567.7s
  • Container freegpt35-freegpt35-1 Starting 6567.7s
  • Container freegpt35-chatgpt-next-web-1 Created 6566.7s

How to configure proxy options for the container

How to add proxy options to docker?
I have a v2rayA Docker client; how do I configure the FreeGPT35 container to use v2rayA's proxy?
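
One common approach is to pass the standard proxy environment variables to the container (a sketch only; whether the HTTP client inside FreeGPT35 honors these variables depends on its implementation, and the proxy address is a placeholder):

docker run -d -p 3040:3040 \
  -e HTTP_PROXY=http://<proxy-host>:<proxy-port> \
  -e HTTPS_PROXY=http://<proxy-host>:<proxy-port> \
  ghcr.io/missuo/freegpt35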

Problems setting up FreeGPT35 (Docker) + ChatGPT-Next-Web (Docker)...

docker run -d --name free_gpt --restart=always -p 3040:3040 ghcr.io/missuo/freegpt35

docker run -d --name gpt_next_web -p 3000:3000 yidadaa/chatgpt-next-web

Running the following on the host machine returns the correct response:

curl http://172.17.0.2:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "你好啊!"
      }
    ],
    "stream": true
    }'

[screenshot: successful response]

However, when calling it through chatgpt-next-web (the chatgpt-next-web container is behind a domain reverse proxy), it does not work:

[screenshots: error in chatgpt-next-web]
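
A likely cause (an assumption, not confirmed in the issue) is that the chatgpt-next-web container cannot reach the FreeGPT35 container at the configured address. A sketch that puts both containers on a shared Docker network and points Next-Web at the FreeGPT35 container by name:

docker network create gpt-net
docker run -d --name free_gpt --restart=always --network gpt-net -p 3040:3040 ghcr.io/missuo/freegpt35
docker run -d --name gpt_next_web --network gpt-net -p 3000:3000 \
  -e BASE_URL=http://free_gpt:3040 \
  -e OPENAI_API_KEY=any_string_you_like \
  yidadaa/chatgpt-next-web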

Can't get session ID

Previously everything was normal and it would print this log line: System: Successfully refreshed session ID and token.

GPT response with empty content

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
    }'

and got an empty response:
data: {"id":"chatcmpl-**********","created":1713965171742,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":""},"index":0,"finish_reason":"stop"}]}

The returned content is empty

Using the test example:

curl http://ip:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
    }'

the returned content turns out to be empty:

data: {"id":"chatcmpl-9jKHVqNoeYznlG9q6iHJE7SkNQMM","created":1713580134677,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":""},"index":0,"finish_reason":"stop"}]}

Other apps calling this API also receive replies, but the content is empty. Has OpenAI added a new restriction?

Error Solutions

  • Try switching to a different server or IP address.
  • It's best not to deploy with Node, as it may be affected by the Node version. It is recommended to use Docker or Docker Compose.
  • You can try using a WARP tunnel for outbound traffic.

These methods are not guaranteed to work, but please give them a try.

Even a US IP is not guaranteed to succeed!

Tested and it works well, thank you very much

Wherever a configuration requires a key, you can fill in anything (even a key of mine that had already been revoked works), and set the endpoint to the address of your own deployment. It is very fast.

PS: previously the only option was to buy accounts, and they would be invalidated after a few days of use.

Python client cannot be used

The OpenAI Python client calls the model-list endpoint under base_url when connecting:

from openai import OpenAI

client = OpenAI(api_key="sk-fwref", base_url=base_url)
models = client.models.list()

but it never returns a response. How can I make this work with the Python client?
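
client.models.list() issues a GET request to /v1/models under the configured base_url; whether this service implements that route is not documented here, so you can probe it directly and, if it fails, skip the model listing and call the chat completions endpoint instead:

curl http://127.0.0.1:3040/v1/models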

Error getting a new session, please try again later

Why is this happening? I can't find a solution anywhere.

{
  "status": false,
  "error": {
    "message": "Error getting a new session, please try again later, if the issue persists, please open an issue on the GitHub repository.",
    "type": "invalid_request_error"
  }
}

Error refreshing session ID, retrying in 1 minute...

Error refreshing session ID, retrying in 1 minute...
If this error persists, your country may not be supported yet.
If your country was the issue, please consider using a U.S. VPN.
Request: POST /v1/chat/completions 2 messages (stream-disabled)

Hello, why is the result returned split into many separate elements?

I asked GPT to summarize an article, and every character comes back as a separate data event, which makes extracting the value awkward. Why is it returned this way? Is my request wrong? (A sketch for aggregating the streamed deltas follows the sample output below.)

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"阿"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"尔"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"玛"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"・"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"卡"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":"瑞"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-bjKI588Q3ykpHOEE33fcsW05hSfZ","created":1712407975839,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":","},"index":0,"finish_reason":null}]}

After trying for about 15 minutes, I got an error

Error getting a new session, please try again later, if the issue persists, please open an issue on the GitHub repository

Deployed the Docker Compose version on a VPS outside the firewall, and got the error {"status":false,"error":{"message":"An error happened, please make sure your request is SFW, or use a jailbreak to bypass the filter.","type":"invalid_request_error"}}

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
    }'

{"status":false,"error":{"message":"An error happened, please make sure your request is SFW, or use a jailbreak to bypass the filter.","type":"invalid_request_error"}}

An error happened, please make sure your request is SFW

{"status":false,"error":{"message":"An error happened, please make sure your request is SFW, or use a jailbreak to bypass the filter.","type":"invalid_request_error"}}

Thank you for your contribution, but I encountered some problems:

I just downloaded the source code, ran node app.js, made a curl request, and got this error.

Thanks~

axios->fetch

Hi, could you make a fetch-based version? Using axios triggers the Cloudflare check for me, but switching to fetch works fine; the reason is unknown.

No proxy function

I need the ability to set HTTP or SOCKS proxies. Thanks.

License Compliance Issue

Hello @missuo

I noticed that this repository contains code derived from my project, PawanOsman/ChatGPT, which is licensed under AGPL 3.0. It appears that this repository is licensed under MIT, which is not compatible with AGPL 3.0 terms. Could we update the license here to align with AGPL 3.0 to ensure compliance?

Thank you for your understanding and cooperation.

Best,
Pawan Osman

support proxy[http/socks5]

Request: POST /v1/chat/completions 2 messages (stream-enabled)
Error refreshing session ID, retrying in 1 minute...
If this error persists, your country may not be supported yet.
If your country was the issue, please consider using a U.S. VPN.

No longer receiving responses from API

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any_string_you_like" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
    }'
data: {"id":"chatcmpl-TnQ2dqZfFq0wFzHBehcHUxPv158X","created":1713909080880,"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"delta":{"content":""},"index":0,"finish_reason":"stop"}]}

I am getting finish_reason="stop", implying that the response is prematurely ending.

Multi-turn conversation returns a 404 error

bad_response_status_code bad response status code 404 (request id: 2024040906334616828660341240500)

invalid_request_error

After coming back to it after a while, I get this message:
{
  "status": false,
  "error": {
    "message": "An error happened, please make sure your request is SFW, or use a jailbreak to bypass the filter.",
    "type": "invalid_request_error"
  }
}

How can I keep the same conversation going? Is there an API for this?

I have single-turn chat working, but I cannot maintain a conversation. For example, if I ask GPT to produce 3 interview questions and then ask for the answers in the next request, it cannot answer based on the previous turn. Is there a parameter in the API for this? Thanks!
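
The chat completions API is stateless, so there is no session parameter; the conversation is maintained by resending the previous turns in the messages array on every request. A sketch of a follow-up request (the assistant content is a placeholder for the reply you received previously):

curl http://127.0.0.1:3040/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {"role": "user", "content": "Give me 3 interview questions."},
      {"role": "assistant", "content": "<the 3 questions returned previously>"},
      {"role": "user", "content": "Now give me the answers to those questions."}
    ],
    "stream": true
    }'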
