
vroid-dataset's Introduction

vroid-dataset

This repo downloads the VRoid 3D models dataset introduced in PAniC-3D: Stylized Single-view 3D Reconstruction from Portraits of Anime Characters. As described in that repo, this downloader populates ./_data/lustrous.

download

Download panic_data_models_merged.zip from the project's drive folder and merge it with this repo's file structure. There should then be a ./_data/lustrous/raw/vroid/metadata.json, which the following commands use to download the models. Note that metadata.json also contains all VRoid model attributions.
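For example, a minimal merge might look like the following (a sketch, assuming the zip was saved to the repo root and its folder layout matches this repo's):

# hypothetical merge; adjust paths if your zip lives elsewhere
unzip panic_data_models_merged.zip -d .
ls ./_data/lustrous/raw/vroid/metadata.json  # should exist after merging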

Then, get your VRoid Hub cookie by following these steps:

1) log in to https://hub.vroid.com/en/ in Chrome
2) open DevTools (F12)
3) go to Application > Cookies > https://hub.vroid.com/en/
4) copy the value of `_vroid_session`
5) copy the template, then replace the cookie value in `./_env/vroid_cookie.bashrc` with your new cookie:
cp ./_env/vroid_cookie_template.bashrc ./_env/vroid_cookie.bashrc
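After copying, the file should export your session cookie. A hypothetical example follows; the COOKIE variable name is inferred from the docker-compose warnings in the issues below, so check the template for the authoritative name:

# ./_env/vroid_cookie.bashrc (hypothetical contents)
export COOKIE='<your _vroid_session value>'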

Finally, build the container and run the scraper:

# build the container 
docker-compose build

# run the downloader
bash ./run.sh

citing

If you use our repo, please cite our work:

@inproceedings{chen2023panic3d,
    title={PAniC-3D: Stylized Single-view 3D Reconstruction from Portraits of Anime Characters},
    author={Chen, Shuhong and Zhang, Kevin and Shi, Yichun and Wang, Heng and Zhu, Yiheng and Song, Guoxian and An, Sizhe and Kristjansson, Janus and Yang, Xiao and Zwicker, Matthias},
    booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year={2023}
}

vroid-dataset's People

Contributors

royalcookie, shuhongchen

Stargazers

LordLiang, ZZSSCC, Jeff Sipko, ㅎㅎ, bitm, Fancomi, yuna0x0 (edisonlee55), universe, Taehoon Kim, HardBoileDon, Ash


Forkers

shiro132

vroid-dataset's Issues

ERROR: Couldn't connect to Docker daemon at http+docker://localhost - is it running?

Hello,
I'm running "docker-compose build" and I get these warnings and this error:

WARNING: The MAX_MODELS variable is not set. Defaulting to a blank string.
WARNING: The MODE variable is not set. Defaulting to a blank string.
WARNING: The COOKIE variable is not set. Defaulting to a blank string.
WARNING: The JSON_FILE variable is not set. Defaulting to a blank string.
ERROR: Couldn't connect to Docker daemon at http+docker://localhost - is it running?

If it's at a non-standard location, specify the URL with the DOCKER_HOST environment variable.

Can you please provide some advice? Thank you.
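For reference, a typical first check on Linux (not specific to this repo, and assuming a systemd-based distro) would be:

# verify the daemon is reachable
docker info
# start it if it isn't running
sudo systemctl start docker
# or, for a daemon at a non-standard location, point compose at it explicitly
export DOCKER_HOST=unix:///var/run/docker.sock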

How do I get the metadata JSON from VRoid Hub?

Following the tutorial, I set up the vroid_cookie.bashrc file, modified run.sh as instructed (export MODE=c), and then ran bash ./run.sh. I found that it only generates a datmodels.json under .\data that exactly mirrors .\_data\lustrous\raw\vroid\metadata.json, but I still cannot obtain the metadata.json for the VRoid Hub models I actually want. In the TERMINAL I also confirmed that the printed self.cookies is indeed the one I set, and that self.mode is c.

How should I set this up? Thank you!

'Connection aborted.', FileNotFoundError(2, 'No such file or directory')

I am getting an error, see below:
I'm using the code on macOS and only recently installed Docker; am I missing something needed to do the docker build?

Tutnyals-MacBook-Pro:vroid-dataset tutnyal$ docker-compose build
WARNING: The MAX_MODELS variable is not set. Defaulting to a blank string.
WARNING: The MODE variable is not set. Defaulting to a blank string.
WARNING: The COOKIE variable is not set. Defaulting to a blank string.
WARNING: The JSON_FILE variable is not set. Defaulting to a blank string.
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 703, in urlopen
    httplib_response = self._make_request(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 398, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connection.py", line 239, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1282, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1328, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1277, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1037, in _send_output
    self.send(msg)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 975, in send
    self.connect()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
FileNotFoundError: [Errno 2] No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/requests/adapters.py", line 489, in send
    resp = conn.urlopen(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 785, in urlopen
    retries = retries.increment(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 703, in urlopen
    httplib_response = self._make_request(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 398, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/urllib3/connection.py", line 239, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1282, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1328, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1277, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1037, in _send_output
    self.send(msg)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 975, in send
    self.connect()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/api/client.py", line 214, in _retrieve_server_version
    return self.version(api_version=False)["ApiVersion"]
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/api/daemon.py", line 181, in version
    return self._result(self._get(url), json=True)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/api/client.py", line 237, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/requests/sessions.py", line 600, in get
    return self.request("GET", url, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/requests/sessions.py", line 587, in request
    resp = self.send(prep, **send_kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/requests/sessions.py", line 701, in send
    r = adapter.send(request, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/requests/adapters.py", line 547, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.10/bin/docker-compose", line 8, in <module>
    sys.exit(main())
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/main.py", line 81, in main
    command_func()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/main.py", line 200, in perform_command
    project = project_from_options('.', options)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/command.py", line 60, in project_from_options
    return get_project(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/command.py", line 152, in get_project
    client = get_client(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/docker_client.py", line 41, in get_client
    client = docker_client(
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/compose/cli/docker_client.py", line 170, in docker_client
    client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/api/client.py", line 197, in __init__
    self._version = self._retrieve_server_version()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/docker/api/client.py", line 221, in _retrieve_server_version
    raise DockerException(
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
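The missing Unix socket in this traceback usually means the Docker daemon isn't running; on macOS the daemon lives inside Docker Desktop, so a plausible fix (assuming Docker Desktop is installed) is:

# launch Docker Desktop, then confirm the daemon responds
open -a Docker
docker info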

Dataset downloading problem

Thanks for this great work!
I wonder if there is another way to download the multi-view images of the VRoid 3D dataset without using the Docker container, maybe something like Google Drive or similar...
Thanks again!

How to make this work on Windows?

I finished docker-compose build and turned run.sh into 1.bat, but it didn't work:

"""
set MODE=d
set MAX_MODELS=20000
set JSON_FILE=D:\x\vroid-dataset-master_data\lustrous\raw\vroid\metadata.json
set COOKIE=600xxxxxxxxxx53

docker-compose up

docker-compose down
"""

time="2024-07-19T16:14:13+08:00" level=warning msg="D:\x\vroid-dataset-master\vroid-dataset-master\docker-compose.yml: version is obsolete"
[+] Running 2/2
✔ Network vroid-dataset-master_default Created 0.0s
✔ Container scrapy Created 0.1s
Attaching to scrapy
scrapy |
scrapy |
scrapy | ========== Initializing Spider ==========
scrapy | Loading Model List From File: D:\x\vroid-dataset-master_data\lustrous\raw\vroid\metadata.json
scrapy | Unhandled error in Deferred:
scrapy | 2024-07-19 08:14:13 [twisted] CRITICAL: Unhandled error in Deferred:
scrapy |
scrapy | Traceback (most recent call last):
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 192, in crawl
scrapy | return self._crawl(crawler, *args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 196, in _crawl
scrapy | d = crawler.crawl(*args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/twisted/internet/defer.py", line 1909, in unwindGenerator
scrapy | return _cancellableInlineCallbacks(gen) # type: ignore[unreachable]
scrapy | File "/usr/local/lib/python3.9/site-packages/twisted/internet/defer.py", line 1816, in _cancellableInlineCallbacks
scrapy | _inlineCallbacks(None, gen, status)
scrapy | --- ---
scrapy | File "/usr/local/lib/python3.9/site-packages/twisted/internet/defer.py", line 1661, in _inlineCallbacks
scrapy | result = current_context.run(gen.send, result)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 86, in crawl
scrapy | self.spider = self._create_spider(*args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 98, in _create_spider
scrapy | return self.spidercls.from_crawler(self, *args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/spiders/init.py", line 50, in from_crawler
scrapy | spider = cls(*args, **kwargs)
scrapy | File "/app/VRoidSpider/spiders/VRoidSpider.py", line 69, in init
scrapy | with open(json_file) as in_file:
scrapy | builtins.FileNotFoundError: [Errno 2] No such file or directory: 'D:\x\vroid-dataset-master\_data\lustrous\raw\vroid\metadata.json'
scrapy |
scrapy | 2024-07-19 08:14:13 [twisted] CRITICAL:
scrapy | Traceback (most recent call last):
scrapy | File "/usr/local/lib/python3.9/site-packages/twisted/internet/defer.py", line 1661, in _inlineCallbacks
scrapy | result = current_context.run(gen.send, result)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 86, in crawl
scrapy | self.spider = self._create_spider(*args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/crawler.py", line 98, in _create_spider
scrapy | return self.spidercls.from_crawler(self, *args, **kwargs)
scrapy | File "/usr/local/lib/python3.9/site-packages/scrapy/spiders/init.py", line 50, in from_crawler
scrapy | spider = cls(*args, **kwargs)
scrapy | File "/app/VRoidSpider/spiders/VRoidSpider.py", line 69, in init
scrapy | with open(json_file) as in_file:
scrapy | FileNotFoundError: [Errno 2] No such file or directory: 'D:\x\vroid-dataset-master\_data\lustrous\raw\vroid\metadata.json'
scrapy exited with code 1

D:\x\vroid-dataset-master>docker-compose down
time="2024-07-19T16:14:14+08:00" level=warning msg="D:\x\vroid-dataset-master\docker-compose.yml: version is obsolete"
[+] Running 2/2
✔ Container scrapy Removed 0.0s
✔ Network vroid-dataset-master_default Removed
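A plausible fix, inferred from the working log in the issue below (where the spider loads _data/lustrous/raw/vroid/metadata.json as a path inside the container): set JSON_FILE to the container-side path rather than the Windows host path, e.g.

:: hypothetical 1.bat change; the Linux container cannot see D:\ paths
set JSON_FILE=_data/lustrous/raw/vroid/metadata.json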

docker-compose build fails because the Python version is too new

Building the lxml wheel fails because the Python version is too new. I suggest pinning the Python version in the Dockerfile, e.g. FROM python:3.9, which makes the build succeed.
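A minimal sketch of that pin, assuming the Dockerfile's first line is FROM python:latest (inferred from the python:latest reference in the build log below; if it is just FROM python, edit accordingly):

# pin the base image so pip can build lxml 4.6.3
sed -i 's|^FROM python:latest|FROM python:3.9|' Dockerfile
docker-compose build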
Failed output:

WARN[0000] The "MAX_MODELS" variable is not set. Defaulting to a blank string.
WARN[0000] The "MODE" variable is not set. Defaulting to a blank string.
WARN[0000] The "COOKIE" variable is not set. Defaulting to a blank string.
WARN[0000] The "JSON_FILE" variable is not set. Defaulting to a blank string.
[+] Building 32.6s (8/8) FINISHED
 => [internal] load build definition from Dockerfile                                                               0.1s
 => => transferring dockerfile: 32B                                                                                0.0s
 => [internal] load .dockerignore                                                                                  0.0s
 => => transferring context: 2B                                                                                    0.0s
 => [internal] load metadata for docker.io/library/python:latest                                                   2.7s
 => [1/4] FROM docker.io/library/python@sha256:7adb2f6d6b0fdaf2d3029c42b5a40833589f969c18728f5b5b126a61394848b6    0.0s
 => [internal] load build context                                                                                  0.1s
 => => transferring context: 851B                                                                                  0.0s
 => CACHED [2/4] COPY . /app                                                                                       0.0s
 => CACHED [3/4] WORKDIR /app                                                                                      0.0s
 => ERROR [4/4] RUN pip install -r requirements.txt -i https://mirrors.ustc.edu.cn/pypi/web/simple/               29.8s
------
 > [4/4] RUN pip install -r requirements.txt -i https://mirrors.ustc.edu.cn/pypi/web/simple/:
#0 1.439 Looking in indexes: https://mirrors.ustc.edu.cn/pypi/web/simple/
#0 2.621 Collecting attrs==21.2.0
#0 2.919   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/20/a9/ba6f1cd1a1517ff022b35acd6a7e4246371dfab08b8e42b829b6d07913cc/attrs-21.2.0-py2.py3-none-any.whl (53 kB)
#0 2.967      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 53.7/53.7 kB 963.2 kB/s eta 0:00:00
#0 3.048 Collecting Automat==20.2.0
#0 3.193   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/dd/83/5f6f3c1a562674d65efc320257bdc0873ec53147835aeef7762fe7585273/Automat-20.2.0-py2.py3-none-any.whl (31 kB)
#0 3.293 Collecting certifi==2021.10.8
#0 3.333   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/37/45/946c02767aabb873146011e665728b680884cd8fe70dde973c640e45b775/certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
#0 3.395      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 149.2/149.2 kB 2.3 MB/s eta 0:00:00
#0 3.648 Collecting cffi==1.15.0
#0 3.711   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/00/9e/92de7e1217ccc3d5f352ba21e52398372525765b2e0c4530e6eb2ba9282a/cffi-1.15.0.tar.gz (484 kB)
#0 3.809      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 484.1/484.1 kB 4.9 MB/s eta 0:00:00
#0 3.860   Preparing metadata (setup.py): started
#0 4.286   Preparing metadata (setup.py): finished with status 'done'
#0 4.372 Collecting constantly==15.1.0
#0 4.406   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/b9/65/48c1909d0c0aeae6c10213340ce682db01b48ea900a7d9fce7a7910ff318/constantly-15.1.0-py2.py3-none-any.whl (7.9 kB)
#0 4.732 Collecting cryptography==35.0.0
#0 4.768   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/7b/1a/bf49bade5080a5cfb226a975c118fc56c3df2878b91809a5030dd87e551b/cryptography-35.0.0-cp36-abi3-manylinux_2_24_x86_64.whl (3.5 MB)
#0 6.166      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.5/3.5 MB 2.5 MB/s eta 0:00:00
#0 6.249 Collecting cssselect==1.1.0
#0 6.311   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/3b/d4/3b5c17f00cce85b9a1e6f91096e1cc8e8ede2e1be8e96b87ce1ed09e92c5/cssselect-1.1.0-py2.py3-none-any.whl (16 kB)
#0 6.407 Collecting h2==3.2.0
#0 6.442   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/25/de/da019bcc539eeab02f6d45836f23858ac467f584bfec7a526ef200242afe/h2-3.2.0-py2.py3-none-any.whl (65 kB)
#0 6.452      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.0/65.0 kB 6.8 MB/s eta 0:00:00
#0 6.531 Collecting hpack==3.0.0
#0 6.567   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/8a/cc/e53517f4a1e13f74776ca93271caef378dadec14d71c61c949d759d3db69/hpack-3.0.0-py2.py3-none-any.whl (38 kB)
#0 6.651 Collecting hyperframe==5.2.0
#0 6.686   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/19/0c/bf88182bcb5dce3094e2f3e4fe20db28a9928cb7bd5b08024030e4b140db/hyperframe-5.2.0-py2.py3-none-any.whl (12 kB)
#0 6.766 Collecting hyperlink==21.0.0
#0 6.802   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/6e/aa/8caf6a0a3e62863cbb9dab27135660acba46903b703e224f14f447e57934/hyperlink-21.0.0-py2.py3-none-any.whl (74 kB)
#0 6.811      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 74.6/74.6 kB 8.3 MB/s eta 0:00:00
#0 6.898 Collecting idna==3.3
#0 6.938   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/04/a2/d918dcd22354d8958fe113e1a3630137e0fc8b44859ade3063982eacd2a4/idna-3.3-py3-none-any.whl (61 kB)
#0 6.944      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.2/61.2 kB 12.8 MB/s eta 0:00:00
#0 7.022 Collecting incremental==21.3.0
#0 7.057   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/99/3b/4f80dd10cb716f3a9e22ae88f026d25c47cc3fdf82c2747f3d59c98e4ff1/incremental-21.3.0-py2.py3-none-any.whl (15 kB)
#0 7.140 Collecting itemadapter==0.4.0
#0 7.184   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/7f/a9/017e825977a8a6447ef3e3144c11fc70ffadb0d998ade8ed5419b9fa0447/itemadapter-0.4.0-py3-none-any.whl (10 kB)
#0 7.266 Collecting itemloaders==1.0.4
#0 7.314   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/b3/2b/eb2ddf7becf834679273a6f79ffdc6fbedf07c5272e2eddf412582143c0e/itemloaders-1.0.4-py3-none-any.whl (11 kB)
#0 7.403 Collecting jmespath==0.10.0
#0 7.448   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/07/cb/5f001272b6faeb23c1c9e0acc04d48eaaf5c862c17709d20e3469c6e0139/jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
#0 7.780 Collecting lxml==4.6.3
#0 7.831   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/e5/21/a2e4517e3d216f0051687eea3d3317557bde68736f038a3b105ac3809247/lxml-4.6.3.tar.gz (3.2 MB)
#0 9.093      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.2/3.2 MB 2.5 MB/s eta 0:00:00
#0 9.300   Preparing metadata (setup.py): started
#0 9.547   Preparing metadata (setup.py): finished with status 'done'
#0 9.637 Collecting parsel==1.6.0
#0 9.689   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/23/1e/9b39d64cbab79d4362cdd7be7f5e9623d45c4a53b3f7522cd8210df52d8e/parsel-1.6.0-py2.py3-none-any.whl (13 kB)
#0 9.775 Collecting priority==1.3.0
#0 9.815   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/de/96/2f4b8da7be255cd41e825c398efd11a6706ff86e66ae198f012204aa2a4f/priority-1.3.0-py2.py3-none-any.whl (11 kB)
#0 9.896 Collecting Protego==0.1.16
#0 9.962   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/db/6e/bf6d5e4d7cf233b785719aaec2c38f027b9c2ed980a0015ec1a1cced4893/Protego-0.1.16.tar.gz (3.2 MB)
#0 11.26      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.2/3.2 MB 2.5 MB/s eta 0:00:00
#0 11.79   Preparing metadata (setup.py): started
#0 12.02   Preparing metadata (setup.py): finished with status 'done'
#0 12.15 Collecting pyasn1==0.4.8
#0 12.19   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/62/1e/a94a8d635fa3ce4cfc7f506003548d0a2447ae76fd5ca53932970fe3053f/pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
#0 12.19      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 77.1/77.1 kB 20.5 MB/s eta 0:00:00
#0 12.29 Collecting pyasn1-modules==0.2.8
#0 12.35   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/95/de/214830a981892a3e286c3794f41ae67a4495df1108c3da8a9f62159b9a9d/pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
#0 12.36      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 kB 21.0 MB/s eta 0:00:00
#0 12.44 Collecting pycparser==2.20
#0 12.48   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/ae/e7/d9c3a176ca4b02024debf82342dab36efadfc5776f9c8db077e8f6e71821/pycparser-2.20-py2.py3-none-any.whl (112 kB)
#0 12.49      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.0/112.0 kB 15.4 MB/s eta 0:00:00
#0 12.61 Collecting PyDispatcher==2.0.5
#0 12.66   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/f5/6f/17cee8b82ea6f6938052133dfa06384da73407d8b13c5b83ea9010136509/PyDispatcher-2.0.5.zip (47 kB)
#0 12.67      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 47.6/47.6 kB 5.9 MB/s eta 0:00:00
#0 12.68   Preparing metadata (setup.py): started
#0 12.83   Preparing metadata (setup.py): finished with status 'done'
#0 12.94 Collecting pyOpenSSL==21.0.0
#0 12.98   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/85/3a/fe3c98435856a1ed798977981f3da82d2685cf9df97e4d9546340d2b83db/pyOpenSSL-21.0.0-py2.py3-none-any.whl (55 kB)
#0 12.99      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.1/55.1 kB 13.0 MB/s eta 0:00:00
#0 13.07 Collecting queuelib==1.6.2
#0 13.11   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/06/1e/9e3bfb6a10253f5d95acfed9c5732f4abc2ef87bdf985594ddfb99d222da/queuelib-1.6.2-py2.py3-none-any.whl (13 kB)
#0 13.21 Collecting Scrapy==2.5.1
#0 13.29   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/22/26/497ef936b54ae00e24694934cb25d524061722bb0d8582da33430e3e7608/Scrapy-2.5.1-py2.py3-none-any.whl (254 kB)
#0 13.32      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 255.0/255.0 kB 9.9 MB/s eta 0:00:00
#0 13.40 Collecting service-identity==21.1.0
#0 13.44   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/93/5a/5e93f280ec7be676b5a57f305350f439d31ced168bca04e6ffa64b575664/service_identity-21.1.0-py2.py3-none-any.whl (12 kB)
#0 13.53 Collecting six==1.16.0
#0 13.57   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/d9/5a/e7c31adbe875f2abbb91bd84cf2dc52d792b5a01506781dbcf25c91daf11/six-1.16.0-py2.py3-none-any.whl (11 kB)
#0 13.70 Collecting tqdm==4.62.3
#0 13.76   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/63/f3/b7a1b8e40fd1bd049a34566eb353527bb9b8e9b98f8b6cf803bb64d8ce95/tqdm-4.62.3-py2.py3-none-any.whl (76 kB)
#0 13.77      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 76.2/76.2 kB 10.3 MB/s eta 0:00:00
#0 14.15 Collecting Twisted==21.7.0
#0 14.24   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/b5/0c/7a1a943edce164c77c98f044175d801b572bb36936e9b4d5805a850525e7/Twisted-21.7.0-py3-none-any.whl (3.1 MB)
#0 15.42      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.6 MB/s eta 0:00:00
#0 15.52 Collecting typing-extensions==3.10.0.2
#0 15.62   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/74/60/18783336cc7fcdd95dae91d73477830aa53f5d3181ae4fe20491d7fc3199/typing_extensions-3.10.0.2-py3-none-any.whl (26 kB)
#0 15.71 Collecting w3lib==1.22.0
#0 15.78   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/a3/59/b6b14521090e7f42669cafdb84b0ab89301a42f1f1a82fcf5856661ea3a7/w3lib-1.22.0-py2.py3-none-any.whl (20 kB)
#0 16.07 Collecting zope.interface==5.4.0
#0 16.11   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/ae/58/e0877f58daa69126a5fb325d6df92b20b77431cd281e189c5ec42b722f58/zope.interface-5.4.0.tar.gz (249 kB)
#0 16.14      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 249.3/249.3 kB 11.9 MB/s eta 0:00:00
#0 16.16   Preparing metadata (setup.py): started
#0 16.34   Preparing metadata (setup.py): finished with status 'done'
#0 16.47 Collecting Twisted[http2]>=17.9.0
#0 16.52   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/ac/63/b5540d15dfeb7388fbe12fa55a902c118fd2b324be5430cdeac0c0439489/Twisted-22.10.0-py3-none-any.whl (3.1 MB)
#0 17.72      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.6 MB/s eta 0:00:00
#0 17.91 Requirement already satisfied: setuptools in /usr/local/lib/python3.11/site-packages (from zope.interface==5.4.0->-r requirements.txt (line 34)) (65.5.1)
#0 18.08   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/86/83/e066b4a37e184906ba76e5d3a54a20d893b13d2fac4b35ab5b2545096111/Twisted-22.8.0-py3-none-any.whl (3.1 MB)
#0 19.28      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.6 MB/s eta 0:00:00
#0 19.43   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/db/99/38622ff95bb740bcc991f548eb46295bba62fcb6e907db1987c4d92edd09/Twisted-22.4.0-py3-none-any.whl (3.1 MB)
#0 20.68      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.5 MB/s eta 0:00:00
#0 20.84   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/2b/ba/17bff991d6f1df1d123611ef2a0e72636ae400e7e75eddbe85b944567d0a/Twisted-22.2.0-py3-none-any.whl (3.1 MB)
#0 21.97      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.7 MB/s eta 0:00:00
#0 22.17   Downloading https://mirrors.bfsu.edu.cn/pypi/web/packages/7c/2b/df1c552adff67d9ec348295a76b6496178d6ca0dc6a033fd8ee681accdea/Twisted-22.1.0-py3-none-any.whl (3.1 MB)
#0 23.35      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 2.6 MB/s eta 0:00:00
#0 23.56 Building wheels for collected packages: cffi, lxml, Protego, PyDispatcher, zope.interface
#0 23.56   Building wheel for cffi (setup.py): started
#0 26.29   Building wheel for cffi (setup.py): finished with status 'done'
#0 26.30   Created wheel for cffi: filename=cffi-1.15.0-cp311-cp311-linux_x86_64.whl size=440176 sha256=5c7d1a48955c67a36578c15d1ff1b00a7601214594927251befc3ba02be112e9
#0 26.30   Stored in directory: /root/.cache/pip/wheels/70/a1/90/a88f41a070f4fc2f411be6592c6eca3d2301dc6bb7307e6de3
#0 26.30   Building wheel for lxml (setup.py): started
#0 26.60   Building wheel for lxml (setup.py): finished with status 'error'
#0 26.61   error: subprocess-exited-with-error
#0 26.61
#0 26.61   × python setup.py bdist_wheel did not run successfully.
#0 26.61   │ exit code: 1
#0 26.61   ╰─> [86 lines of output]
#0 26.61       Building lxml version 4.6.3.
#0 26.61       Building without Cython.
#0 26.61       Building against libxml2 2.9.10 and libxslt 1.1.34
#0 26.61       running bdist_wheel
#0 26.61       running build
#0 26.61       running build_py
#0 26.61       creating build
#0 26.61       creating build/lib.linux-x86_64-cpython-311
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/sax.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/cssselect.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/builder.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/diff.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/clean.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/builder.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/defs.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron
#0 26.61       copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron
#0 26.61       copying src/lxml/etree.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/etree_api.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 26.61       copying src/lxml/includes/__init__.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/rng
#0 26.61       copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/rng
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 26.61       copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 26.61       copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 26.61       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 26.61       running build_ext
#0 26.61       building 'lxml.etree' extension
#0 26.61       creating build/temp.linux-x86_64-cpython-311
#0 26.61       creating build/temp.linux-x86_64-cpython-311/src
#0 26.61       creating build/temp.linux-x86_64-cpython-311/src/lxml
#0 26.61       gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DCYTHON_CLINE_IN_TRACEBACK=0 -I/usr/include/libxml2 -Isrc -Isrc/lxml/includes -I/usr/local/include/python3.11 -c src/lxml/etree.c -o build/temp.linux-x86_64-cpython-311/src/lxml/etree.o -w
#0 26.61       src/lxml/etree.c:289:12: fatal error: longintrepr.h: No such file or directory
#0 26.61         289 |   #include "longintrepr.h"
#0 26.61             |            ^~~~~~~~~~~~~~~
#0 26.61       compilation terminated.
#0 26.61       Compile failed: command '/usr/bin/gcc' failed with exit code 1
#0 26.61       creating tmp
#0 26.61       cc -I/usr/include/libxml2 -I/usr/include/libxml2 -c /tmp/xmlXPathInitugx5l0_u.c -o tmp/xmlXPathInitugx5l0_u.o
#0 26.61       cc tmp/xmlXPathInitugx5l0_u.o -lxml2 -o a.out
#0 26.61       error: command '/usr/bin/gcc' failed with exit code 1
#0 26.61       [end of output]
#0 26.61
#0 26.61   note: This error originates from a subprocess, and is likely not a problem with pip.
#0 26.61   ERROR: Failed building wheel for lxml
#0 26.61   Running setup.py clean for lxml
#0 26.79   Building wheel for Protego (setup.py): started
#0 27.11   Building wheel for Protego (setup.py): finished with status 'done'
#0 27.12   Created wheel for Protego: filename=Protego-0.1.16-py3-none-any.whl size=7759 sha256=64e2a47c29beb81fe619b820fcd9ab3052bc6638e1373c7a6f3adfab6ab08ea7
#0 27.12   Stored in directory: /root/.cache/pip/wheels/5f/8b/d3/e6ebcba02692360181eab0fd7adece9e130a4a5210168533cc
#0 27.12   Building wheel for PyDispatcher (setup.py): started
#0 27.31   Building wheel for PyDispatcher (setup.py): finished with status 'done'
#0 27.31   Created wheel for PyDispatcher: filename=PyDispatcher-2.0.5-py3-none-any.whl size=11517 sha256=207389ecfb4f534afc3ef52479d76547c0e9bc43e77152de9a56ab3861905922
#0 27.31   Stored in directory: /root/.cache/pip/wheels/4b/8f/cf/5cf8ec1ae6c7fd36ea5bbd2925413989f297433857bdb71f7c
#0 27.32   Building wheel for zope.interface (setup.py): started
#0 27.94   Building wheel for zope.interface (setup.py): finished with status 'done'
#0 27.94   Created wheel for zope.interface: filename=zope.interface-5.4.0-cp311-cp311-linux_x86_64.whl size=254093 sha256=829ac7e0ee8ef0509c29e588b6abc0e57da5babaca6cd30db7a52de8437a5318
#0 27.94   Stored in directory: /root/.cache/pip/wheels/06/46/df/9ecdafc8bfa9f09a9d27a8ea7cca3c04296fa6d9697939f442
#0 27.94 Successfully built cffi Protego PyDispatcher zope.interface
#0 27.94 Failed to build lxml
#0 28.16 Installing collected packages: typing-extensions, PyDispatcher, pyasn1, priority, incremental, hyperframe, hpack, constantly, certifi, zope.interface, tqdm, six, queuelib, pycparser, pyasn1-modules, lxml, jmespath, itemadapter, idna, h2, cssselect, attrs, w3lib, Protego, hyperlink, cffi, Automat, Twisted, parsel, cryptography, service-identity, pyOpenSSL, itemloaders, Scrapy
#0 28.74   Running setup.py install for lxml: started
#0 29.05   Running setup.py install for lxml: finished with status 'error'
#0 29.06   error: subprocess-exited-with-error
#0 29.06
#0 29.06   × Running setup.py install for lxml did not run successfully.
#0 29.06   │ exit code: 1
#0 29.06   ╰─> [87 lines of output]
#0 29.06       Building lxml version 4.6.3.
#0 29.06       Building without Cython.
#0 29.06       Building against libxml2 2.9.10 and libxslt 1.1.34
#0 29.06       running install
#0 29.06       /usr/local/lib/python3.11/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
#0 29.06         warnings.warn(
#0 29.06       running build
#0 29.06       running build_py
#0 29.06       creating build
#0 29.06       creating build/lib.linux-x86_64-cpython-311
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/sax.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/cssselect.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/builder.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/diff.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/clean.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/builder.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/defs.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-cpython-311/lxml/html
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron
#0 29.06       copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron
#0 29.06       copying src/lxml/etree.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/etree_api.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-cpython-311/lxml
#0 29.06       copying src/lxml/includes/__init__.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-cpython-311/lxml/includes
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/rng
#0 29.06       copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/rng
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 29.06       copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 29.06       copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl
#0 29.06       creating build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-cpython-311/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
#0 29.06       running build_ext
#0 29.06       building 'lxml.etree' extension
#0 29.06       creating build/temp.linux-x86_64-cpython-311
#0 29.06       creating build/temp.linux-x86_64-cpython-311/src
#0 29.06       creating build/temp.linux-x86_64-cpython-311/src/lxml
#0 29.06       gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DCYTHON_CLINE_IN_TRACEBACK=0 -I/usr/include/libxml2 -Isrc -Isrc/lxml/includes -I/usr/local/include/python3.11 -c src/lxml/etree.c -o build/temp.linux-x86_64-cpython-311/src/lxml/etree.o -w
#0 29.06       src/lxml/etree.c:289:12: fatal error: longintrepr.h: No such file or directory
#0 29.06         289 |   #include "longintrepr.h"
#0 29.06             |            ^~~~~~~~~~~~~~~
#0 29.06       compilation terminated.
#0 29.06       Compile failed: command '/usr/bin/gcc' failed with exit code 1
#0 29.06       cc -I/usr/include/libxml2 -I/usr/include/libxml2 -c /tmp/xmlXPathInit1r818fy1.c -o tmp/xmlXPathInit1r818fy1.o
#0 29.06       cc tmp/xmlXPathInit1r818fy1.o -lxml2 -o a.out
#0 29.06       error: command '/usr/bin/gcc' failed with exit code 1
#0 29.06       [end of output]
#0 29.06
#0 29.06   note: This error originates from a subprocess, and is likely not a problem with pip.
#0 29.06 error: legacy-install-failure
#0 29.06
#0 29.06 × Encountered error while trying to install package.
#0 29.06 ╰─> lxml
#0 29.06
#0 29.06 note: This is an issue with the package mentioned above, not pip.
#0 29.06 hint: See above for output from the failure.
#0 29.48
#0 29.48 [notice] A new release of pip available: 22.3.1 -> 23.0.1
#0 29.48 [notice] To update, run: pip install --upgrade pip
------
failed to solve: executor failed running [/bin/sh -c pip install -r requirements.txt -i https://mirrors.ustc.edu.cn/pypi/web/simple/]: exit code: 1

Cannot download the 3D models

Thank you for the great work!

But I have some problems downloading the 3D models. Specifically, I configured Docker and ran the corresponding commands, and scrapy gives me these responses:

$ sudo bash ./run.sh
Creating network "vroid-dataset_default" with the default driver
Creating scrapy ... done
Attaching to scrapy
scrapy    | 
scrapy    | 
scrapy    | ==========  Initializing Spider ==========
scrapy    | Loading Model List From File: _data/lustrous/raw/vroid/metadata.json
scrapy    | Loaded 16029 Models From File
scrapy    | Max Model Count:  20000
scrapy    | Max Duplicate Models Found:  50
scrapy    | Spider Initialized................
scrapy    | 
scrapy    | 
scrapy    | 
scrapy    | 
scrapy    | 
scrapy    | ========== Finished Crawling ==========
scrapy    | 
scrapy    | 
scrapy    | Models Collected: 
scrapy    | 16029
scrapy exited with code 0
Removing scrapy ... done
Removing network vroid-dataset_default

However, I cannot find any downloaded 3D model in the current directory, only two new folders:

  1. a data folder, which contains a JSON file datmodels.json
  2. a files folder, which is empty

There are also no additional files besides metadata.json in ./_data/lustrous/raw/vroid.
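A hedged guess based on the other issues on this page: MODE=c seems to only collect metadata into data/datmodels.json, so downloading likely requires the variables run.sh exports (names taken from the docker-compose warnings and the Windows .bat above; check run.sh for the authoritative names) to look something like:

# hypothetical run.sh settings
export MODE=d                                            # d appears to download models, c to collect metadata
export MAX_MODELS=20000
export JSON_FILE=_data/lustrous/raw/vroid/metadata.json
export COOKIE='<your _vroid_session value>'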
