gair-nlp / factool
FacTool: Factuality Detection in Generative AI
Home Page: https://ethanc111.github.io/factool_website/
License: Apache License 2.0
I'm looking for a way to detect fake news articles.
As in title. It's also present in the article.
The above issue was found recently.
How are the claim-level accuracy, precision, recall, and other metrics calculated? The claims extracted by the model will surely differ from the annotated ones, so it seems hard to compute these metrics without manual evaluation.
Is the calculation of these metrics based on the ground truth annotated in the dataset? My understanding: the claims are given by default, and the correctness of each claim is known, corresponding to a label in the dataset; evidence is collected for each given claim, and the claim is then verified and checked for consistency with the ground truth to perform factual verification. I'm not sure whether this understanding is correct. Is this part of the code missing from the repo?
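If ground-truth labels are indeed aligned to the given claims, the claim-level metrics can be computed directly from the predicted and gold labels. Below is a minimal sketch (a hypothetical helper, not code from this repo), assuming each claim carries a boolean prediction and a boolean gold label:

```python
def claim_metrics(predicted, gold):
    """Claim-level metrics. `predicted` and `gold` are equal-length
    lists of booleans, True meaning the claim is judged factual."""
    assert len(predicted) == len(gold)
    tp = sum(p and g for p, g in zip(predicted, gold))
    tn = sum(not p and not g for p, g in zip(predicted, gold))
    fp = sum(p and not g for p, g in zip(predicted, gold))
    fn = sum(not p and g for p, g in zip(predicted, gold))
    accuracy = (tp + tn) / len(gold)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

Matching model-extracted claims to annotated claims is the hard part the question raises; the sketch only covers the arithmetic once that alignment exists.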
Hello,
I'm experiencing an error after following the installation guidelines for FacTool and using the code from example.py. Despite trying both recommended installation methods and ensuring all three API keys are correctly defined, the tool is not functioning as expected. Could you please assist in resolving this issue?
I cannot find any related sources...
Hello, I encountered the following error.
0
1 retry left...
Traceback (most recent call last):
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 980, in _wrap_create_connection
return await self._loop.create_connection(*args, **kwargs) # type: ignore[return-value] # noqa
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\base_events.py", line 1070, in create_connection
raise exceptions[0]
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\base_events.py", line 1054, in create_connection
sock = await self._connect_sock(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\base_events.py", line 963, in _connect_sock
await self.sock_connect(sock, address)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\proactor_events.py", line 709, in sock_connect
return await self._proactor.connect(sock, address)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\windows_events.py", line 821, in _poll
value = callback(transferred, key, ov)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\windows_events.py", line 608, in finish_connect
ov.getresult()
OSError: [WinError 121] The semaphore timeout period has expired
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\openai\api_requestor.py", line 672, in arequest_raw
result = await session.request(**request_kwargs)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\client.py", line 536, in _request
conn = await self._connector.connect(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 540, in connect
proto = await self._create_connection(req, traces, timeout)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 901, in _create_connection
_, proto = await self._create_direct_connection(req, traces, timeout)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 1206, in _create_direct_connection
raise last_exc
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 1175, in _create_direct_connection
transp, proto = await self._wrap_create_connection(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\aiohttp\connector.py", line 988, in _wrap_create_connection
raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host api.openai.com:443 ssl:default [The semaphore timeout period has expired]
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "c:\Users\FanYaxin\OneDrive\桌面\factool_test.py", line 23, in <module>
response_list = factool_instance.run(inputs)
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\factool.py", line 55, in run
batch_results = asyncio.run(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\runners.py", line 44, in run
return loop.run_until_complete(main)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\asyncio\base_events.py", line 649, in run_until_complete
return future.result()
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\knowledge_qa\pipeline.py", line 100, in run_with_tool_api_call
claims_in_responses, queries_in_responses, evidences_in_responses, verifications_in_responses = await self.run_with_tool_live(responses[batch_start:batch_end])
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\knowledge_qa\pipeline.py", line 63, in run_with_tool_live
claims_in_responses = await self._claim_extraction(responses)
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\knowledge_qa\pipeline.py", line 38, in _claim_extraction
return await self.chat.async_run(messages_list, List)
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\utils\openai_wrapper.py", line 109, in async_run
predictions = await self.dispatch_openai_requests(
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\utils\openai_wrapper.py", line 96, in dispatch_openai_requests
return await asyncio.gather(*async_responses)
File "c:\users\fanyaxin\onedrive\文档\vscode\2023\factool\factool\utils\openai_wrapper.py", line 66, in _request_with_retry
response = await openai.ChatCompletion.acreate(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\openai\api_resources\chat_completion.py", line 45, in acreate
return await super().acreate(*args, **kwargs)
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 217, in acreate
response, _, api_key = await requestor.arequest(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\openai\api_requestor.py", line 372, in arequest
result = await self.arequest_raw(
File "C:\Users\FanYaxin\AppData\Local\anaconda3\lib\site-packages\openai\api_requestor.py", line 689, in arequest_raw
raise error.APIConnectionError("Error communicating with OpenAI") from e
openai.error.APIConnectionError: Error communicating with OpenAI
I ran the following code directly and it works fine:
import openai
messages = [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello!"}]
response = openai.ChatCompletion.create(model='gpt-3.5-turbo', messages=messages, max_tokens=2000, temperature=0.5)
print(response)
I also tried the following solutions and they all failed.
https://zhuanlan.zhihu.com/p/611080662
https://blog.csdn.net/weixin_43937790/article/details/131121974
So, how can I solve this problem?
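Since a bare synchronous call works but FacTool's async requests time out, the failure may be in network routing to api.openai.com from the async client. One possible workaround, assuming the legacy openai-python (<1.0) client that this traceback shows and a hypothetical local proxy address, is to route requests through a proxy:

```python
import openai

# Hypothetical proxy endpoint; replace with one reachable from your
# network. The legacy openai client honours openai.proxy on both the
# sync (requests) and async (aiohttp) request paths.
openai.proxy = "http://127.0.0.1:7890"
```

This is only a sketch of one common cause (a blocked or unstable direct route to the API); if no proxy is involved on your network, the timeout may instead be a firewall or DNS issue.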
Is that 0-shot and 3-shot CoT?
Judging whether a text is factual or not can be challenging when the provided text is ambiguous. We should let users define or choose their own standards.
It would be great to know the cost estimate prior to running the whole workload
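A rough pre-run estimate could be derived from input sizes alone. The sketch below is purely illustrative and not based on FacTool internals: it approximates tokens as len(text)/4 (a common heuristic for English) and uses placeholder values for calls per input, prompt overhead, and price:

```python
def estimate_cost(inputs, calls_per_input=4, overhead_tokens=500,
                  price_per_1k_tokens=0.002):
    """Very rough upper-bound cost estimate in dollars.
    `inputs` is a list of {"prompt": ..., "response": ...} dicts,
    as passed to Factool.run(). All defaults are illustrative."""
    total_tokens = 0.0
    for item in inputs:
        text = item["prompt"] + item["response"]
        approx_tokens = len(text) / 4 + overhead_tokens
        total_tokens += approx_tokens * calls_per_input
    return total_tokens / 1000 * price_per_1k_tokens
```

A real estimator would count tokens with the model's tokenizer and use current API pricing; the point is only that the inputs are known before the workload runs, so an estimate is feasible.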
Hi! I'm looking for the code you used for calculating ROUGE and BertScore on the RoSE dataset. Is it available by chance?
Thanks!
The current handling of env variables in the gradio app (app.py) should be improved in this PR: c920336
def fact_check(openai_api_key, serper_api_key, scraper_api_key, model, message, response, category):
    os.environ['SCRAPER_API_KEY'] = ''
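One way to improve this (a hypothetical refactor, not the code in app.py) is to override an environment variable only when the user actually supplied a value, so keys already exported in the shell are not clobbered with empty strings:

```python
import os

def set_api_keys(openai_api_key, serper_api_key, scraper_api_key):
    """Set API-key env vars from user-supplied values, skipping
    None / empty strings so existing env values survive."""
    for name, value in [
        ("OPENAI_API_KEY", openai_api_key),
        ("SERPER_API_KEY", serper_api_key),
        ("SCRAPER_API_KEY", scraper_api_key),
    ]:
        if value:  # only override when something was actually entered
            os.environ[name] = value
```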
If I run the example code for KBQA in a jupyter notebook I get the following error message:
"RuntimeError: asyncio.run() cannot be called from a running event loop"
Any idea what the issue might be / how to fix it? Thanks!
---- (executed code) -----
from factool import Factool

factool_instance = Factool("gpt-4")

inputs = [
    {
        "prompt": "Introduce Graham Neubig",
        "response": "Graham Neubig is a professor at MIT",
        "category": "kbqa"
    },
]

response_list = factool_instance.run(inputs)
print(response_list)
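The error occurs because Jupyter already runs an asyncio event loop, and factool.run() calls asyncio.run() internally, which refuses to start inside a running loop. The usual fix is the third-party nest_asyncio package (pip install nest_asyncio, then call nest_asyncio.apply() before running factool). A dependency-free alternative, sketched below as a hypothetical helper that is not part of factool, is to run the coroutine on a fresh loop in a worker thread:

```python
import asyncio
import concurrent.futures

def run_sync(coro):
    """Run `coro` to completion whether or not a loop is already running."""
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        # No loop running (plain script): the normal path works.
        return asyncio.run(coro)
    # A loop is already running (e.g. Jupyter): start a fresh loop
    # on a worker thread and wait for its result.
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        return pool.submit(asyncio.run, coro).result()
```

With a helper like this, the blocking call inside the notebook would be wrapped rather than invoked directly; nest_asyncio remains the simpler fix when installing a package is acceptable.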
Hi there!
Looks like a fantastic project!
I was wondering how I can add a text file (or a folder of files) and have FacTool check claims based only on the information in those files.
For example, I'd like to ask questions based on my notes,
or ask questions based on the transcript of a YouTube video and get answers based on what is claimed there,
or answer questions based on a pre-written FAQ.
Ultimately it would be nice to easily toggle which knowledge source to use: open web, local files, or both (to cross-correlate), depending on what is desired.
Let me know how I can do this with the current state of the code,
and hopefully easy flag parameters and a source store will come soon.
Thanks a lot and all the best!
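FacTool's knowledge-QA pipeline currently gathers evidence via web search, so grounding answers in local files would mean swapping out that retrieval step. As a purely illustrative sketch of where local documents could plug in (a hypothetical helper using naive word overlap, not part of factool), one could rank paragraphs from a folder of .txt notes against a query:

```python
from pathlib import Path

def local_evidence(query, notes_dir, top_k=3):
    """Return the top_k paragraphs from *.txt files in notes_dir,
    ranked by word overlap with the query. A real replacement for
    the web-search step would use proper embeddings; this only shows
    the shape of a local evidence source."""
    query_words = set(query.lower().split())
    scored = []
    for path in Path(notes_dir).glob("*.txt"):
        for para in path.read_text(encoding="utf-8").split("\n\n"):
            overlap = len(query_words & set(para.lower().split()))
            if overlap:
                scored.append((overlap, para.strip()))
    scored.sort(key=lambda s: s[0], reverse=True)
    return [para for _, para in scored[:top_k]]
```

Feeding such locally retrieved paragraphs into the claim-verification step, instead of search-engine snippets, would restrict verification to the user's own documents.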
https://github.com/GAIR-NLP/factool/blob/main/factool/factool.py#L106
For code responses, line 106 might fail.
So far, we can integrate factool into ChatGPT through a ChatGPT plugin, which is good. It would be better if factool could also be easily integrated into other open-source chatbots.
Specifically, we may need to consider the following two situations:
(1) how to introduce factool into chatbots without plugin functionality
(2) how to introduce factool into plugin-enabled chatbots
Why is this error reported? I set scraperapi='' when evaluating kbqa:
requests.exceptions.ConnectTimeout: HTTPConnectionPool(host='api.scraperapi.com', port=80): Max retries exceeded with url: /account?api_key=%27%27 (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f33556f3490>, 'Connection to api.scraperapi.com timed out. (connect timeout=None)'))