Comments (4)
Thank you for catching this! This is indeed an oversight on our end.
Currently, we are observing minor improvements in both the Llama3-8B and Llama3-70B models, which do not affect their positions on the leaderboard. We will verify the results and address this issue in the next release by updating the handler and adjusting the leaderboard scores accordingly. This update will ensure that the Llama handler's prompt template aligns with the official one.
from gorilla.
No problem! Awesome -- thanks for your great work on this benchmark.
I have another (mostly unrelated) question as well -- are the benchmark numbers reported when running with temperature=0? I was unable to reproduce the scores for Llama-3-8B-Instruct and noticed the default temperature setting was 0.7.
We are using temperature = 0.7 for the reported Llama3 numbers. If you can share more details on how your setup differs, we can investigate why you are unable to reproduce the scores.
If we are sampling but not seeding, then results will naturally not be deterministic. There is some unavoidable non-determinism that vLLM introduces, but I imagine most of the discrepancy stems from sampling with non-zero temperature.
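The point above can be illustrated with a minimal, self-contained sketch (plain Python, not vLLM's actual API): temperature-scaled softmax sampling is stochastic when unseeded, while temperature = 0 reduces to greedy argmax decoding and is therefore deterministic. The logits and function name here are hypothetical, chosen only to demonstrate the effect.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Sample a next-token index from logits at the given temperature.

    temperature == 0 is treated as greedy decoding (argmax),
    which is fully deterministic; any temperature > 0 samples
    from the temperature-scaled softmax distribution.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax with temperature scaling (shift by max for numerical stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    weights = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=weights)[0]

logits = [2.0, 1.5, 0.5]  # hypothetical next-token logits

# Greedy (temperature = 0): the same token on every run, regardless of RNG.
greedy = {sample_token(logits, 0, random.Random()) for _ in range(100)}

# temperature = 0.7 with an unseeded RNG: outputs vary across runs.
sampled = {sample_token(logits, 0.7, random.Random()) for _ in range(100)}
```

With 100 draws at temperature 0.7, `sampled` almost surely contains more than one distinct token, whereas `greedy` contains exactly one; fixing a seed on the RNG would make the sampled run repeatable as well, which is why unseeded non-zero-temperature runs are hard to reproduce exactly.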
Related Issues (20)
- [RAFT] Publish Pypi package with raft, eval and format scripts
- [Apibench] Resume interrupted LLM generations from last generation
- [BFCL] Get rid of legacy naming convention for LLM generated files
- [BFCL] Sanity check should be optional and by default off HOT 2
- [bug] OpenFunctions-v2: how to continue conversation? HOT 1
- [BFCL] Inconsistency in leaderboard scores HOT 2
- Question about AST evaluation for Java HOT 3
- Java/Javascript Scores HOT 1
- LeaderBoard data generation HOT 1
- Set Model Temperature to 0 for Consistent Leaderboard Results HOT 1
- BFCL setup instruction is very difficult to follow
- Clarify Documentation About Running The Benchmark
- Single Source of Truth
- Questions about the evaluation criteria. HOT 3
- [Apibench] No module named 'tree_sitter_java' HOT 2
- Evaluation using vLLM and other tools
- Test data error in executable parallel multiple function HOT 2
- distutils.errors.CompileError: command '/usr/bin/cc' failed with exit code 1
- [bug] Hosted Gorilla: <Issue> HOT 1
- LangChain Integration of Gorilla OpenFunctions-v2 HOT 2