- Java 8+
- Python 3.8+
- Infer: version 1.1.0 ([Infer Static Analyzer](https://fbinfer.com/))
- Ollama ([ollama/ollama](https://github.com/ollama/ollama))

Install the required model in Ollama:

```
ollama pull codellama
```
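Before running anything, it can help to confirm that the tools above are on your `PATH`. A minimal sketch (the binary names below are assumptions based on the list above; adjust them if your installations differ):

```python
import shutil

def missing_tools(tools):
    """Return the subset of the given tool names not found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

if __name__ == "__main__":
    # Assumed binary names for the prerequisites listed above.
    required = ["java", "javac", "python3", "infer", "ollama"]
    gone = missing_tools(required)
    if gone:
        print("Missing tools:", ", ".join(gone))
    else:
        print("All required tools found.")
```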
Required files:

```
prompt_path
│   problem.txt - the problem used to prompt the LLM
```
Steps:

1. Run the Python script in init mode. The argument `--prompt_path` should be the directory containing your `problem.txt`:

   ```
   python main.py --prompt_path codebase/example --mode init
   ```

   After this command the script creates a `completion.txt` file in the prompt_path directory containing the LLM's completion.
2. Extract the code from `completion.txt` and save it into a `Solution.java` file in the prompt_path directory so that Infer can analyze it.
3. Run Infer on `Solution.java`:

   ```
   infer run -- javac codebase/example/Solution.java
   ```
4. If Infer reports errors, organize them and put them into a file called `infer_output.txt` in the prompt_path directory.
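Step 2 above is manual; a small helper can automate pulling the first fenced code block out of `completion.txt` into `Solution.java`. This is a sketch, assuming the model wraps its answer in triple-backtick fences (the file names match the layout above; the helper names are hypothetical):

```python
import re
from pathlib import Path

def extract_code(completion: str) -> str:
    """Return the contents of the first ``` fenced block, or the raw
    text if no fence is found (the model may answer with bare code)."""
    match = re.search(r"```(?:\w+)?\n(.*?)```", completion, re.DOTALL)
    return match.group(1) if match else completion

def save_solution(prompt_path: str) -> Path:
    """Read completion.txt from prompt_path and write Solution.java beside it."""
    base = Path(prompt_path)
    code = extract_code((base / "completion.txt").read_text())
    out = base / "Solution.java"
    out.write_text(code)
    return out

# Example: save_solution("codebase/example")
```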
For each subsequent iteration, the following files are required:

```
prompt_path
│   problem.txt - the problem used to prompt the LLM
│   infer_output.txt - the errors detected by Infer
│   Solution.java - the Java completion from the previous iteration
```
Steps:

1. Run the Python script in iter mode. The argument `--prompt_path` should be the directory containing your `problem.txt`:

   ```
   python main.py --prompt_path codebase/example --mode iter
   ```

   After this command the script creates a `completion.txt` file in the prompt_path directory containing the LLM's completion.
2. Extract the code from `completion.txt` and save it into a `Solution.java` file in the prompt_path directory so that Infer can analyze it.
3. Run Infer on `Solution.java`:

   ```
   infer run -- javac codebase/example/Solution.java
   ```
4. If Infer reports errors, organize them and put them into a file called `infer_output.txt` in the prompt_path directory, then repeat these steps.
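The "organize the errors" step can likewise be scripted. A sketch, assuming Infer's default JSON report at `infer-out/report.json` with `bug_type`, `file`, `line`, and `qualifier` fields per issue (verify the format for your Infer version):

```python
import json
from pathlib import Path

def write_infer_output(report_path: str, prompt_path: str) -> int:
    """Summarize Infer's JSON report into infer_output.txt in
    prompt_path; return the number of issues written."""
    issues = json.loads(Path(report_path).read_text())
    lines = [
        f"{i['bug_type']} at {i['file']}:{i['line']}: {i['qualifier']}"
        for i in issues
    ]
    out = Path(prompt_path) / "infer_output.txt"
    out.write_text("\n".join(lines) + ("\n" if lines else ""))
    return len(lines)

# Example: write_infer_output("infer-out/report.json", "codebase/example")
```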