
ollama4j's Issues

`evalCount` in response from ollama cannot be parsed

I sometimes get the following error:

UnrecognizedPropertyException: Unrecognized field "eval_count" (class io.github.amithkoujalgi.ollama4j.core.models.OllamaResponseModel), not marked as ignorable (11 known properties: "response", "done", "evalCount", "eval_duration", "model", "created_at", "prompt_eval_duration", "load_duration", "context", "prompt_eval_count", "total_duration"])
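
For context, Jackson throws UnrecognizedPropertyException when the wire format carries a field the response class does not declare; here Ollama sends snake_case "eval_count" while the model class only knows camelCase "evalCount". A minimal sketch of the two standard Jackson remedies (illustrative only, not necessarily how ollama4j resolved it):

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;

// Remedy 1: tolerate unknown fields, so newly added Ollama fields don't break parsing.
@JsonIgnoreProperties(ignoreUnknown = true)
public class OllamaResponseModel {
    // Remedy 2: map the snake_case wire name onto the camelCase Java property.
    @JsonProperty("eval_count")
    private Integer evalCount;
    // ... remaining fields elided
}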

Unrecognized field "done_reason"

After the latest update of the Ollama client (version 6.2.2, May 12, 2024), I encounter the following exception:

ERROR io.github.amithkoujalgi.ollama4j.core.models.request.OllamaChatEndpointCaller - Error parsing the Ollama chat response!
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "done_reason" (class io.github.amithkoujalgi.ollama4j.core.models.chat.OllamaChatResponseModel), not marked as ignorable (12 known properties: "done", "message", "error", "model", "created_at", "prompt_eval_duration", "load_duration", "context", "eval_duration", "eval_count", "total_duration", "prompt_eval_count"])
at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 1, column: 142] (through reference chain: io.github.amithkoujalgi.ollama4j.core.models.chat.OllamaChatResponseModel["done_reason"])
at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:1153)
at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:2241)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1793)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1771)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:316)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:342)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4905)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3848)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3816)
at io.github.amithkoujalgi.ollama4j.core.models.request.OllamaChatEndpointCaller.parseResponseAndAddToBuffer(OllamaChatEndpointCaller.java:40)
at io.github.amithkoujalgi.ollama4j.core.models.request.OllamaEndpointCaller.callSync(OllamaEndpointCaller.java:99)
at io.github.amithkoujalgi.ollama4j.core.OllamaAPI.chat(OllamaAPI.java:521)
at io.github.amithkoujalgi.ollama4j.core.OllamaAPI.chat(OllamaAPI.java:498)

Certain requests fail with a 400 Bad Request

It appears that certain requests generated by the ask() method trigger a 400 Bad Request on the Ollama side.

Trying the same request manually works, though (i.e., copying the JSON generated by ollamaRequestModel.toString() and posting it by hand, as sketched below).
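
For anyone reproducing this: the manual check described above amounts to posting the generated JSON straight at Ollama's generate endpoint, e.g. (placeholder body, not the actual failing payload):

curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "How many moons does earth have?",
  "stream": false
}'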

Unrecognized field "expires_at"

Ran into this error when using ollama4j 1.0.63 against Ollama 0.1.38 while calling listModels():

Exception: Unrecognized field "expires_at" (class io.github.amithkoujalgi.ollama4j.core.models.Model), not marked as ignorable (6 known properties: "size", "details", "digest", "model", "name", "modified_at"])
at [Source: REDACTED (StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION disabled); line: 1, column: 370] (through reference chain: io.github.amithkoujalgi.ollama4j.core.models.ListModelsResponse["models"]->java.util.ArrayList[0]->io.github.amithkoujalgi.ollama4j.core.models.Model["expires_at"])

It worked fine against 0.1.34, and bumping ollama4j to 1.0.72 resolved the issue.
Perhaps the error message could be improved in case Ollama changes the response format again in the future.
Closing, and just adding this for reference.

Thanks for an excellent wrapper library btw :)

Change your logo to Duke instead

Friendly suggestion: it would be better to use the Java Duke logo instead of the trademarked Java logo. I speak from experience :) Otherwise a very interesting project, keep it up!

Extend generate API Requests by advanced parameters

In addition to the /api/chat endpoint, the system prompt parameter (and other parameters that override model behaviours) can also be provided to requests at /api/generate.

Thus, these should also be made available through the ollama4j API.

See:
Advanced parameters (optional):

format: the format to return a response in. Currently the only accepted value is json

system: the system message (overrides what is defined in the Modelfile)

template: the prompt template to use (overrides what is defined in the Modelfile)

context: the context parameter returned from a previous request to /generate, this can be used to keep a short conversational memory

raw: if true no formatting will be applied to the prompt. You may choose to use the raw parameter if you are specifying a full templated prompt in your request to the API

keep_alive: controls how long the model will stay loaded into memory following the request (default: 5m)

Originally posted by @AgentSchmecker in #20 (comment)
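
For illustration, a /api/generate request carrying several of these advanced parameters (e.g. system, raw, keep_alive) might look like the following; the field names follow the Ollama API docs quoted above, and all values are placeholders:

{
  "model" : "mistral",
  "prompt" : "How many moons does earth have?",
  "system" : "You answer in one short sentence.",
  "raw" : false,
  "keep_alive" : "5m",
  "stream" : false
}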

Logging backend should not be a transitive dependency

Ollama4j uses logback as its logging backend and exposes it to consumers as a transitive dependency.

Libraries should not have transitive dependencies on logging backends.

If you need the backend during testing, it is best to put the dependency in test scope, as sketched below.

In general, the application which uses the ollama4j library should define a logging backend for ollama4j to use.
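
For example, a minimal sketch of scoping the backend to tests in the library's pom.xml (standard logback coordinates; the version number is purely illustrative):

<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.4.14</version>
  <scope>test</scope>
</dependency>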

com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `java.time.LocalDateTime` from String "2024-05-08T20:06:14.620804598-07:00": Failed to deserialize java.time.LocalDateTime: (java.time.format.DateTimeParseException) Text '2024-05-08T20:06:14.620804598-07:00' could not be parsed, unparsed text found at index 29

Looks like there's a regression in the latest version. See the deserialization exception that occurs when OllamaAPI.listModels is called in version 1.0.70. Note that this exception isn't thrown in 1.0.65.

Exception in thread "main" com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `java.time.LocalDateTime` from String "2024-05-08T20:06:14.620804598-07:00": Failed to deserialize java.time.LocalDateTime: (java.time.format.DateTimeParseException) Text '2024-05-08T20:06:14.620804598-07:00' could not be parsed, unparsed text found at index 29
 at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 74] (through reference chain: io.github.amithkoujalgi.ollama4j.core.models.ListModelsResponse["models"]->java.util.ArrayList[0]->io.github.amithkoujalgi.ollama4j.core.models.Model["modified_at"])
	at com.fasterxml.jackson.databind.exc.InvalidFormatException.from(InvalidFormatException.java:67)
	at com.fasterxml.jackson.databind.DeserializationContext.weirdStringException(DeserializationContext.java:1958)
	at com.fasterxml.jackson.databind.DeserializationContext.handleWeirdStringValue(DeserializationContext.java:1245)
	at com.fasterxml.jackson.datatype.jsr310.deser.JSR310DeserializerBase._handleDateTimeException(JSR310DeserializerBase.java:176)
	at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer._fromString(LocalDateTimeDeserializer.java:216)
	at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:114)
	at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:41)
	at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:310)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer._deserializeFromArray(CollectionDeserializer.java:361)
	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:246)
	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:30)
	at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:310)
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:177)
	at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:342)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4905)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3848)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3816)
	at io.github.amithkoujalgi.ollama4j.core.OllamaAPI.listModels(OllamaAPI.java:137)
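
The root cause is visible in the timestamp itself: it carries a zone offset ("-07:00"), which java.time.LocalDateTime cannot represent, so parsing consumes the 29-character local part and chokes on the remainder. A self-contained illustration:

import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.format.DateTimeParseException;

public class TimestampDemo {
    public static void main(String[] args) {
        String ts = "2024-05-08T20:06:14.620804598-07:00";
        // OffsetDateTime models the "-07:00" offset, so this succeeds.
        System.out.println(OffsetDateTime.parse(ts));
        // LocalDateTime has no offset component; the leftover "-07:00" fails.
        try {
            LocalDateTime.parse(ts);
        } catch (DateTimeParseException e) {
            System.out.println(e.getMessage()); // ... unparsed text found at index 29
        }
    }
}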

Empty Response when ollama is very slow

Issue

I get an empty response from the library, but my tcpflow logs show that the response does come later.

{
    "role" : "assistant",
    "content" : "",
    "images" : null
  }

Details

I have an old laptop with only a dual-core i5 at 2.7 GHz and am running llama3 on Ollama.
I created a simple app to test ollama4j with llama3, but I am getting empty responses back, even though I set a huge request timeout.

Here is my code:

String host = "http://localhost:11434/";
String model = "llama3";
OllamaAPI ollamaAPI = new OllamaAPI(host);
ollamaAPI.setRequestTimeoutSeconds(600000);
ollamaAPI.setVerbose(true);
OllamaChatRequestBuilder builder = OllamaChatRequestBuilder.getInstance(model);
Options options =
    new OptionsBuilder()
        .setTemperature(0.2f)
        .setNumCtx(example.getModel().getCompletionLength())
        .setTopK(1)
        .setTopP(0.9F)
        .build();
OllamaChatRequestModel requestModel = builder.withMessage(OllamaChatMessageRole.SYSTEM, example.getSystemPrompt())
    .withMessage(OllamaChatMessageRole.USER, "What can you help me with?")
    .withOptions(options)
    .build();
System.out.println("Ollama request: " + requestModel.toString());
OllamaChatResult chatResult = ollamaAPI.chat(requestModel);
System.out.println("Ollama answer: " + chatResult.getHttpStatusCode() + " in seconds: " + chatResult.getResponseTime() + ":\n" + chatResult.getResponse());

And my logs show this:

Ollama request: {
  "model" : "llama3",
  "options" : {
    "top_p" : 0.9,
    "top_k" : 1,
    "temperature" : 0.2,
    "num_ctx" : 1024
  },
  "stream" : false,
  "messages" : [ {
    "role" : "system",
    "content" : "You are a helpful customer service representative for a credit card company who helps answer customer questions about their past transactions and spending history. Today's date is January 18th, 2024. You provide precise answers and use functions to look up information...",
    "images" : null
  }, {
    "role" : "user",
    "content" : "What can you help me with?",
    "images" : null
  } ]
}
Ollama answer: 200 in seconds: 108372:

If I look into the message history, I basically see

{
    "role" : "assistant",
    "content" : "",
    "images" : null
  }

But if I check my tcpflow logs, I can see a response:

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Tue, 14 May 2024 10:29:15 GMT
Content-Length: 801

{
   "model":"llama3",
   "created_at":"2024-05-14T10:29:15.284946Z",
   "message":{
      "role":"assistant",
      "content":"I'm happy to assist you with any questions or concerns you may have about your credit card account. I can help you:\n\n* Review your transaction history and spending habits\n* Check your available credit limit and current balance\n* Provide information on rewards and benefits associated with your card\n* Help you track your spending by category (e.g., groceries, entertainment, etc.)\n* Offer suggestions for managing your debt or improving your financial situation\n\nWhat specific area would you like me to help you with today?"
   },
   "done_reason":"stop",
   "done":true,
   "total_duration":54914791839,
   "load_duration":15113419,
   "prompt_eval_duration":804784000,
   "eval_count":101,
   "eval_duration":54082687000
}

Trying the streaming API works, but not the synchronous one.
Do you have any idea what the problem is?

Basic Auth or Authorization Header Bearer Token

Do you have plans to support

  • Either Basic Auth
  • Or an Authorization Header Bearer Token

?

I understand Ollama itself does not support this, but when using a reverse proxy in front of Ollama, this would help a lot security-wise :-)
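
For reference, neither scheme requires anything exotic on the wire: Basic Auth is just a base64-encoded "user:password" in the Authorization header. A minimal sketch with the JDK's HttpClient types, assuming a placeholder reverse-proxy host and credentials (a Bearer token would simply replace the header value):

import java.net.URI;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Placeholder credentials and proxy URL.
String credentials = Base64.getEncoder()
        .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://ollama-proxy.example.com/api/tags"))
        .header("Authorization", "Basic " + credentials) // or "Bearer " + token
        .GET()
        .build();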

Thanks

Michael

Set options / temperature

According to the Ollama documentation, one can set the temperature using options:

https://github.com/jmorganca/ollama/blob/main/docs/api.md#parameters
https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values

{ "model": "mistral", "stream":false, "prompt":"How many moons does earth have?", "options":{"temperature":0} }

IIUC one cannot set the temperature / options yet using ollama4j.

I would like to suggest introducing another "ask" method where one can set options as well:

Options options = new Options();
options.setTemperature(0.7);
ollamaAPI.ask(String, String, Options)

WDYT?

Thanks

Michael

Update Code Due to Changes in generate Method of PromptBuilder

Description

The generate method used with PromptBuilder has changed in recent updates. Consequently, the code provided in the example no longer compiles.

Previously, the method was called as follows:

OllamaResult response = ollamaAPI.generate(model, promptBuilder.build());

However, in the latest version, additional options need to be passed to the generate method. Therefore, the code needs to be modified as follows:

OllamaResult response = ollamaAPI.generate(model, promptBuilder.build(), new Options());

Updating the code to reflect this change should prevent any further compatibility issues.
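
In context, the migrated call might look like the following sketch, reusing the OptionsBuilder shown in the chat example earlier on this page (the temperature value is a placeholder):

// Pass an explicit Options instance to the updated generate signature.
OllamaResult response = ollamaAPI.generate(
    model,
    promptBuilder.build(),
    new OptionsBuilder().setTemperature(0.8f).build());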
