
Comments (29)

ellnix commented on June 2, 2024

This should not be difficult to implement, but I can predict a data race occurring:

Say I have document "constitution" with field "first_line":

  • User changes "first_line" to "All puppies are created equal", which times out and will be retried in 30 seconds
  • User changes "first_line" to "All men are created equal" and it works
  • Background job changes "first_line" to "All puppies are created equal"

Since retry_on seems to be a class method on ActiveJob::Base, you should be able to configure it yourself:

# config/initializers/meilisearch.rb
MeiliSearch::Rails::MSJob.retry_on MeiliSearch::TimeoutError, wait: 30.seconds, attempts: :unlimited


drale2k commented on June 2, 2024

That's a very valid issue. Is there any guidance on what to do if Meilisearch Cloud is unavailable or returning timeouts? It would be really helpful to have a built-in retry mechanism that discards updates that are out of date (as in your example). Otherwise it's basically retry and pray.
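Not something meilisearch-rails ships today, but a minimal sketch of the staleness-safe behaviour described above, assuming a plain ActiveJob (ReindexRecordJob, its arguments, and the index! call are illustrative, not the gem's API): because the job looks the record up again at perform time, a delayed retry always indexes the record's current state instead of replaying a stale value.

# Hypothetical sketch, not part of meilisearch-rails: re-read the record when
# the job runs and index whatever is in the database *now*, so a retried job
# cannot overwrite a newer value with a stale one.
class ReindexRecordJob < ApplicationJob
  retry_on MeiliSearch::TimeoutError, wait: 30.seconds, attempts: 5

  def perform(model_name, id)
    record = model_name.constantize.find_by(id: id)
    return if record.nil? # deleted in the meantime; nothing to index

    record.index! # meilisearch-rails instance indexing method (ms_index! in prefixed setups)
  end
end

# Usage, e.g. from an after_commit callback:
#   ReindexRecordJob.perform_later("Episode", episode.id)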


ellnix commented on June 2, 2024

I did discuss it with Bruno in the past and he agreed that it would be a good feature to be able to opt into: #187 (comment)

However I prioritized refactoring the codebase before adding new features and unfortunately haven't had time to finish it.


drale2k commented on June 2, 2024

I keep running into timeout issues constantly, this time when trying to reindex a model with 7 records.

Is there anything I can do about this? This is really breaking for me.


Episode.reindex!
(irb):1:in `<main>': The request was not processed in the expected time. Failed to open TCP connection to ms-d***.fra.meilisearch.io:443 (execution expired) (MeiliSearch::TimeoutError)
/Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1603:in `initialize': Failed to open TCP connection to ms-d***.fra.meilisearch.io:443 (execution expired) (Net::OpenTimeout)
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1603:in `open'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1603:in `block in connect'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/timeout.rb:186:in `block in timeout'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/timeout.rb:193:in `timeout'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1601:in `connect'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1580:in `do_start'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:1569:in `start'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/http.rb:2297:in `request'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/httparty-0.21.0/lib/httparty/request.rb:156:in `perform'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/httparty-0.21.0/lib/httparty.rb:612:in `perform_request'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/httparty-0.21.0/lib/httparty.rb:526:in `get'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/meilisearch-0.26.0/lib/meilisearch/http_request.rb:27:in `block in http_get'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/meilisearch-0.26.0/lib/meilisearch/http_request.rb:109:in `send_request'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/meilisearch-0.26.0/lib/meilisearch/http_request.rb:26:in `http_get'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/gems/meilisearch-0.26.0/lib/meilisearch/index.rb:286:in `settings'
	from /Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/gems/3.3.0/bundler/gems/meilisearch-rails-52526a735c73/lib/meilisearch-rails.rb:308:in `block in settings'

It worked when running it a second time.


brunoocasali commented on June 2, 2024

Hey @drale2k, can you run this command ulimit -a and send me the results?


brunoocasali commented on June 2, 2024

Also, can you try setting a different value on the timeout:

MeiliSearch::Rails.configuration = {
  meilisearch_url: '...',
  meilisearch_api_key: '...',
  timeout: 5, # default is: 1
  max_retries: 3 # default is: 0
}


drale2k commented on June 2, 2024

I am on a Macbook Pro M1

ulimit -a
-t: cpu time (seconds)              unlimited
-f: file size (blocks)              unlimited
-d: data seg size (kbytes)          unlimited
-s: stack size (kbytes)             8176
-c: core file size (blocks)         0
-v: address space (kbytes)          unlimited
-l: locked-in-memory size (kbytes)  unlimited
-u: processes                       10666
-n: file descriptors                256

I have been using the recommended settings you posted above for 2 days now, and just now, as I was writing this comment, I executed a couple of reindexes on my model, which has about 7 records.

The first 4 ran fine; the 5th timed out again.

 Episode.reindex!
(irb):10:in `<main>': The request was not processed in the expected time. Net::ReadTimeout with #<TCPSocket:(closed)> (MeiliSearch::TimeoutError)
/Users/drale2k/.rbenv/versions/3.3.1/lib/ruby/3.3.0/net/protocol.rb:229:in `rbuf_fill': Net::ReadTimeout with #<TCPSocket:(closed)> (Net::ReadTimeout)

Trying to replicate, I ran 20 reindexes twice. The last request (the 40th) timed out again:

irb(main):062* 20.times do
irb(main):063*   Episode.reindex!
irb(main):064> end
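As a stopgap while there is no built-in retry, the reindex call can be wrapped in a small retry helper. This is only a sketch: reindex_with_retries is a hypothetical helper, not part of meilisearch-rails, and it assumes repeating a full reindex is harmless because each run pushes the current database state.

# Hypothetical stopgap, not part of meilisearch-rails: retry a full reindex
# with a growing wait when the request to Meilisearch Cloud times out.
def reindex_with_retries(model, attempts: 3, base_wait: 2)
  tries = 0
  begin
    model.reindex!
  rescue MeiliSearch::TimeoutError
    tries += 1
    raise if tries >= attempts
    sleep(base_wait * tries) # wait 2s, then 4s, ... before trying again
    retry
  end
end

reindex_with_retries(Episode)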


brunoocasali commented on June 2, 2024

Thanks for the ulimit info, and I think I have an idea why I can't reproduce your case:

# this is my Macbook Pro M1:
ulimit -a
-t: cpu time (seconds)              unlimited
-f: file size (blocks)              unlimited
-d: data seg size (kbytes)          unlimited
-s: stack size (kbytes)             8176
-c: core file size (blocks)         0
-v: address space (kbytes)          unlimited
-l: locked-in-memory size (kbytes)  unlimited
-u: processes                       5333
-n: file descriptors                2560

Each network connection uses a file descriptor (I don't know if that's specific to Ruby's Net::HTTP). When you make multiple requests simultaneously or in quick succession (like in a multi-threaded scenario), you may exhaust the file descriptors allowed for your process.

Also, I can see you have twice as many user processes as me, which could mean other processes are already consuming part of your file descriptor limit.

So, can you try updating this limit?

ulimit -n 2048 # Sets the limit to 2048 file descriptors
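To confirm whether the descriptor limit is really what is being hit, the limit and current usage can also be inspected from inside the Ruby process. This is a sketch using only Ruby's standard library (Process.getrlimit / Process.setrlimit); it is not something meilisearch-rails does for you.

# Check the file-descriptor limit and how many descriptors this process
# currently has open (run in irb or a Rails console).
soft, hard = Process.getrlimit(Process::RLIMIT_NOFILE)
open_now   = Dir.glob("/dev/fd/*").size # /dev/fd lists open descriptors on macOS and Linux
puts "fd limit: soft=#{soft} hard=#{hard}, currently open: #{open_now}"

# The soft limit can also be raised for this process only, up to the hard
# limit, without touching the shell's ulimit:
Process.setrlimit(Process::RLIMIT_NOFILE, [2048, hard].min, hard)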


brunoocasali commented on June 2, 2024

Hi @drale2k, did you have the time to make those changes?


drale2k commented on June 2, 2024

Hi, yes, I changed it since then but had to move on to other features and did not test at high enough volume. I am about to deploy to production next week and will report on how it goes there. Sorry for the late reply.


brunoocasali commented on June 2, 2024

No problem, @drale2k. Let me know when you have any news! Good luck! 🙏


