vrcmarcos / elasticmock
Python Elasticsearch Mock for test purposes
Home Page: https://pypi.python.org/pypi/ElasticMock
License: MIT License
Hi,
Many query parameters don't work because they are not handled in the implementation. Please consider supporting these parameters, especially in the "search" method!
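One way a fake client can cope until each parameter is implemented is to accept query parameters it does not yet understand instead of rejecting them. A minimal sketch of that pattern (`FakeSearchClient` is a hypothetical stand-in for elasticmock's fake client, shown only to illustrate the idea):

```python
# Sketch of a fake search client that tolerates unimplemented query
# parameters. FakeSearchClient is a hypothetical stand-in, not
# elasticmock's actual class.
class FakeSearchClient:
    def __init__(self):
        self._docs = []

    def index(self, body):
        self._docs.append(body)

    def search(self, body=None, size=10, **ignored_params):
        # Honor `size`; silently accept any other query params
        # (sort, from_, ...) instead of raising on them.
        hits = [{"_source": doc} for doc in self._docs[:size]]
        return {"hits": {"total": len(self._docs), "hits": hits}}
```

With this shape, a call like `search(size=2, sort='name:asc')` still returns two hits instead of failing on the unknown `sort` argument.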
When using the mock's index method, we have to pass the doc type explicitly, even though it defaults to _doc in Elasticsearch 7.x.
The mock's index method should not require doc_type as a mandatory positional argument.
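The requested behavior would look roughly like this (a sketch of the desired signature, not the current implementation; the `_doc` default mirrors Elasticsearch 7.x):

```python
# Hypothetical signature sketch: doc_type becomes optional and defaults
# to "_doc", matching Elasticsearch 7.x behavior.
def index(index, body, doc_type="_doc", id=None, **params):
    # A real implementation would store `body`; here we only echo the
    # metadata to show the defaulting.
    return {"_index": index, "_type": doc_type, "_id": id, "result": "created"}
```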
I used this library to create mock data.
Then I tried to search that data using the search method of the elasticsearch library.
But it didn't find anything.
Is this expected?
Thanks
I'm using elasticsearch 6.1.1. I'm aware that elasticmock needs elasticsearch==1.9.0,
but I tried to use it anyway. It doesn't seem to mock anything: my tests are still instantiating the real Elasticsearch client.
I suspect the patch targets 'elasticsearch.Elasticsearch', but to use the client I still import it like:
from elasticsearch import Elasticsearch
I'm using Python 3.6.
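One classic pitfall worth ruling out here (an assumption, not a confirmed diagnosis): `mock.patch` replaces a name on a module, so a class bound with `from elasticsearch import Elasticsearch` before the patch keeps pointing at the real implementation. A self-contained sketch with a stand-in module:

```python
import sys
import types
from unittest import mock

# Build a stand-in for the `elasticsearch` package.
fakelib = types.ModuleType("fakelib")

class Client:
    kind = "real"

fakelib.Client = Client
sys.modules["fakelib"] = fakelib

# Consumer code binds the name locally at import time,
# like `from elasticsearch import Elasticsearch`.
from fakelib import Client as LocalClient

class MockClient:
    kind = "mocked"

with mock.patch("fakelib.Client", MockClient):
    via_module = sys.modules["fakelib"].Client().kind  # sees the patch
    via_local = LocalClient().kind                     # still the original
```

The rule of thumb is "patch where the name is looked up, not where it is defined"; whether that is what bites here depends on how your code resolves the client class at call time.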
dateutil.parser.isoparse is meant to transform ISO 8601 datetime strings into datetime objects. However, FakeQueryCondition uses it to parse a value that is already a datetime object, which is both redundant and erroneous.
See https://github.com/vrcmarcos/elasticmock/blob/master/elasticmock/fake_elasticsearch.py#L179-L180
Proposal: wrap the call in a try/except. If isoparse() raises, skip parsing and continue.
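A sketch of the proposed guard, using the stdlib `datetime.fromisoformat` in place of `dateutil.parser.isoparse` so the example is self-contained (the helper name is made up for illustration):

```python
from datetime import datetime

def coerce_to_datetime(value):
    """Parse only when the value is a string; pass datetimes through.

    Illustrative helper, not elasticmock's actual code: the try/except
    makes the comparison path tolerant of values that are already
    datetime objects or are not dates at all.
    """
    if isinstance(value, datetime):
        return value  # already parsed; isoparse would choke on this
    try:
        return datetime.fromisoformat(value)  # stand-in for isoparse
    except (TypeError, ValueError):
        return value  # not a date string; leave it for other comparisons
```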
Hello,
Thanks for the work you've done.
I'm using the elasticsearch, elasticsearch-dsl, and django-elasticsearch-dsl packages.
I have documents defined. Documents are updated with signals.
Any idea why your decorator does not work?
See the code listing and trace below.
Thanks in advance.
from django.db import models

class Car(models.Model):
    name = models.CharField(max_length=255)  # max_length assumed; Django requires it
    color = models.CharField(max_length=255)
    description = models.TextField()
    type = models.IntegerField(choices=[
        (1, "Sedan"),
        (2, "Truck"),
        (4, "SUV"),
    ])
from django_elasticsearch_dsl import DocType, Index

from .models import Car

# Name of the Elasticsearch index
car = Index('cars')

# See the Elasticsearch Indices API reference for available settings
car.settings(
    number_of_shards=1,
    number_of_replicas=0
)

@car.doc_type
class CarDocument(DocType):
    class Meta:
        model = Car  # The model associated with this DocType

        fields = [
            'name',
            'color',
            'description',
            'type',
        ]  # the model fields to index in Elasticsearch
from elasticmock import elasticmock
from django.test import TransactionTestCase

from .models import Car

class CarTestCase(TransactionTestCase):

    @elasticmock
    def setUp(self):
        self.car = Car(name='Name', color='Blue', description='Description', type=1)
        self.car.save()

    def test_car(self):
        pass
=================================== FAILURES ===================================
__________________ CarTestCase.test_car ___________________
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
def _new_conn(self):
""" Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw['source_address'] = self.source_address
if self.socket_options:
extra_kw['socket_options'] = self.socket_options
try:
conn = connection.create_connection(
> (self.host, self.port), self.timeout, **extra_kw)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('localhost', 9200), timeout = 10, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None, socket_options=None):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith('['):
host = host.strip('[]')
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
af, socktype, proto, canonname, sa = res
sock = None
try:
sock = socket.socket(af, socktype, proto)
# If provided, set socket level options before connecting.
_set_socket_options(sock, socket_options)
if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
sock.settimeout(timeout)
if source_address:
sock.bind(source_address)
sock.connect(sa)
return sock
except socket.error as e:
err = e
if sock is not None:
sock.close()
sock = None
if err is not None:
> raise err
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/connection.py:83:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('localhost', 9200), timeout = 10, source_address = None
socket_options = [(6, 1, 1)]
def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
source_address=None, socket_options=None):
"""Connect to *address* and return the socket object.
Convenience function. Connect to *address* (a 2-tuple ``(host,
port)``) and return the socket object. Passing the optional
*timeout* parameter will set the timeout on the socket instance
before attempting to connect. If no *timeout* is supplied, the
global default timeout setting returned by :func:`getdefaulttimeout`
is used. If *source_address* is set it must be a tuple of (host, port)
for the socket to bind as a source address before making the connection.
An host of '' or port 0 tells the OS to use the default.
"""
host, port = address
if host.startswith('['):
host = host.strip('[]')
err = None
# Using the value from allowed_gai_family() in the context of getaddrinfo lets
# us select whether to work with IPv4 DNS records, IPv6 records, or both.
# The original create_connection function always returns all records.
family = allowed_gai_family()
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
af, socktype, proto, canonname, sa = res
sock = None
try:
sock = socket.socket(af, socktype, proto)
# If provided, set socket level options before connecting.
_set_socket_options(sock, socket_options)
if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
sock.settimeout(timeout)
if source_address:
sock.bind(source_address)
> sock.connect(sa)
E ConnectionRefusedError: [Errno 111] Connection refused
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <Urllib3HttpConnection: http://localhost:9200>, method = 'POST'
url = '/_bulk?refresh=true', params = {'refresh': b'true'}
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
timeout = None, ignore = ()
def perform_request(self, method, url, params=None, body=None, timeout=None, ignore=()):
url = self.url_prefix + url
if params:
url = '%s?%s' % (url, urlencode(params))
full_url = self.host + url
start = time.time()
try:
kw = {}
if timeout:
kw['timeout'] = timeout
# in python2 we need to make sure the url and method are not
# unicode. Otherwise the body will be decoded into unicode too and
# that will fail (#133, #201).
if not isinstance(url, str):
url = url.encode('utf-8')
if not isinstance(method, str):
method = method.encode('utf-8')
> response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/connection/http_urllib3.py:78:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fa304c3b390>
method = 'POST', url = '/_bulk?refresh=true'
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
headers = {'connection': 'keep-alive'}
retries = Retry(total=False, connect=None, read=None, redirect=0)
redirect = True, assert_same_host = True
timeout = <object object at 0x7fa30de2e400>, pool_timeout = None
release_conn = True, chunked = False, body_pos = None, response_kw = {}
conn = None, release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7fa304c3bcf8>
is_new_proxy_conn = False
def urlopen(self, method, url, body=None, headers=None, retries=None,
redirect=True, assert_same_host=True, timeout=_Default,
pool_timeout=None, release_conn=None, chunked=False,
body_pos=None, **response_kw):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param body:
Data to send in the request body (useful for creating
POST requests, see HTTPConnectionPool.post_url for
more convenience).
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When False, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get('preload_content', True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/shazow/urllib3/issues/651>
release_this_conn = release_conn
# Merge the proxy headers. Only do this in HTTP. We have to copy the
# headers dict so we can safely change it without those changes being
# reflected in anyone else's copy.
if self.scheme == 'http':
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
if is_new_proxy_conn:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(conn, method, url,
timeout=timeout_obj,
body=body, headers=headers,
chunked=chunked)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw['request_method'] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw)
# Everything went great!
clean_exit = True
except queue.Empty:
# Timed out by queue.
raise EmptyPoolError(self, "No pool connections are available.")
except (BaseSSLError, CertificateError) as e:
# Close the connection. If a connection is reused on which there
# was a Certificate error, the next request will certainly raise
# another Certificate error.
clean_exit = False
raise SSLError(e)
except SSLError:
# Treat SSLError separately from BaseSSLError to preserve
# traceback.
clean_exit = False
raise
except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
# Discard the connection for these exceptions. It will be
# be replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError('Cannot connect to proxy.', e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError('Connection aborted.', e)
retries = retries.increment(method, url, error=e, _pool=self,
> _stacktrace=sys.exc_info()[2])
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py:649:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=False, connect=None, read=None, redirect=0), method = 'POST'
url = '/_bulk?refresh=true', response = None
error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>: Failed to establish a new connection: [Errno 111] Connection refused',)
_pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fa304c3b390>
_stacktrace = <traceback object at 0x7fa304c3df48>
def increment(self, method=None, url=None, response=None, error=None,
_pool=None, _stacktrace=None):
""" Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
> raise six.reraise(type(error), error, _stacktrace)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/retry.py:324:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tp = <class 'urllib3.exceptions.NewConnectionError'>
value = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>: Failed to establish a new connection: [Errno 111] Connection refused',)
tb = <traceback object at 0x7fa304c3df48>
def reraise(tp, value, tb=None):
if value is None:
value = tp()
if value.__traceback__ is not tb:
raise value.with_traceback(tb)
> raise value
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/packages/six.py:686:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fa304c3b390>
method = 'POST', url = '/_bulk?refresh=true'
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
headers = {'connection': 'keep-alive'}
retries = Retry(total=False, connect=None, read=None, redirect=0)
redirect = True, assert_same_host = True
timeout = <object object at 0x7fa30de2e400>, pool_timeout = None
release_conn = True, chunked = False, body_pos = None, response_kw = {}
conn = None, release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7fa304c3bcf8>
is_new_proxy_conn = False
def urlopen(self, method, url, body=None, headers=None, retries=None,
redirect=True, assert_same_host=True, timeout=_Default,
pool_timeout=None, release_conn=None, chunked=False,
body_pos=None, **response_kw):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param body:
Data to send in the request body (useful for creating
POST requests, see HTTPConnectionPool.post_url for
more convenience).
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When False, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get('preload_content', True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/shazow/urllib3/issues/651>
release_this_conn = release_conn
# Merge the proxy headers. Only do this in HTTP. We have to copy the
# headers dict so we can safely change it without those changes being
# reflected in anyone else's copy.
if self.scheme == 'http':
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
if is_new_proxy_conn:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(conn, method, url,
timeout=timeout_obj,
body=body, headers=headers,
> chunked=chunked)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fa304c3b390>
conn = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
method = 'POST', url = '/_bulk?refresh=true'
timeout = <urllib3.util.timeout.Timeout object at 0x7fa304c3bcf8>
chunked = False
httplib_request_kw = {'body': b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n', 'headers': {'connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7fa304c3bcc0>
def _make_request(self, conn, method, url, timeout=_Default, chunked=False,
**httplib_request_kw):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls httplib.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
> conn.request(method, url, **httplib_request_kw)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py:356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
method = 'POST', url = '/_bulk?refresh=true'
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
headers = {'connection': 'keep-alive'}
def request(self, method, url, body=None, headers={}):
"""Send a complete request to the server."""
> self._send_request(method, url, body, headers)
/usr/lib/python3.5/http/client.py:1106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
method = 'POST', url = '/_bulk?refresh=true'
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
headers = {'connection': 'keep-alive'}
def _send_request(self, method, url, body, headers):
# Honor explicitly requested Host: and Accept-Encoding: headers.
header_names = dict.fromkeys([k.lower() for k in headers])
skips = {}
if 'host' in header_names:
skips['skip_host'] = 1
if 'accept-encoding' in header_names:
skips['skip_accept_encoding'] = 1
self.putrequest(method, url, **skips)
if 'content-length' not in header_names:
self._set_content_length(body, method)
for hdr, value in headers.items():
self.putheader(hdr, value)
if isinstance(body, str):
# RFC 2616 Section 3.7.1 says that text default has a
# default charset of iso-8859-1.
body = _encode(body, 'body')
> self.endheaders(body)
/usr/lib/python3.5/http/client.py:1151:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
message_body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
def endheaders(self, message_body=None):
"""Indicate that the last header line has been sent to the server.
This method sends the request to the server. The optional message_body
argument can be used to pass a message body associated with the
request. The message body will be sent in the same packet as the
message headers if it is a string, otherwise it is sent as a separate
packet.
"""
if self.__state == _CS_REQ_STARTED:
self.__state = _CS_REQ_SENT
else:
raise CannotSendHeader()
> self._send_output(message_body)
/usr/lib/python3.5/http/client.py:1102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
message_body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"content": "my first car", "summary": "this ...er": "VPRO", "state": "invitation_sent", "archive": false, "genre": "sport", "channel": "NPO1", "title": "my car"}\n'
def _send_output(self, message_body=None):
"""Send the currently buffered request and clear the buffer.
Appends an extra \\r\\n to the buffer.
A message_body may be specified, to be appended to the request.
"""
self._buffer.extend((b"", b""))
msg = b"\r\n".join(self._buffer)
del self._buffer[:]
> self.send(msg)
/usr/lib/python3.5/http/client.py:934:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
data = b'POST /_bulk?refresh=true HTTP/1.1\r\nHost: localhost:9200\r\nAccept-Encoding: identity\r\nContent-Length: 293\r\nconnection: keep-alive\r\n\r\n'
def send(self, data):
"""Send `data' to the server.
``data`` can be a string object, a bytes object, an array object, a
file-like object that supports a .read() method, or an iterable object.
"""
if self.sock is None:
if self.auto_open:
> self.connect()
/usr/lib/python3.5/http/client.py:877:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
def connect(self):
> conn = self._new_conn()
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>
def _new_conn(self):
""" Establish a socket connection and set nodelay settings on it.
:return: New socket connection.
"""
extra_kw = {}
if self.source_address:
extra_kw['source_address'] = self.source_address
if self.socket_options:
extra_kw['socket_options'] = self.socket_options
try:
conn = connection.create_connection(
(self.host, self.port), self.timeout, **extra_kw)
except SocketTimeout as e:
raise ConnectTimeoutError(
self, "Connection to %s timed out. (connect timeout=%s)" %
(self.host, self.timeout))
except SocketError as e:
raise NewConnectionError(
> self, "Failed to establish a new connection: %s" % e)
E urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>: Failed to establish a new connection: [Errno 111] Connection refused
../../.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py:150: NewConnectionError
During handling of the above exception, another exception occurred:
self = <cars.tests.CarTest testMethod=test_car>
@elasticmock
def setUp(self):
self.car = Car(name='Name', color='Blue', description='Description', type=1)
> self.car.save()
cars/tests/test_timeline.py:31:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cars/models/car.py:86: in save
super(Car, self).save(*args, **kwargs)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/django/db/models/base.py:796: in save
force_update=force_update, update_fields=update_fields)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/django/db/models/base.py:833: in save_base
update_fields=update_fields, raw=raw, using=using)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/django/dispatch/dispatcher.py:191: in send
response = receiver(signal=self, sender=sender, **named)
../django-elasticsearch-dsl/django_elasticsearch_dsl/signals.py:10: in update_document
registry.update(instance)
../django-elasticsearch-dsl/django_elasticsearch_dsl/registries.py:29: in update
doc.update(instance, **kwargs)
../django-elasticsearch-dsl/django_elasticsearch_dsl/documents.py:165: in update
return self.bulk(actions, **kwargs)
../django-elasticsearch-dsl/django_elasticsearch_dsl/documents.py:147: in bulk
refresh=refresh, **kwargs)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py:188: in bulk
for ok, item in streaming_bulk(client, actions, **kwargs):
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py:160: in streaming_bulk
for result in _process_bulk_chunk(client, bulk_actions, raise_on_exception, raise_on_error, **kwargs):
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py:89: in _process_bulk_chunk
raise e
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py:85: in _process_bulk_chunk
resp = client.bulk('\n'.join(bulk_actions) + '\n', **kwargs)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/client/utils.py:69: in _wrapped
return func(*args, params=params, **kwargs)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/client/__init__.py:782: in bulk
doc_type, '_bulk'), params=params, body=self._bulk_body(body))
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/transport.py:307: in perform_request
status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <Urllib3HttpConnection: http://localhost:9200>, method = 'POST'
url = '/_bulk?refresh=true', params = {'refresh': b'true'}
body = b'{"index": {"_id": 1, "_type": "car_document", "_index": "car"}}\n{"name": "Name", "color": "Blue", "description": "Description", "type": 1}\n'
timeout = None, ignore = ()
def perform_request(self, method, url, params=None, body=None, timeout=None, ignore=()):
url = self.url_prefix + url
if params:
url = '%s?%s' % (url, urlencode(params))
full_url = self.host + url
start = time.time()
try:
kw = {}
if timeout:
kw['timeout'] = timeout
# in python2 we need to make sure the url and method are not
# unicode. Otherwise the body will be decoded into unicode too and
# that will fail (#133, #201).
if not isinstance(url, str):
url = url.encode('utf-8')
if not isinstance(method, str):
method = method.encode('utf-8')
response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
duration = time.time() - start
raw_data = response.data.decode('utf-8')
except UrllibSSLError as e:
self.log_request_fail(method, full_url, body, time.time() - start, exception=e)
raise SSLError('N/A', str(e), e)
except ReadTimeoutError as e:
self.log_request_fail(method, full_url, body, time.time() - start, exception=e)
raise ConnectionTimeout('TIMEOUT', str(e), e)
except Exception as e:
self.log_request_fail(method, full_url, body, time.time() - start, exception=e)
> raise ConnectionError('N/A', str(e), e)
E elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7fa304c3b8d0>: Failed to establish a new connection: [Errno 111] Connection refused)
../../.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/connection/http_urllib3.py:89: ConnectionError
----------------------------- Captured stderr call -----------------------------
2017-05-23 09:38:37,013 Converted retries value: False -> Retry(total=False, connect=None, read=None, redirect=0)
2017-05-23 09:38:37,013 Starting new HTTP connection (1): localhost
2017-05-23 09:38:37,014 POST http://localhost:9200/_bulk?refresh=true [status:N/A request:0.001s]
Traceback (most recent call last):
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py", line 141, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/connection.py", line 83, in create_connection
raise err
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/connection.py", line 73, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/elasticsearch/connection/http_urllib3.py", line 78, in perform_request
response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py", line 649, in urlopen
_stacktrace=sys.exc_info()[2])
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/util/retry.py", line 324, in increment
raise six.reraise(type(error), error, _stacktrace)
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/packages/six.py", line 686, in reraise
raise value
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connectionpool.py", line 356, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.5/http/client.py", line 1106, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python3.5/http/client.py", line 1151, in _send_request
self.endheaders(body)
File "/usr/lib/python3.5/http/client.py", line 1102, in endheaders
self._send_output(message_body)
File "/usr/lib/python3.5/http/client.py", line 934, in _send_output
self.send(msg)
File "/usr/lib/python3.5/http/client.py", line 877, in send
self.connect()
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py", line 166, in connect
conn = self._new_conn()
File "/home/artur/.virtualenvs/carsenv/lib/python3.5/site-packages/urllib3/connection.py", line 150, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fa304c3bb70>: Failed to establish a new connection: [Errno 111] Connection refused
2017-05-23 09:38:37,015 > {"index": {"_id": 1, "_type": "car_document", "_index": "car"}}
{"name": "Name", "color": "Blue", "description": "Description", "type": 1}
[the same NewConnectionError traceback repeats for retry connections (2) through (4)]
============================ pytest-warning summary ============================
WC1 None pytest_funcarg__cov: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0. Please remove the prefix and use the @pytest.fixture decorator instead.
================= 1 failed, 1 pytest-warnings in 3.01 seconds ==================
I'm very new to testing and I'm having quite a hard time getting started with elasticmock. Could you provide (either here or in the README) a small (but real) example of how to use it?
For now, I am trying to make it work with the following piece of code:
from unittest import TestCase

from elasticmock import elasticmock
from elasticmock import FakeElasticsearch

class TestClass(TestCase):

    @elasticmock
    def test_should_return_something_from_elasticsearch(self):
        es = FakeElasticsearch()
        self.assertTrue(es.indices.get(index="_all"))
(I'm aware indices.get won't return a boolean, but that's not the point here.)
Thank you very much anyway! It looks like a wonderful lib!
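The key to the question above is that the @elasticmock decorator patches elasticsearch.Elasticsearch for the duration of the test, so tests should instantiate Elasticsearch (not FakeElasticsearch) inside the decorated function. The following stdlib-only sketch mimics that mechanism with unittest.mock; RealClient and FakeClient are hypothetical stand-ins for elasticsearch.Elasticsearch and elasticmock's FakeElasticsearch.

```python
from unittest import mock

class RealClient:
    """Stand-in for elasticsearch.Elasticsearch (hypothetical)."""
    def ping(self):
        raise ConnectionRefusedError("would try to reach localhost:9200")

class FakeClient:
    """Stand-in for elasticmock's FakeElasticsearch (hypothetical)."""
    def ping(self):
        return True

def test_with_patched_client():
    # @elasticmock works like this patch: it swaps the client class for a
    # fake one, so the client must be created *inside* the decorated test
    # for the substitution to take effect.
    with mock.patch(f"{__name__}.RealClient", FakeClient):
        client = RealClient()  # actually constructs FakeClient
        return client.ping()
```

Constructing FakeElasticsearch directly (as in the snippet above) bypasses the decorator entirely, which is why the mocked state never lines up with what the test expects.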
I'm just about to switch to opensearch-py and opensearch-dsl. Since I rely on elasticmock for my testing, I will probably have to fork this, but I wanted to check whether there are any plans to support the AWS flavor of Elasticsearch here.
Thanks for writing this library!
The fake Elasticsearch client is missing the logical OR ("should") QueryType.
File "/opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/elasticmock/fake_elasticsearch.py", line 108, in _evaluate_for_compound_query_type
QueryType.get_query_type(query_type),
File "/opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages/elasticmock/fake_elasticsearch.py", line 48, in get_query_type
raise NotImplementedError(f'type {type_str} is not implemented for QueryType')
NotImplementedError: type should is not implemented for QueryType
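For reference, "should" clauses in a bool query are a logical OR over the sub-clauses (when no minimum_should_match is set). A minimal, standalone sketch of that semantics, with hypothetical helper names, not elasticmock's actual internals:

```python
def evaluate_should(should_clauses, document, evaluate_condition):
    """Hypothetical 'should' semantics: the bool query matches when at
    least one sub-clause matches (logical OR), mirroring Elasticsearch's
    behaviour when there is no minimum_should_match constraint."""
    return any(evaluate_condition(clause, document) for clause in should_clauses)

def term_matches(clause, document):
    # Toy evaluator for term queries, for illustration only.
    field, value = next(iter(clause["term"].items()))
    return document.get(field) == value

doc = {"color": "blue", "size": 3}
clauses = [{"term": {"color": "red"}}, {"term": {"size": 3}}]
matched = evaluate_should(clauses, doc, term_matches)  # one clause matches
```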
Currently the create endpoint is not supported by the fake class.
When using with elasticsearch_dsl (7.1.0) and elasticsearch (7.5.1), calling scan() ends up with a KeyError.
for ent in search.scan():
File ".../elasticsearch_dsl/search.py", line 715, in scan
for hit in scan(
File ".../elasticsearch/helpers/actions.py", line 449, in scan
if (resp["_shards"]["successful"] + resp["_shards"]["skipped"]) < resp[
KeyError: 'skipped'
It looks to be a result of this change in elasticsearch-py, which leads to a missing key in ElasticMock's _shards status dicts; it may be fixable by adding "skipped": 0 here in fake_elasticsearch.py.
Happy to do a PR for the two _shards dicts, if that is the right approach.
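The proposal above can be sanity-checked against the expression the scan() helper evaluates. A standalone sketch (shards_ok is a hypothetical paraphrase of the helper's check, not its actual code):

```python
def shards_ok(resp):
    """Paraphrase of the check in elasticsearch-py's scan() helper: it
    reads both 'successful' and 'skipped', so a mocked response missing
    either key raises KeyError before any comparison happens."""
    shards = resp["_shards"]
    return (shards["successful"] + shards["skipped"]) >= shards["total"]

# A response shaped like ElasticMock's, with the proposed "skipped": 0 added.
mock_response = {
    "hits": {"total": 1, "hits": []},
    "_shards": {"total": 1, "successful": 1, "skipped": 0, "failed": 0},
}
```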
Elasticmock is great, but I can't find a proper way to use it with aioelasticsearch. Is there any way to do this?
A suggested feature request:
For APIs that use Elastic on the backend, the client is often instantiated early on. It would be nice to simulate the instantiation of the client as successful, but have subsequent calls receive server errors from Elastic.
Basic example test:
def test_api_elastic_down(self):
    res = self.client().post('/', json=test_data)
    self.assertEqual(res.status_code, 500)
In the above test, es = Elasticsearch(hosts=es_hosts) would have been instantiated early on, possibly when the app first started.
This would be useful for workflows where, if Elastic is down, the developer wants to send the data to a queueing system for processing once Elastic is back up.
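Until elasticmock supports failure injection, one workaround is to patch the already-instantiated client's methods with a side_effect. A stdlib-only sketch; EsDown, app_es_client, and ingest are hypothetical names standing in for elasticsearch.exceptions.ConnectionError, the app's client, and the endpoint body:

```python
from unittest import mock

class EsDown(Exception):
    """Stands in for elasticsearch.exceptions.ConnectionError."""

# Pretend this client was instantiated when the app first started.
app_es_client = mock.MagicMock(name="es")

def ingest(payload):
    """Hypothetical endpoint body: 500 when Elasticsearch is unreachable."""
    try:
        app_es_client.index(index="events", body=payload)
        return 201
    except EsDown:
        return 500

# Instantiation already succeeded; now make *subsequent calls* fail,
# simulating Elasticsearch going down mid-flight.
app_es_client.index.side_effect = EsDown("connection refused")
status = ingest({"msg": "hello"})  # the endpoint maps the failure to 500
```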
When trying to mock the Elasticsearch bulk API:
Error
Traceback (most recent call last):
File "/home/axel/Mulesoft/arm-performance/integration_tests/functional/ElasticStatsDeployerTest.py", line 38, in test_push_stats
elastic_instance=self.elastic_instance
File "/home/axel/Mulesoft/arm-performance/src/modules/stats/online/stats_processing/ElasticStatsDeployer.py", line 56, in post_process_stats
self.push_stats_to_elastic_search(elastic_instance, actions_list)
File "/home/axel/Mulesoft/arm-performance/src/modules/stats/online/stats_processing/ElasticStatsDeployer.py", line 60, in push_stats_to_elastic_search
helpers.bulk(elastic_instance, actions_list, chunk_size=ElasticStatsDeployer.max_chunk_size, max_chunk_bytes=ElasticStatsDeployer.max_bytes_size)
File "/home/axel/Mulesoft/virtualenvs/anypoint3.5/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py", line 195, in bulk
for ok, item in streaming_bulk(client, actions, **kwargs):
File "/home/axel/Mulesoft/virtualenvs/anypoint3.5/lib/python3.5/site-packages/elasticsearch/helpers/__init__.py", line 162, in streaming_bulk
for bulk_actions in _chunk_actions(actions, chunk_size, max_chunk_bytes, client.transport.serializer):
AttributeError: 'FakeElasticsearch' object has no attribute 'transport'
Also, when trying to use the cluster.health method:
Error
Traceback (most recent call last):
File "/home/axel/Mulesoft/arm-performance/integration_tests/functional/ElasticStatsDeployerTest.py", line 38, in test_push_stats
elastic_instance=self.elastic_instance
File "/home/axel/Mulesoft/arm-performance/src/modules/stats/online/stats_processing/ElasticStatsDeployer.py", line 38, in post_process_stats
if not elastic_instance.cluster.health(wait_for_status='green', request_timeout=10):
AttributeError: 'FakeElasticsearch' object has no attribute 'cluster'
I can help add this to the FakeElasticsearch class, but I do not know how to, in principle :)
Judging from another issue, it's not maintained, but I hope you still accept PRs. I will submit one ASAP.
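As a stopgap for the two AttributeErrors above, tests can bolt minimal stubs onto the fake client so helpers.bulk (which reads client.transport.serializer) and cluster.health() calls get past the attribute lookup. A standalone sketch under the assumption that this attribute shape matches the real client; the stub behaviour itself is invented:

```python
import json
from types import SimpleNamespace

class FakeSerializer:
    """Minimal stand-in for elasticsearch's JSON serializer: pass strings
    through, JSON-encode everything else."""
    def dumps(self, data):
        return data if isinstance(data, str) else json.dumps(data)

def add_missing_attrs(fake_client):
    """Attach just enough state for helpers.bulk (client.transport.serializer)
    and for cluster.health() calls. The attribute names follow the real
    client; the always-green health response is an assumption for tests."""
    fake_client.transport = SimpleNamespace(serializer=FakeSerializer())
    fake_client.cluster = SimpleNamespace(
        health=lambda **kwargs: {"status": "green"})
    return fake_client

client = add_missing_attrs(SimpleNamespace())
```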
Hi, sweet tool you created.
Are there any plans to support elasticsearch 8?
Steps to reproduce:
mkdir test-install-elasticmock && cd test-install-elasticmock
pipenv --two && pipenv shell
Using /usr/local/bin/python2.7 (2.7.17) to create virtualenv...
pipenv install elasticmock
The following error shows up:
Error: An error occurred while installing elasticmock!
Error text: Collecting elasticmock
Using cached ElasticMock-1.6.1.tar.gz (13 kB)
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. pip 21.0 will drop support for Python 2.7 in January 2021. More details about Python 2 support in pip can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support pip 21.0 will remove support for this functionality.
ERROR: Command errored out with exit status 1:
command: /Users/zzheng/.local/share/virtualenvs/test-install-elasticmock-pvSvxvgZ/bin/python2.7 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/sk/9mxz4q1x1pg9xb9vkws028s40000gn/T/pip-install-maD_Va/elasticmock/setup.py'"'"'; __file__='"'"'/private/var/folders/sk/9mxz4q1x1pg9xb9vkws028s40000gn/T/pip-install-maD_Va/elasticmock/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /private/var/folders/sk/9mxz4q1x1pg9xb9vkws028s40000gn/T/pip-pip-egg-info-ECi5s2
cwd: /private/var/folders/sk/9mxz4q1x1pg9xb9vkws028s40000gn/T/pip-install-maD_Va/elasticmock/
Complete output (5 lines):
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/private/var/folders/sk/9mxz4q1x1pg9xb9vkws028s40000gn/T/pip-install-maD_Va/elasticmock/setup.py", line 10, in <module>
with open(path.join(this_directory, 'README.md'), encoding='utf-8') as f:
TypeError: 'encoding' is an invalid keyword argument for this function
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
This is likely caused by a bug in elasticmock. Report this to its maintainers.
Suggestion:
In Python 2, open() doesn't have an encoding parameter. Adding from io import open should avoid this error.
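The suggested fix applied to the failing setup.py pattern might look like this (read_long_description is a hypothetical helper mirroring the README read in setup.py; on Python 3, io.open is simply an alias for the builtin, so the change is harmless there):

```python
# On Python 2 the builtin open() has no `encoding` argument; io.open()
# does, and on Python 3 io.open is the same object as the builtin.
from io import open
from os import path

def read_long_description(directory):
    # Mirrors setup.py's README read, with the io.open fix applied.
    with open(path.join(directory, 'README.md'), encoding='utf-8') as f:
        return f.read()
```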
The README.md
isn't included in the python package, so this change e674b74#diff-2eeaed663bd0d25b7e608891384b7298 breaks installs of version 1.3.4
From what I can tell, https://elasticsearch-py.readthedocs.io/en/7.x/api.html#elasticsearch.Elasticsearch.update is not supported by the mocking client. I think when I try to call the function on the mock, it falls through to the super implementation, which tries to contact a server; of course there is no server, so I get ConnectionRefusedError.
In my project I've decided to instead just use .index(id=...) to update documents, but I imagine the real client does some smarter, more efficient stuff via .update, with more bells and whistles that I'm missing out on, so it would be nice to support this!
By the way, loving the tool, it's made writing my tests so much easier!
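For the record, the simple (non-scripted) form of update is just a get-merge-reindex, so a mock could support it with very little code. A standalone sketch over a plain dict store; fake_update and the store layout are hypothetical, not elasticmock's actual internals:

```python
def fake_update(store, index, doc_id, body):
    """Sketch of Elasticsearch's partial-update semantics for the simple
    'doc' form: merge body['doc'] into the stored _source. Scripted
    updates, upserts, and versioning are out of scope here."""
    source = store[index][doc_id]        # raises KeyError if missing,
    source.update(body.get("doc", {}))   # like a 404 from the real API
    return {"_index": index, "_id": doc_id, "result": "updated"}

store = {"my-index": {"1": {"name": "Name", "color": "Blue"}}}
result = fake_update(store, "my-index", "1", {"doc": {"color": "Red"}})
```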
_source_includes must be accepted in the query params of e.g. the search method.
Right now only _source_include is supported.
It looks like the elasticmock does not implement a mock for the CatClient.
Is there a particular reason why it hasn't been implemented yet?
I could take a stab at an implementation if there are no blockers.
I see that the mock client uses the random package to generate new IDs for documents (specifically random.choice).
In my project, I am generating an ES document after each test is run, and resetting the random package's seed before each test is run (this part is out of my control). Since the mock client uses the random package to generate IDs, it generates the same ID for every test, which is not ideal for testing my document posting. I wonder if there is a way around this?
Repro (test_repro.py):
import elasticmock
import elasticsearch
import pytest
import random

@pytest.fixture
@elasticmock.elasticmock
def client(scope='session'):
    c = elasticsearch.Elasticsearch()
    c.indices.create('test')
    return c

@pytest.fixture(autouse=True)
def setup_and_teardown(client):
    random.seed(1)
    yield
    print(client.index(index='test', body={'test': 'test'})['_id'])

def test_1():
    random.choice('abcde')
    assert 1 == 1

def test_2():
    random.choice('12345')
    assert 1 == 1
Run with
python3 -m pip install pytest elasticmock elasticsearch
python3 -m pytest test_repro.py
Observe:
test_repro.py .K2ZWeqhFWCEPyYngFb51
.K2ZWeqhFWCEPyYngFb51
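One way around this, until the mock stops drawing from the global random module, is to generate IDs yourself from a source that reseeding cannot affect and pass them explicitly via index(id=...). A standalone sketch; make_doc_id is a hypothetical helper:

```python
import random
import uuid

def make_doc_id():
    # uuid.uuid4() draws from os.urandom, not the `random` module, so
    # calling random.seed() between tests cannot make the IDs repeat.
    return uuid.uuid4().hex

random.seed(1)
first = make_doc_id()
random.seed(1)        # same reseed that collapses the mock's IDs
second = make_doc_id()  # still distinct
```

In the repro above, that would mean client.index(index='test', id=make_doc_id(), body=...) instead of letting the mock pick the ID.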
When searching and keeping only selected attributes (using _source inside body), the fake search method ignores this and returns all attributes. This test fails, but when run against an ES server (not the mock) it passes. You can add this test to the file ./tests/test_search.py.
def test_correct_columns(self):
    self.es.index('index_for_search', doc_type=DOC_TYPE, body={'id': 1, 'data_x': 2, 'data_y': 'test'})
    body = {
        'query': {'term': {'id': 1}},
        '_source': ['id', 'data_x'],
    }
    response = self.es.search(index='index_for_search', doc_type=DOC_TYPE, body=body, _source=['id', 'data_x'])
    self.assertEqual(1, response['hits']['total'])
    doc = response['hits']['hits'][0]['_source']
    self.assertSetEqual({'id', 'data_x'}, set(doc))
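The filtering the fake search would need per hit is small. A standalone sketch; filter_source is a hypothetical helper, and it deliberately handles only the list form of _source (the real API also accepts wildcard patterns and an includes/excludes dict):

```python
def filter_source(source, requested_fields):
    """Keep only the attributes listed in the query's `_source` option.
    Wildcards and the {'includes': [...], 'excludes': [...]} form are
    out of scope for this sketch."""
    if requested_fields is None:  # no filtering requested
        return source
    return {k: v for k, v in source.items() if k in requested_fields}

hit = {'id': 1, 'data_x': 2, 'data_y': 'test'}
filtered = filter_source(hit, ['id', 'data_x'])  # drops data_y
```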
I'm trying to use elasticsearch.helpers.scan with the mocked ES. It apparently doesn't work because the results don't include a _scroll_id: https://github.com/elastic/elasticsearch-py/blob/9fe0763670633848b521ff9df6350bc811f4f110/elasticsearch/helpers/__init__.py#L367. Is it insane to think about adding this functionality, or does that get way complicated?
Implement a mock bulk method that behaves like the real one:
https://elasticsearch-py.readthedocs.io/en/master/api.html#elasticsearch.Elasticsearch.bulk
import elasticsearch

class FooService:
    def __init__(self):
        self.es = elasticsearch.Elasticsearch(hosts=[{'host': 'localhost', 'port': 9200}])

    def create(self, index, body):
        es_object = self.es.index(index, body)
        return es_object.get('_id')

    def read(self, index, id):
        es_object = self.es.get(index, id)
        return es_object.get('_source')

This works fine. But the code below does not work:
from elasticsearch import Elasticsearch

class FooService:
    def __init__(self):
        self.es = Elasticsearch(hosts=[{'host': 'localhost', 'port': 9200}])

    def create(self, index, body):
        es_object = self.es.index(index, body)
        return es_object.get('_id')

    def read(self, index, id):
        es_object = self.es.get(index, id)
        return es_object.get('_source')
================================================
I'm getting errors like this:
es_object = es.index(index, settings.config["es_index_jobs_mappings"])
python/tests/elastic_search/test_es_module.py:221:
../../../../my_env/local/lib/python3.8/site-packages/elasticsearch/client/utils.py:152: in _wrapped
return func(*args, params=params, headers=headers, **kwargs)
../../../../my_env/local/lib/python3.8/site-packages/elasticsearch/client/__init__.py:398: in index
return self.transport.perform_request(
../../../../my_env/local/lib/python3.8/site-packages/elasticsearch/transport.py:390: in perform_request
raise e
../../../../my_env/local/lib/python3.8/site-packages/elasticsearch/transport.py:358: in perform_request
status, headers_response, data = connection.perform_request(
self = <Urllib3HttpConnection: http://localhost:9200>, method = 'POST', url = '/xxxxxxxxxxxx/_doc', params = {}
body = b'{}'
timeout = None, ignore = (), headers = {}
def perform_request(
self, method, url, params=None, body=None, timeout=None, ignore=(), headers=None
):
url = self.url_prefix + url
if params:
url = "%s?%s" % (url, urlencode(params))
full_url = self.host + url
start = time.time()
orig_body = body
try:
kw = {}
if timeout:
kw["timeout"] = timeout
# in python2 we need to make sure the url and method are not
# unicode. Otherwise the body will be decoded into unicode too and
# that will fail (#133, #201).
if not isinstance(url, str):
url = url.encode("utf-8")
if not isinstance(method, str):
method = method.encode("utf-8")
request_headers = self.headers.copy()
request_headers.update(headers or ())
if self.http_compress and body:
body = self._gzip_compress(body)
request_headers["content-encoding"] = "gzip"
response = self.pool.urlopen(
method, url, body, retries=Retry(False), headers=request_headers, **kw
)
duration = time.time() - start
raw_data = response.data.decode("utf-8", "surrogatepass")
except Exception as e:
self.log_request_fail(
method, full_url, url, orig_body, time.time() - start, exception=e
)
if isinstance(e, UrllibSSLError):
raise SSLError("N/A", str(e), e)
if isinstance(e, ReadTimeoutError):
raise ConnectionTimeout("TIMEOUT", str(e), e)
raise ConnectionError("N/A", str(e), e)
E elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7f706a2b22b0>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7f706a2b22b0>: Failed to establish a new connection: [Errno 111] Connection refused)
The tests directory is deployed and included in site-packages as a top-level package. This can cause errors when using frameworks like the Serverless Framework, which attempts to symlink all top-level site-packages into the root of your project directory if it uses the same folder structure. It seems it would be best practice to nest the tests directory within the deployed package, or to exclude it from deployment.
Hello all,
When I was introducing elasticmock for unit testing, I found that it doesn't support the "must", "should" and "must_not" features of elasticsearch_dsl.
Has anyone experienced this issue? If so, what do you advise me to do?
To the elasticmock team: do you plan to resolve this issue in the next release? For now, what's the alternative?
I would be really grateful if you could help me move forward :)
While I was looking at #87 and elasticmock's lack of support for the Elasticsearch.update function, I found that this lack of support is a bit non-obvious: the mocker doesn't implement the function, so it falls through to the super class, which does the real thing and tries to contact the server; if there is no server, it fails with ConnectionRefused. It's a little obscure to interpret this as a "not supported" error, but I wonder if there could also be worse implications for someone who does have an important running server on their local machine and accidentally starts modifying real data on that server during their tests, while believing they are safely mocking. Either way, I wonder if it's in our power to give a more meaningful error message in these cases, e.g. override all the super methods to catch a ConnectionRefused and turn it into a NotImplementedError?
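The suggested safeguard could look like the decorator below. A standalone sketch: FakeConnectionError stands in for elasticsearch.exceptions.ConnectionError, and the update function here only simulates the fall-through failure; this is not elasticmock's actual code.

```python
import functools

class FakeConnectionError(Exception):
    """Stands in for elasticsearch.exceptions.ConnectionError."""

def guard_unimplemented(method):
    """Re-raise connection failures from a fall-through to the real
    client as an explicit 'not mocked' error, instead of letting the
    call look like (or worse, become) real network traffic."""
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        try:
            return method(*args, **kwargs)
        except FakeConnectionError as exc:
            raise NotImplementedError(
                f"{method.__name__} is not mocked by elasticmock") from exc
    return wrapper

@guard_unimplemented
def update(index, doc_id, body):
    # Simulates the un-mocked method hitting a non-existent server.
    raise FakeConnectionError("[Errno 111] Connection refused")
```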
Using Python 3.7, elasticsearch==7.10.0 and ElasticMock==1.6.2
Relevant part of the traceback
  File "__main__", line 189, in fit_iter
    elasticsearch.helpers.bulk(es, it)
  File "C:\Python37\lib\site-packages\elasticsearch\helpers\actions.py", line 390, in bulk
    for ok, item in streaming_bulk(client, actions, *args, **kwargs):
  File "C:\Python37\lib\site-packages\elasticsearch\helpers\actions.py", line 302, in streaming_bulk
    actions, chunk_size, max_chunk_bytes, client.transport.serializer
AttributeError: 'FakeElasticsearch' object has no attribute 'transport'
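Based on the traceback above, `elasticsearch.helpers.bulk` only needs `client.transport.serializer`. A workaround might be to bolt a minimal transport onto the fake client before calling the helper. This is a sketch under the assumption that a `dumps`/`loads` serializer interface is all the helper reads; the attribute names come from the traceback, not from elasticmock's documented API.

```python
import json
from types import SimpleNamespace


class JSONSerializer:
    """Minimal serializer exposing the dumps/loads interface the
    bulk helper appears to use (an assumption from the traceback)."""

    def dumps(self, data):
        if isinstance(data, str):
            return data
        return json.dumps(data, separators=(",", ":"))

    def loads(self, s):
        return json.loads(s)


def add_fake_transport(client):
    """Attach a transport namespace so helpers that read
    client.transport.serializer don't raise AttributeError."""
    client.transport = SimpleNamespace(serializer=JSONSerializer())
    return client
```

Applying `add_fake_transport(es)` to the mocked client before `elasticsearch.helpers.bulk(es, it)` may get past this particular AttributeError, though the helper could still depend on other unmocked behavior.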
I am attempting to bulk create a few documents by passing this to bulk
:
{"create": {"customer_id": 1, "field2": "doc1", "_index": "my_index"}}
{"create": {"customer_id": 1, "field2": "doc2", "_index": "my_index"}}
It always returns this:
{
  "errors": false,
  "items": []
}
Stepping through things in a debugger takes me into fake_elasticsearch.py
where I can see the documents are never added to the internal dictionary. It's almost as if some indentation got messed up.
It goes into this block:
if any(action in line for action in ['index', 'create', 'update', 'delete']):
-- SNIP! --
...which makes sure the index exists:
if index not in self.__documents_dict:
    self.__documents_dict[index] = list()
So far, so good. But that's where it stops. Everything else that needs to happen lives under the following else clause: in particular, creating the item and adding it both to self.__documents_dict[index] and to the items list.
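For reference, the standard bulk body is newline-delimited JSON where each action line (except delete) is followed by a document line. The behavior the report expects can be sketched like this; it is a toy illustration of the format, not elasticmock's actual implementation, and the response shape is simplified.

```python
import json


def fake_bulk(documents_dict, body):
    """Process a newline-delimited bulk body: each action line is
    followed by a document line (delete has no document), and every
    stored document is also reported in the returned items list."""
    items = []
    lines = [json.loads(line) for line in body.splitlines() if line.strip()]
    i = 0
    while i < len(lines):
        action_line = lines[i]
        action = next(iter(action_line))        # e.g. 'index' or 'create'
        meta = action_line[action]
        index = meta.get("_index")
        documents_dict.setdefault(index, [])
        if action == "delete":
            i += 1                              # delete has no payload line
            continue
        document = lines[i + 1]                 # the payload line
        documents_dict[index].append(document)  # the step the bug skips
        items.append({action: {"_index": index, "status": 201}})
        i += 2
    return {"errors": False, "items": items}
```

The key point is that both the append to the store and the append to `items` must execute for every non-delete action pair, which matches the reporter's reading that the real code returns early.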
Hello,
There has been no release of elasticmock since June 2017. But three PRs have been merged since then, so some improvements are not yet available.
So, I think a release would be much appreciated :)
Thanks again for your project!
The elastic mock requires that each "action" in the body parameter of the elasticsearch.Elasticsearch.bulk method include a _type field. (The body is newline-separated action/document combinations.)
https://github.com/vrcmarcos/elasticmock/blob/master/elasticmock/fake_elasticsearch.py#L331
When running this against a Dockerized ES server (version 7.10.2), however, I get:
elasticsearch/connection/base.py:193: ElasticsearchDeprecationWarning: [types removal] Specifying types in bulk requests is deprecated.
Here is the explanation for why: https://www.elastic.co/guide/en/elasticsearch/reference/7.x/removal-of-types.html
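A fix on the mock side might be to make _type optional when parsing an action line, defaulting to "_doc" as Elasticsearch 7.x does. This is a hedged sketch of one way to parse such a line, not elasticmock's actual parsing code.

```python
import json


def parse_bulk_action(line):
    """Parse one bulk action line, defaulting the document type to
    '_doc' when '_type' is absent (types are deprecated in ES 7.x)."""
    action_line = json.loads(line)
    action = next(iter(action_line))   # 'index', 'create', 'update', 'delete'
    meta = action_line[action]
    index = meta.get("_index")
    doc_type = meta.get("_type", "_doc")   # optional since ES 7.x
    return action, index, doc_type
```

With this, a body written for 7.x (no _type) and a legacy body (explicit _type) would both be accepted.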
Hi All,
We are running some tests that have not changed, and it appears that must
is undefined in get_query_type
in version 1.6.1, though it was not broken in version 1.5.1. A representative trace of the issue:
.tox/py36/lib/python3.6/site-packages/elasticsearch_dsl/search.py:706: in execute
    **self._params
.tox/py36/lib/python3.6/site-packages/elasticmock/behaviour/server_failure.py:27: in decorated
    response = f(*args, **kwargs)
.tox/py36/lib/python3.6/site-packages/elasticsearch/client/utils.py:84: in _wrapped
    return func(*args, params=params, **kwargs)
.tox/py36/lib/python3.6/site-packages/elasticmock/fake_elasticsearch.py:377: in search
    if condition.evaluate(document):
.tox/py36/lib/python3.6/site-packages/elasticmock/fake_elasticsearch.py:54: in evaluate
    return self._evaluate_for_query_type(document)
.tox/py36/lib/python3.6/site-packages/elasticmock/fake_elasticsearch.py:64: in _evaluate_for_query_type
    return self._evaluate_for_compound_query_type(document)
.tox/py36/lib/python3.6/site-packages/elasticmock/fake_elasticsearch.py:102: in _evaluate_for_compound_query_type
    QueryType.get_query_type(query_type),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
type_str = 'must'
    @staticmethod
    def get_query_type(type_str):
        if type_str == 'bool':
            return QueryType.BOOL
        elif type_str == 'filter':
            return QueryType.FILTER
        elif type_str == 'match':
            return QueryType.MATCH
        elif type_str == 'term':
            return QueryType.TERM
        elif type_str == 'terms':
            return QueryType.TERMS
        else:
>           raise NotImplementedError(f'type {type_str} is not implemented for QueryType')
E NotImplementedError: type must is not implemented for QueryType
.tox/py36/lib/python3.6/site-packages/elasticmock/fake_elasticsearch.py:42: NotImplementedError
We can push out a fix, but we are wondering whether there was a reason that must is no longer defined. For reference, we are on elasticsearch 6.5.
Cheers,
Steve
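One way the fix could look: restore the compound clause names in the enum and map strings via the enum values rather than a hand-written if/elif chain. This is a sketch; the actual member names and values in elasticmock's QueryType may differ.

```python
from enum import Enum


class QueryType(Enum):
    BOOL = "bool"
    FILTER = "filter"
    MATCH = "match"
    TERM = "term"
    TERMS = "terms"
    MUST = "must"          # compound clauses restored
    SHOULD = "should"
    MUST_NOT = "must_not"

    @staticmethod
    def get_query_type(type_str):
        """Look the clause name up by enum value; unknown names still
        raise NotImplementedError as the current code does."""
        try:
            return QueryType(type_str)
        except ValueError:
            raise NotImplementedError(
                f'type {type_str} is not implemented for QueryType')
```

Driving the lookup off the enum values means adding a new clause later only requires a new member, with no change to get_query_type.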
Hi all,
I'm looking for active maintainers for this library. I think we can add 2 or 3 more people to actively develop this library, or at least to guide contributors.
If you are interested, please comment in this issue.
(Mentioning all contributors to see if they had interest in it):
@xrmx @snakeye @frivoire @asherf @mohantyashish109 @mattt-b @jmlw @infinite-Joy @garncarz @barseghyanartur @tcatrain @charl-van-niekerk @bowlofstew @askoretskiy
Ty!
Using Python 3.7, elasticsearch==7.10.0 and ElasticMock==1.6.2
Match_all queries don't work:
es.search({"query": {"match_all": {}}})
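For illustration, the expected match_all semantics are simple: every stored document is a hit. A toy evaluation (not elasticmock's code; the store shape and response shape are assumptions) might look like:

```python
def fake_search(documents, body):
    """Return every stored document when the query is match_all;
    other query types are deliberately omitted from this sketch."""
    query = (body or {}).get("query", {})
    if "match_all" in query:
        hits = list(documents)
    else:
        hits = []  # other query types not handled here
    return {
        "hits": {
            "total": {"value": len(hits)},
            "hits": [{"_source": doc} for doc in hits],
        }
    }
```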
I have written the following unit test.
from unittest import TestCase

from elasticmock import elasticmock
from elasticsearch import Elasticsearch


def some_function_that_uses_elasticsearch():
    client = Elasticsearch(hosts=[{"host": "localhost", "port": 9200}])
    id = client.index("test-index", {"a": "b"})
    print(id)
    return True


class TestClass(TestCase):
    @elasticmock
    def test_should_return_something_from_elasticsearch(self):
        self.assertIsNotNone(some_function_that_uses_elasticsearch())
If I run the test without a local instance of Elasticsearch running I get the following error.
elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7ff720769370>:
Failed to establish a new connection: [Errno 61] Connection refused) caused by:
NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7ff720769370>:
Failed to establish a new connection:
[Errno 61] Connection refused)
If I step through in the debugger I see that the client is an Elasticsearch object and not a FakeElasticsearch object. Apparently the @elasticmock decorator has no effect.
I have the same problem with the pytest
framework.
from elasticmock import elasticmock
from elasticsearch import Elasticsearch


@elasticmock
def test_mocked_elasticsearch_client():
    client = Elasticsearch(hosts=[{"host": "localhost", "port": 9200}])
    id = client.index("test-index", {"a": "b"})
    print(id)
The "Code Example" with FooService on the project homepage does work, but I can't figure out what is wrong with the variants I have here.
This is elasticmock 1.8.1 and elasticsearch 7.13.3 on OS X with Python 3.9.15.
I have written a pytest
version of the sample code.
import elasticsearch


class FooService:
    def __init__(self):
        self.es = elasticsearch.Elasticsearch(hosts=[{'host': 'localhost', 'port': 9200}])

    def create(self, index, body):
        es_object = self.es.index(index, body)
        return es_object.get('_id')

    def read(self, index, id):
        es_object = self.es.get(index, id)
        return es_object.get('_source')


import pytest
from elasticmock import elasticmock


@elasticmock
def test_create_and_read_object():
    # Variables used to test
    index = 'test-index'
    expected_document = {
        'foo': 'bar'
    }

    # Instantiate service
    service = FooService()

    # Index document on ElasticSearch
    id = service.create(index, expected_document)
    assert id is not None

    # Retrieve document from ElasticSearch
    document = service.read(index, id)
    assert expected_document == document
When I run it I get the following error
$ pytest em.py
========================================== test session starts ===========================================
platform darwin -- Python 3.7.6, pytest-5.4.1, py-1.8.1, pluggy-0.13.1
rootdir: /Users/wmcneill/Documents/Misc
plugins: celery-4.4.2
collected 1 item
em.py F [100%]
================================================ FAILURES ================================================
______________________________________ test_create_and_read_object _______________________________________
    @elasticmock
    def test_create_and_read_object():
        # Variables used to test
        index = 'test-index'
        expected_document = {
            'foo': 'bar'
        }
        # Instantiate service
        service = FooService()
        # Index document on ElasticSearch
>       id = service.create(index, expected_document)
em.py:31:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
em.py:9: in create
    es_object = self.es.index(index, body)
../../greenfield/document_store/venv/lib/python3.7/site-packages/elasticmock/behaviour/server_failure.py:27: in decorated
    response = f(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (<elasticmock.fake_elasticsearch.FakeElasticsearch object at 0x1051cc110>, 'test-index', {'foo': 'bar'})
kwargs = {}, params = {}, headers = {}, p = 'request_timeout'
    @wraps(func)
    def _wrapped(*args, **kwargs):
        params = {}
        headers = {}
        if "params" in kwargs:
            params = kwargs.pop("params").copy()
        if "headers" in kwargs:
            headers = {
                k.lower(): v for k, v in (kwargs.pop("headers") or {}).items()
            }
        if "opaque_id" in kwargs:
            headers["x-opaque-id"] = kwargs.pop("opaque_id")
        for p in es_query_params + GLOBAL_PARAMS:
            if p in kwargs:
                v = kwargs.pop(p)
                if v is not None:
                    params[p] = _escape(v)
        # don't treat ignore, request_timeout, and opaque_id as other params to avoid escaping
        for p in ("ignore", "request_timeout"):
            if p in kwargs:
                params[p] = kwargs.pop(p)
>       return func(*args, params=params, headers=headers, **kwargs)
E       TypeError: index() got an unexpected keyword argument 'headers'
../../greenfield/document_store/venv/lib/python3.7/site-packages/elasticsearch/client/utils.py:92: TypeError
======================================== short test summary info =========================================
FAILED em.py::test_create_and_read_object - TypeError: index() got an unexpected keyword argument 'head...
=========================================== 1 failed in 0.28s ============================================
This is ElasticMock version 1.5.0 and elasticsearch version 7.6.0.
Hi there,
I just wanted to know whether ElasticMock supports 'indices' or not; if not, how can I mock them? I will send a PR if I find a solution.
Hello,
I'm wondering why indexing a new document with the same id does not replace the previous one. I found out that the entry with the same id is appended to the list, and get then returns the first entry that was inserted instead of the last.
For example, the following test fails:
@elasticmock
def test_es_mock():
    es_mock = elasticsearch.Elasticsearch(hosts=[{'host': 'localhost', 'port': 9200}])
    es_mock.index(index='m', doc_type='t', id='1', body={'test': 1})
    es_mock.index(index='m', doc_type='t', id='1', body={'test': 3})
    es_response = es_mock.get(index='m', doc_type='t', id='1')
    found_doc = es_response['_source']
    assert found_doc['test'] == 3
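The replace-on-same-id semantics the test expects can be sketched as follows. This is a toy store for illustration, not elasticmock's internals; the entry shape (`_id`, `_source`) is an assumption.

```python
def fake_index(documents_dict, index, doc_id, body):
    """Overwrite an existing entry with the same id instead of
    appending a duplicate, matching real Elasticsearch behavior."""
    documents = documents_dict.setdefault(index, [])
    for i, existing in enumerate(documents):
        if existing["_id"] == doc_id:
            documents[i] = {"_id": doc_id, "_source": body}
            return {"_id": doc_id, "result": "updated"}
    documents.append({"_id": doc_id, "_source": body})
    return {"_id": doc_id, "result": "created"}
```

With overwrite semantics, a subsequent get by id can simply return the single stored entry, and the test above would pass.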
Hi,
first of all, thank you for this package. I have noticed there may have been a regression introduced in version 1.3.7 when indexing a document with an existing id.
Here is a basic test to reproduce the problem
test_index.py
:
import unittest
from datetime import datetime
from unittest import TestCase

from elasticmock import FakeElasticsearch


class TestIndexing(TestCase):
    def test_insert_document(self):
        es = FakeElasticsearch()
        doc = {
            'author': 'kimchy',
            'text': 'Elasticsearch: cool. bonsai cool.',
            'timestamp': datetime.now(),
        }
        res = es.index(index="test-index", id=1, doc_type="_doc", body=doc)
        self.assertEqual(res["_id"], 1)


if __name__ == "__main__":
    unittest.main()
With elasticmock==1.3.6 this test runs fine, but with 1.3.7 I get the following result:
$ python test_index.py
E
======================================================================
ERROR: test_insert_document (__main__.TestIndexing)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test_index.py", line 16, in test_insert_document
    res = es.index(index="test-index", id=1, doc_type="_doc", body=doc)
  File "/Users/romain/Workspaces/Python/test-elasticmock/venv/lib/python3.7/site-packages/elasticsearch/client/utils.py", line 84, in _wrapped
    return func(*args, params=params, **kwargs)
  File "/Users/romain/Workspaces/Python/test-elasticmock/venv/lib/python3.7/site-packages/elasticmock/fake_elasticsearch.py", line 56, in index
    doc = self.get(index, id, doc_type)
  File "/Users/romain/Workspaces/Python/test-elasticmock/venv/lib/python3.7/site-packages/elasticsearch/client/utils.py", line 84, in _wrapped
    return func(*args, params=params, **kwargs)
  File "/Users/romain/Workspaces/Python/test-elasticmock/venv/lib/python3.7/site-packages/elasticmock/fake_elasticsearch.py", line 152, in get
    raise NotFoundError(404, json.dumps(error_data))
elasticsearch.exceptions.NotFoundError: NotFoundError(404, '{"_index": "test-index", "_type": "_doc", "_id": 1, "found": false}')
----------------------------------------------------------------------
Ran 1 test in 0.001s
FAILED (errors=1)
It appears that elasticmock tries to fetch the document from its internal document list, but since the document hasn't been created yet, it fails with a NotFoundError.
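The shape of a fix suggested by the traceback: treat a NotFoundError from the internal get as "create" instead of letting it escape. The toy store below illustrates that control flow only; it is not elasticmock's actual code, and NotFoundError here is a stand-in for elasticsearch.exceptions.NotFoundError.

```python
class NotFoundError(Exception):
    """Stand-in for elasticsearch.exceptions.NotFoundError."""


class MiniStore:
    """Toy document store used only to illustrate the fix."""

    def __init__(self):
        self.docs = {}

    def get(self, index, doc_id):
        try:
            return self.docs[(index, doc_id)]
        except KeyError:
            raise NotFoundError(404, "not found")

    def index(self, index, doc_id, body):
        # The fix: a missing document means "create", not a crash.
        try:
            self.get(index, doc_id)
            result = "updated"
        except NotFoundError:
            result = "created"
        self.docs[(index, doc_id)] = body
        return {"_id": doc_id, "result": result}
```

With this guard, indexing a brand-new id succeeds with result "created", and re-indexing the same id reports "updated", matching the pre-1.3.7 behavior the reporter describes.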
Hi @vrcmarcos, can you please guide me on implementing support for the async client?
Hello,
the official elasticsearch library offers the possibility to use the if_seq_no
and if_primary_term
parameters:
https://elasticsearch-py.readthedocs.io/en/master/api.html#elasticsearch.Elasticsearch.index
I think it would be nice to update this library to support those parameters for the index, delete, and update operations.
Thanks!
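For context, these parameters implement optimistic concurrency control: the write is rejected unless the stored document's sequence number and primary term match the caller's expectation. A minimal sketch of those semantics (the document shape and the plain RuntimeError are assumptions for illustration; the real client raises a ConflictError):

```python
def index_with_concurrency_check(doc, body, if_seq_no=None, if_primary_term=None):
    """Apply a write only if the stored _seq_no / _primary_term match
    the caller's expectation, mimicking if_seq_no / if_primary_term."""
    if if_seq_no is not None and doc["_seq_no"] != if_seq_no:
        raise RuntimeError("version_conflict_engine_exception")
    if if_primary_term is not None and doc["_primary_term"] != if_primary_term:
        raise RuntimeError("version_conflict_engine_exception")
    doc["_source"] = body
    doc["_seq_no"] += 1   # every successful write bumps the sequence number
    return doc
```

A mocked index/update/delete could run this check against its stored metadata before mutating anything, which would let tests exercise conflict handling.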