
senpy's Introduction

[Header image: img/header.png] [Documentation status badge: https://readthedocs.org/projects/senpy/badge/?version=latest]

Senpy lets you create sentiment analysis web services easily and quickly, using a well-known API. As a bonus, Senpy services use semantic vocabularies (e.g. NIF, Marl, Onyx) and formats (Turtle, JSON-LD, RDF/XML).

Have you ever wanted to turn your sentiment analysis algorithms into a service? With Senpy, now you can. It provides all the tools so you just have to worry about improving your algorithms:

See it in action.

Installation

There are three ways to install Senpy: through pip, from source, or as a Docker image.

Through pip

pip install -U --user senpy

Alternatively, you can use the development version:

git clone http://github.com/gsi-upm/senpy
cd senpy
pip install --user .

If you want to install Senpy globally, use sudo instead of the --user flag.

Docker Image

Build the image or use the pre-built one: docker run -ti -p 5000:5000 gsiupm/senpy.

To add custom plugins, add a volume and tell Senpy where to find the plugins: docker run -ti -p 5000:5000 -v <PATH OF PLUGINS>:/plugins gsiupm/senpy -f /plugins

Compatibility

Senpy should run on any major operating system. Its code is pure Python, and the only limitations are imposed by its dependencies (e.g., nltk, pandas).

Currently, the CI/CD pipeline tests the code on:

  • GNU/Linux with Python versions 3.7+ (3.10+ recommended for all plugins)
  • macOS with Homebrew's python3
  • Windows 10 with Chocolatey's python3

The latest PyPI package is verified to install on Ubuntu, Debian and Arch Linux.

If you have trouble installing Senpy on your platform, see Having problems?.

Developing

Running/debugging

This command will run the senpy container using the latest image available, mounting your current folder so you get your latest code:

# Python 3.5
make dev
# Python 2.7
make dev-2.7

Building a docker image

# Python 3.5
make build-3.5
# Python 2.7
make build-2.7

Testing

make test

Running

This command will run the senpy server listening on localhost:5000:

# Python 3.5
make run-3.5
# Python 2.7
make run-2.7

Usage

The easiest and recommended way is to use the command-line tool to load your plugins and launch the server.

senpy

or, alternatively:

python -m senpy

This will create a server with any modules found in the current path. For more options, see the output of senpy --help.
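
Once the server is up, you can also query it from Python with the client bundled in Senpy. The snippet below is a minimal sketch that mirrors the client usage shown in the issues section further down; the endpoint URL and the algorithm name are placeholders to adapt to your deployment:

from senpy.client import Client

# Point the client at a running Senpy instance; this URL matches the
# default localhost:5000 server started above.
c = Client('http://127.0.0.1:5000/api')

# 'sentiment140' is only an example; use an algorithm your server actually exposes.
r = c.analyse('Senpy is awesome', algorithm='sentiment140')
print(r)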

Alternatively, you can use the modules included in Senpy to build your own application.

Deploying on Heroku

Use a free Heroku instance to share your service with the world. Just use the example Procfile in this repository, or build your own.

For more information, check out the documentation.

Python 2.x compatibility

Keeping compatibility between Python 2.7 and 3.x is not always easy, especially for a framework that deals with both text and web requests. Hence, starting in February 2019, this project will no longer make an effort to support Python 2.7, which will reach its end of life in 2020. Most of the functionality should still work, and the compatibility shims will remain for now, but we cannot make any guarantees at this point. Instead, the maintainers will focus their efforts on keeping the codebase compatible across different Python 3.3+ versions, including upcoming ones. We apologize for the inconvenience.

Having problems?

Please file a new issue on GitHub with enough details to reproduce the bug, including:

  • Operating system
  • Version of Senpy (or docker tag)
  • Installed libraries
  • Relevant logs
  • A simple code example

Acknowledgement

This development has been partially funded by the European Union through the MixedEmotions Project (project number H2020 655632), as part of the RIA ICT 15 Big data and Open Data Innovation and take-up programme.

[MixedEmotions logo] [EU flag: img/eu-flag.jpg]

senpy's People

Contributors

balkian, cif2cif, drevicko, militarpancho, nachocp, oaraque


senpy's Issues

Only sentiment-140 is available

I use senpy like this:
from senpy.client import Client

c = Client('http://latest.senpy.cluster.gsi.dit.upm.es/api')
#c = Client('http://127.0.0.1:5000/api')
r = c.analyse('I am sad', algorithm='sentiText')
print(r)

Then an error is raised: senpy.models.Error: The algorithm 'sentitext' is not valid
Valid algorithms: dict_keys(['split', 'ekman2fsre', 'sentiment140', 'rand', 'emorand', 'ekman2pad'])

But on the test page, algorithms like sentiText and EmoTextANEW are available. How should I use them in my own program?
Thank you!

Public demo

Make sure the public demo is available and that all links to it in the documentation are working.

add all api params as options in the web interface

It'd be good to have the possible API parameters (e.g., 'with_params') as options in the web interface. These also need to be added to the documentation, but putting them in the web interface would be a very good start (:

Deadlock in ShelfPlugin

When a shelf plugin is not properly closed, reusing the shelf file results in an error:

DBPageNotFoundError: (-30986, 'BDB0075 DB_PAGE_NOTFOUND: Requested page not found')

Convert VAD to Ekman

Hello,

Some deep learning models provide VAD values in 3D space.

However, the Ekman model is more intuitive for sharing results with users.

It seems that senpy can perform the conversion between both models.

Could you please point me to the chunks of code performing this conversion?

Alternatively, are you aware of a straightforward approach to perform this conversion?

Best,

Ed
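
One straightforward, if crude, answer to the second question is a nearest-centroid mapping in VAD space: pick the Ekman category whose prototype point is closest to the given (valence, arousal, dominance) triple. The sketch below is generic and is not Senpy's built-in conversion; the prototype coordinates are illustrative placeholders, not values from any published lexicon:

import math

# Illustrative placeholder centroids in (valence, arousal, dominance) space,
# scaled to [0, 1]. Replace them with values from a lexicon or model you trust.
EKMAN_PROTOTYPES = {
    'joy':      (0.9, 0.7, 0.7),
    'sadness':  (0.2, 0.3, 0.3),
    'anger':    (0.2, 0.8, 0.7),
    'fear':     (0.2, 0.8, 0.3),
    'disgust':  (0.2, 0.5, 0.5),
    'surprise': (0.6, 0.9, 0.5),
}

def vad_to_ekman(valence, arousal, dominance):
    """Return the Ekman category whose prototype is closest to the VAD point."""
    point = (valence, arousal, dominance)
    return min(EKMAN_PROTOTYPES,
               key=lambda label: math.dist(point, EKMAN_PROTOTYPES[label]))

# A low-valence, high-arousal, low-dominance point maps to 'fear' with these placeholders.
print(vad_to_ekman(0.25, 0.85, 0.3))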

Improve threading

We are using gevent and some plugins are blocking on I/O without monkey patching. That means some operations are actually blocking the whole service.
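
For reference, the usual gevent remedy is to monkey-patch the standard library before anything that performs I/O is imported, so blocking calls in plugins yield to the event loop instead of stalling the whole service. This is a generic gevent sketch, not a description of how Senpy currently bootstraps:

# Generic gevent pattern: patch the standard library as early as possible,
# before importing any module that does network or file I/O.
from gevent import monkey
monkey.patch_all()

import requests  # now uses cooperative sockets instead of blocking ones

def fetch(url):
    # This call no longer blocks the event loop; other greenlets keep running.
    return requests.get(url, timeout=10).status_code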

Logo of plugin author in the senpy web interface

It would be good for plugin developers to be able to add their own logos to the web interface.

It's easy enough to change the logo file when deploying, but then the gsi-upm logo would only appear embedded in the senpy logo in the header, and the logo file name would still be gsi.png (i.e., not the name of the plugin author).

broken shelf file causes unloadable plugin

As far as I can see, if your plugin fails to pickle its shelf properly (e.g., this issue), it then becomes impossible to load the plugin, as an EOF exception similar to this one is thrown:

ERROR:senpy.extensions:Error activating plugin emotion-wnaffect -  : 
	Traceback (most recent call last):
  File "/Users/drevicko/anaconda/envs/python2/lib/python2.7/site-packages/senpy/extensions.py", line 290, in act
    plugin.activate()
  File "./emotion-wnaffect.py", line 73, in activate
    if 'total_synsets' not in self.sh:
  File "/Users/drevicko/anaconda/envs/python2/lib/python2.7/site-packages/senpy/plugins/__init__.py", line 99, in sh
    self.__dict__['_sh'] = pickle.load(open(self.shelf_file, 'rb'))
  File "/Users/drevicko/anaconda/envs/python2/lib/python2.7/pickle.py", line 1384, in load
    return Unpickler(file).load()
  File "/Users/drevicko/anaconda/envs/python2/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/Users/drevicko/anaconda/envs/python2/lib/python2.7/pickle.py", line 886, in load_eof
    raise EOFError
EOFError

It would seem a good idea to issue a warning and fall back on starting the plugin from scratch.
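
A possible fix along those lines, sketched against the pickle-based loading shown in the traceback (the file handling mirrors the trace, but this is not the actual Senpy code):

import logging
import pickle

logger = logging.getLogger(__name__)

def load_shelf(shelf_file):
    """Load a pickled shelf, falling back to an empty dict if the file is corrupt."""
    try:
        with open(shelf_file, 'rb') as f:
            return pickle.load(f)
    except (EOFError, pickle.UnpicklingError) as ex:
        # Warn and start from scratch instead of making the plugin unloadable.
        logger.warning('Corrupt shelf file %s (%s); starting with an empty shelf',
                       shelf_file, ex)
        return {}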

Add custom data to existing models

Can anyone please tell me whether custom data can be built on top of existing Senpy models, or whether an existing model can be expanded?

Is there any help document available for this?

add description of async:false for .senpy file to docs

We have a plugin that uses Theano, which refuses to import inside a thread (it's a gevent thing).

I see that senpy has a mechanism for loading plugins synchronously, but there doesn't seem to be a way to use it at the moment, as the only place activate_all() is called is in __main__.py and cli.py, where sync is left at its default False value or set to False.

We can change that to get things working in our case, but is there an already-implemented way to force a plugin to load synchronously that I didn't find?
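
For context, forcing synchronous activation when embedding Senpy might look roughly like the sketch below. The Senpy class lives in the extensions module referenced in the tracebacks above, and activate_all() is the call mentioned in this issue; the constructor arguments are assumptions, so double-check them against the documentation:

from flask import Flask
from senpy.extensions import Senpy  # module referenced in the tracebacks above

app = Flask(__name__)

# The keyword argument below is an assumption for illustration; check the
# documentation for the exact constructor signature.
sp = Senpy(app, plugin_folder='.')

# Passing sync=True (instead of the default False used by __main__.py and cli.py)
# should activate plugins in the calling thread, which is what Theano needs.
sp.activate_all(sync=True)

app.run(port=5000)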

Python 2 code hit when running in Python 3

Collecting pattern
Using cached pattern-2.6.zip
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "", line 1, in
File "/tmp/pip-build-losdgxhb/pattern/setup.py", line 40
print n
^
SyntaxError: Missing parentheses in call to 'print'

Error when running on Windows

I am trying to run senpy on a Windows machine and I am getting the following error.

The plugins are not getting loaded.

Please guide me.

ERROR:flask.app:Exception on /api/plugins/default/ [GET]
Traceback (most recent call last):
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\_compat.py", line 35, in reraise
    raise value
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\Administrator\AppData\Roaming\Python\Python36\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python36\lib\site-packages\senpy\blueprints.py", line 109, in decorated_function
    if 'parameters' in response and not params['with_parameters']:
TypeError: argument of type 'NoneType' is not iterable

plugins append all instance variables to response...

This seems like a good idea at first, but we just built a plugin that has a vocabulary as an instance variable - you can imagine how big the response is!

Two ideas:

  • provide a different mechanism for populating the 'analysis' part of the response
  • provide guidelines on where/how to store instance/class variables in the docs

For now, can you recommend how we should proceed?
Thanks!
