
botogram's People

Contributors

alcc01, ceri01, dependabot[bot], hearot, knsd, marcobuster, matteob99, oshadow05, paolobarbolini, patternspandemic, pietroalbini, possebon, renyhp, teopost, thesharp, tsculpt


botogram's Issues

Cannot init shared memories

Hi Pietro,

On the master branch, I'm running into this TypeError and cannot seem to use shared memory initializers. Any ideas?

import botogram
bot = botogram.create('<API_TOKEN>')

@bot.init_shared_memory
def test_shared_memory(shared):
    shared["test"] = "memory"

if __name__ == "__main__":
    bot.run()

Traceback (most recent call last):
  File "..../botogram/runner/processes.py", line 54, in run
    self.loop()
  File "..../botogram/runner/processes.py", line 158, in loop
    job.process(self.bots)
  File "..../botogram/runner/jobs.py", line 87, in process
    return self.func(bot, self.metadata)
  File "..../botogram/runner/jobs.py", line 96, in process_update
    bot.process(update)
  File "..../botogram/frozenbot.py", line 183, in process
    result = hook.call(self, update)
  File "..../botogram/hooks.py", line 44, in call
    return self._call(bot, update)
  File "..../botogram/hooks.py", line 175, in _call
    bot._call(self.func, chat=message.chat, message=message, args=args)
  File "..../botogram/frozenbot.py", line 227, in _call
    shared = self._shared_memory.of(self._bot_id, component)
  File "..../botogram/shared.py", line 128, in of
    self.apply_inits(component, memory)
  File "..../botogram/shared.py", line 140, in apply_inits
    init(memory)
TypeError: 'SharedMemoryInitializerHook' object is not callable

Thanks, Brad

Add ability to edit messages

The recent API change allows bots to edit messages they have already sent. botogram should gain new methods to edit messages as well.

Because of that, all the send* methods should also return the instance of the message you send.

message = chat.send("This message will destruct itself in 1 second")
time.sleep(1)
message.edit("*BOOM*")
message = chat.send_photo("myphoto.png", caption="A caption")
message.edit_caption("I'm changed!")

Add support for downloading files

A while ago Telegram introduced an API to download files sent to the bot, and botogram should implement it. Proposed API:

file_obj.save("test.txt")

  • Implement support for downloading files
  • Fix the ugly photos API
  • Add things to the documentation
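
For illustration, here is a minimal sketch of how the proposed API could look from inside a handler; message.document and the target path are assumptions for the example, not confirmed botogram API:

@bot.process_message
def save_documents(chat, message):
    # Hypothetical: assumes incoming documents expose the proposed file API
    if message.document is not None:
        message.document.save("/tmp/received_file")
        chat.send("Saved your file!")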

Documentation changes wishlist

This issue contains a list of the changes I want to make to the documentation:

  • Add an installation guide for Windows (issue #18)
  • Add dependencies installation instructions on the installation chapter
  • Rename the tutorial to quickstart
  • Add a proper tutorial
  • Add a narrative chapter about creating your bot
  • Add a narrative chapter about a bot's structure
  • Add a narrative chapter about sending messages
  • Add a narrative chapter about commands
  • Add a narrative chapter about localization (issue #46)
  • Add code example to API reference

This list may grow.

Implement shared memory

Because the botogram runner works in a multiprocessing environment, bot developers don't have any native way (other than external services) to share data between the different workers.

Shared memory provides a simple and reliable way to do that. The API is simple, and allows synchronization of any picklable object. It will also implement a way to initialize shared memories.

@bot.init_shared_memory
def init_shared(shared):
    shared["messages"] = 0

@botogram.pass_shared
@bot.process_message
def process_message(shared, chat, message):
    shared["messages"] += 1
  • Implement the basic structure of the shared memory (not in the runner)
  • Implement shared memory in the runner
  • Fix shared memory not working on Windows (issue #3)
  • Implement init_shared_memory
  • Document the whole thing

This issue tracks the progress for this feature.
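
To illustrate the intended flow, here is a minimal sketch of a hypothetical /stats command built on the API above, reading the counter set up by init_shared:

@botogram.pass_shared
@bot.command("stats")
def stats_command(shared, chat, message, args):
    """Show how many messages were processed so far"""
    # Reads the counter prepared by init_shared and updated by process_message
    chat.send("Messages processed so far: %s" % shared["messages"])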

Add support for tracking conversations

Conversation tracking is the way I plan to implement proper support for keyboards (as a layer on top of them) and forced replies.

@bot.command("support")
def support_command(conversation, chat, message, args):
    """Ask for support"""
    conversation.start("support")
    chat.send("Hey, what's your problem?", attach=conversation)

@bot.conversation("support")
def bug_report_conversation(conversation, chat, message):
    # First step is asking the user which problem he has
    if "problem" not in conversation.data:
        conversation.data["problem"] = message.text
        chat.send("OK! To which email address do you wish being contacted?",
                  attach=conversation)
    else:
        problem = conversation.data["problem"]
        email = message.text
        mydb.add_ticket(problem, email)

        chat.send("Thank you for contacting our support!")
        conversation.end()

Let's break this down:

  • The /support command starts the support conversation: from that moment, and until you close the conversation (or it times out), every message detected as part of the conversation will be routed to the relevant function
  • The conversation is attached (the equivalent of the current extra argument) to the message you send to the user, and this acts as a Telegram ForceReply
  • Every message the user sends is then routed to the conversation function, so you can interact with your users more easily
  • Conversations have a dedicated shared memory, unique to each conversation, so you can store any information the user sent you (it will be deleted after you end the conversation)
  • Once you're done with the user, you can .end() the conversation.

I think conversations will help you create even more awesome bots, but if you have any ideas please share them here!

Add tests to the runner

The runner should be tested automatically, and tests need to be developed for it. Unfortunately, due to its multiprocess- and requests-heavy nature, it can be quite hard to test properly.

Pickling errors due to Windows' spawn multiprocessing start method

Greetings Pietro,

Fantastic work on this framework! The best I've found. I am thrilled with your most recent inclusion of components.

I have found the framework breaks on Windows with Python 3.4 due to both the bot's logging and GNUTranslations objects being non-serializable to spawned processes. I have gotten around these pickling issues by rebuilding the objects on the processes after they've started. In the case of the logger, I will likely get garbled/intermixed messages if I don't implement a queue to handle concurrent logging events. I am not very knowledgeable regarding the best practices in dealing with these concurrency issues and differing process start methods between platforms. Perhaps you are able to provide a more elegant solution to this? I have learned quite a bit reading through your code!

Thank you.
Brad

By the way, a feature I plan to integrate into my bot relies on a scheduler for messaging. Perhaps this can be useful to the framework overall? It would basically be extending the runner to include a process that uses APScheduler or the like. I could see a scheduling piece being integrated into components as well. Cheers.
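
For reference, a minimal standalone sketch of the kind of scheduling described above, using APScheduler's BackgroundScheduler. This is independent of botogram's runner, and the chat handle is assumed to come from elsewhere:

from apscheduler.schedulers.background import BackgroundScheduler

def send_reminder(chat):
    # `chat` is any object with a send() method, e.g. a botogram chat
    chat.send("Scheduled reminder!")

def start_scheduler(chat):
    scheduler = BackgroundScheduler()
    # Run send_reminder every 30 minutes, passing the chat handle along
    scheduler.add_job(send_reminder, "interval", minutes=30, args=[chat])
    scheduler.start()
    return scheduler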

Empty items in command's arguments

If you send a command to your bot, and you accidentally put double spaces in the arguments, an empty item appears in the arguments list:

# The message is "/mycommand a b  c"
args = ["a", "b", "", "c"]

There should be no empty items in the args list.
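
A quick illustration of the difference, runnable on its own; str.split() with no separator collapses runs of whitespace, which is likely the behaviour wanted here:

text = "/mycommand a b  c"

broken = text.split(" ")[1:]   # ["a", "b", "", "c"]: empty item from the double space
args = text.split()[1:]        # ["a", "b", "c"]: whitespace runs are collapsed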

Add support for venues

The recent API updates added support for venues, so you can share exact street addresses. Support for them in the Message object, and for sending them, should be added.
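
As a sketch only, here is what a send_venue call mirroring the fields of Telegram's sendVenue method could look like; the method name and signature are assumptions, not an implemented botogram API:

# Hypothetical API, mirroring Telegram's sendVenue fields
chat.send_venue(
    latitude=41.8902,
    longitude=12.4922,
    title="Colosseum",
    address="Piazza del Colosseo, 1, Roma",
)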

Rename init_shared_memory to prepare_memory

I don't much like the @bot.init_shared_memory/add_shared_memory_initializer names: they're too long and a bit ugly. I think the following naming is better:

  • @bot.init_shared_memory → @bot.prepare_memory
  • self.add_shared_memory_initializer → self.add_memory_preparer

This is a breaking change, but the old syntax will continue to work until 1.0.
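
A before/after sketch of the proposed rename (the new names are still a proposal at this point):

# Old names, deprecated until 1.0
@bot.init_shared_memory
def init_shared(shared):
    shared["messages"] = 0

# Proposed new names
@bot.prepare_memory
def init_shared(shared):
    shared["messages"] = 0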

Rework shared memory

The current status of shared memory is a real mess, and I really need to solve it:

First of all, you can't easily use sub-dictionaries because they're not synchronized, and you need to do something like this, which is really ugly:

mydata = shared["mydata"]
mydata["test"] = "abc"
shared["mydata"] = mydata

Other than that, shared memory is directly available via the shared argument of any hook: this might seem convenient, but it prevents adding more shared things, like a global memory (as requested in #22) and locks, which are currently implemented by adding methods at runtime.

In order to solve these problems, and to allow creating custom drivers more easily, a big rewrite of the whole thing is needed. The new API I want for it is:

# For a component's memory
shared.memory["mydata"] = shared.object("dict")
shared.memory["mydata"]["test"] = "yay"
with shared.lock("hello"):
    pass

# A custom bucket
bucket = bot.shared.bucket("test")
bucket.memory["mydata"] = "test

  • Move the functionality of shared into shared.memory
  • Use a custom proxy instead of the multiprocessing one, so it's easier to write custom drivers and the API is consistent between them
    • Add basic support for objects management
    • Add ability to export/import objects and switch drivers
    • Add ability to add memory preparers
    • Add support for the runner driver
  • Add some sort of garbage collection

Language overhaul

Currently botogram supports internationalization for its default messages, but that feature is only available for a few languages (Italian and English), it's undocumented, and it's really limited (there is no way to override messages without modifying the .po files, rebuilding the package and reinstalling it).

This issue tracks the progress to improve this situation.

  • Add narrative documentation about languages
  • Add some documentation about adding support for new languages
  • Add a way to override language values without recompiling the .po files (issue #47)

Cannot start shared-memory branch on Windows

Hi Pietro.

Master at ba226c seems fine. However I cannot start the shared-memory branch.

When attempting to run a stripped-down bot (just passing the API key), it may run but does not respond (some kind of race condition?). Other times, setting one simple component or the about/owner bot attribute, I get any of the three errors pasted below:

...
\runner\__init__.py", line 85, in _boot_processes
    worker.start()
  File "C:\Python34\Lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\Lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\Lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "C:\Python34\Lib\multiprocessing\connection.py", line 940, in reduce_pipe_connection
    dh = reduction.DupHandle(conn.fileno(), access)
  File "C:\Python34\Lib\multiprocessing\connection.py", line 170, in fileno
    self._check_closed()
  File "C:\Python34\Lib\multiprocessing\connection.py", line 136, in _check_closed
    raise OSError("handle is closed")
OSError: handle is closed
...
\runner\__init__.py", line 85, in _boot_processes
    worker.start()
  File "C:\Python34\Lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\Lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\Lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <class 'weakref'>: attribute lookup weakref on builtins failed
...
\runner\__init__.py", line 85, in _boot_processes
    worker.start()
  File "C:\Python34\Lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\Lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\Lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\Lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function BaseManager._finalize_manager at 0x03214F18>: attribute lookup _finalize_manager on multiprocessing.managers failed

Hope this helps,
Brad
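
For context, the usual Python workaround for objects holding an unpicklable attribute (such as a logger) under the spawn start method is to drop it from the pickled state and rebuild it in the child process. A generic sketch of that technique, not botogram's actual code:

import logging

class WorkerState:
    def __init__(self, name):
        self.name = name
        self.logger = logging.getLogger(name)

    def __getstate__(self):
        # Drop the logger before pickling: it may not survive spawn on Windows
        state = self.__dict__.copy()
        del state["logger"]
        return state

    def __setstate__(self, state):
        # Rebuild the logger in the child process after unpickling
        self.__dict__.update(state)
        self.logger = logging.getLogger(self.name)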

Add ability to send contacts

The recent Bots API bulk update added support for sending contacts to users. The usual send_contact methods should be implemented.

Can't save stickers

There is currently no way to save a sticker to disk, but that's a documented feature, and something I want implemented anyway.

Change the API for hiding commands

Based on some feedback received from a user, the API for hiding commands from /help is not that great. Currently you need to define a normal command and then append its name to the bot.hide_commands list. This works, but it would be nicer to hide the command directly in its declaration, like so:

@bot.command("hello", hidden=True)
def hello_command(chat, message, args):
    pass

Current way:

@bot.command("hello")
def hello_command(chat, message, args):
    pass

bot.hide_commands.append("hello")

Add a way to override language strings without recompiling .po files

Currently, the only way to override translation strings is to edit the .po files, rebuild those, and reinstall botogram with the rebuilt .mo files. That's a really bad UX, and it might cause problems with multiple bots.

The proposed API is:

bot.override_i18n = {
    "No description available": "Hey, that's empty!",
}

Meta issue: #46

Implement keyboards support

Keyboards are one of the best parts of the Telegram bots API, but they are nasty to implement in a generic and reliable way, because botogram needs to call the right hook when someone replies, filter all the incoming requests for actual replies, and maintain an internal registry of which keyboards are active at the moment.

@bot.keyboard(permanent=True)
def my_keyboard(chat, message, user, answer):
    chat.send("You replied with %s" % answer)

@bot.command("ask")
def ask_command(chat, message, args):
    kb = my_keyboard.prepare()
    kb.row("Yes", "Not")
    chat.send("Hey @pietroalbini, are you OK today?", extra=kb)

This issue tracks the progress on this feature.

Remove deprecated features

Before the release of botogram 1.0, there are some deprecated features which need to be removed:

  • botogram.decorators.pass_bot
  • botogram.decorators.pass_shared
  • botogram.components.Component.add_shared_memory_initializer (issue #24)
  • botogram.frozenbot.FrozenBot.init_shared_memory (issue #24)
  • botogram.bot.Bot.init_shared_memory (issue #24)
  • botogram.objects.Message.from_ (issue #44)
  • botogram.objects.Message.new_chat_participant (issue #60)
  • botogram.objects.Message.left_chat_participant (issue #60)
  • botogram.bot.Bot.hide_commands (issue #19)
  • botogram.bot.Bot.send (issue #65)
  • botogram.bot.Bot.send_photo (issue #65)
  • botogram.bot.Bot.send_audio (issue #65)
  • botogram.bot.Bot.send_voice (issue #65)
  • botogram.bot.Bot.send_video (issue #65)
  • botogram.bot.Bot.send_file (issue #65)
  • botogram.bot.Bot.send_location (issue #65)
  • botogram.bot.Bot.send_sticker (issue #65)

Support incoming webhooks

A feature request for a small built-in webserver that can register routes for various incoming webhooks. A common use case could be announcing GitHub commits/issues/PRs or Travis CI build results to a chat.
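
A minimal sketch of the idea using Flask (which is not part of botogram); notify_chat below is a hypothetical helper standing in for whatever forwards text to a chat:

from flask import Flask, request

app = Flask(__name__)

def notify_chat(text):
    # Placeholder: in a real bot this would send `text` to a chat
    print(text)

@app.route("/webhooks/github", methods=["POST"])
def github_webhook():
    payload = request.get_json(force=True)
    # Announce each pushed commit found in the payload
    for commit in payload.get("commits", []):
        notify_chat("New commit: %s" % commit["message"])
    return "", 204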

Add documentation about all the upstream objects

This is a very boring task, so if anyone wants to help... :-)

  • Add documentation for botogram.User
  • Add documentation for botogram.Chat
  • Add documentation for botogram.Photo
  • Add documentation for botogram.PhotoSize
  • Add documentation for botogram.Audio
  • Add documentation for botogram.Voice
  • Add documentation for botogram.Document
  • Add documentation for botogram.Sticker
  • Add documentation for botogram.Video
  • Add documentation for botogram.Contact
  • Add documentation for botogram.Location
  • Add documentation for botogram.UserProfilePhotos
  • Add documentation for botogram.Message
  • Add documentation for botogram.ReplyKeyboardMarkup
  • Add documentation for botogram.ReplyKeyboardHide
  • Add documentation for botogram.ForceReply
  • Add documentation for botogram.Update

The future of botogram

Don't worry, botogram is not going to go away, this is about extending it!

Currently, botogram is strictly tied to the Telegram bots API. There is no problem with that: I started botogram with Telegram as the only target, and I'm not going to leave it anytime soon. The thing is, the bots landscape is rapidly changing, more and more platforms are adopting some kind of bot API, and I want to avoid locking everything into one single platform, because in the remote eventuality that Telegram turns evil I don't want to see all this work wasted.

The thing I'm thinking about now is turning botogram into a more generic framework for bots, while keeping the same API. Fortunately, most of the logic of botogram is actually detached from the Telegram API, so it's simple to call an abstraction layer instead of the API directly.

I think this is the way to go, in order to embrace more platforms and guarantee a future for botogram even if Telegram slows down. But any opinion is highly appreciated!

Add support for inline queries

Telegram bots now support inline queries, which allow a bot to provide information even if it isn't in that specific group. I plan to implement the feature with the following API:

@bot.inline(cache=60, private=True, paginate=10)
def process_inline(user, query):
    if not query:
        return False  # do not send the replies
    i = 1
    while True:
        # Send an article with `i` as the title and "A number" as the content
        yield botogram.inline_article(str(i), "A number")
        i += 1

Because it's implemented as a generator, it's possible to implement pagination framework-side without processing every item you might send. In the example above, most of the time you will process only the first 10-20 numbers, but the user is able to scroll down as much as they want.

Add support for sending messages to channels

Yesterday Telegram added support for channels in the API, so this should be implemented in botogram.

But I don't want to push changes without the ability to test them, and currently no Telegram client supports marking a bot as administrator of a channel.

UPDATE: now that all the major clients support adding bots as channel administrators, this can be implemented. The Bot.send method should be fairly easy to fix, but I also want to add an API for those who just want to manage the channel.

This is because creating a bot instance just for sending a message to a channel, maybe in a big script, is just a waste of resources. So I designed this API:

import botogram

chan = botogram.channel("@channel_name", "12345678:api_key")
chan.send("Hi there!")

botogram.channel will return an instance of botogram.Chat, so it won't trigger the initialization of the bot instance.

  • Design the API
  • Wait for Telegram clients to support this
  • Do the actual implementation
  • Write a chapter in the documentation

Newline handling in component's command arguments

I wrote this test component:

import botogram
import logbook

_logger_configured = False


class NewlineDebug(botogram.Component):
    component_name = "newline_debug"

    def __init__(self):
        self.logger = logbook.Logger("newline_debug")
        self.add_command("debug", self.debug)

    def debug(self, chat, message, args):
        """Send message to logger to catch a newline bug"""
        self.logger.info(message)

When there's a newline in args, nothing gets sent to the log. At first I typed /debug oneline. Then I typed

/debug message with newline

foobar

My logfile only had one message:

Mar 20 21:21:07 t*.t*******.o** run.sh[14184]: 21:21.07 - INFO - <botogram.objects.Message object at 0x7f222c3d19e8>

ChatMixin fails with User objects due to AttributeError

A regression in ChatMixin._get_call_args introduced by 2245bd6 causes an AttributeError: 'User' object has no attribute 'type' when attempting to use any send* method on User objects.

user = botogram.User(
    {'id': 123, 'first_name': 'User'},
    api=bot.api
)
user.send('Anything...')

  File "..../botogram/objects/mixins.py", line 33, in _get_call_args
    chat_id = self.username if self.type == "channel" else self.id
AttributeError: 'User' object has no attribute 'type'

Cheers, Brad
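
One possible fix, sketched here only for illustration (not necessarily the patch that was applied), is to fall back gracefully when the object has no type attribute:

# Sketch of a guard in _get_call_args
chat_type = getattr(self, "type", None)
chat_id = self.username if chat_type == "channel" else self.id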

Better handling when someone blocks the bot

If someone blocks your bot, an exception is currently raised. It would be nice to notify your code instead, so that, for example, you can remove the user from your database. New proposed API:

@bot.chat_unavailable
def remove_chat(chat_id):
    # Example code, this can be anything:
    mydb.remove(chat_id)

  • Add code to trigger a dedicated exception when the user chat is blocked
  • Add the hook (with the decorator and the component method)
  • Write down a chapter of the documentation about this

Feedback is welcome.

UPDATE: New proposal. I'm strongly considering renaming this whole feature to chat_unavailable, in order to also deal with group chats from which your bot was kicked, and channels your bot is not an admin of.

Rename Message.from_ to Message.sender

Message.from_ is very ugly with that underscore at the end, and I think renaming it to Message.sender is a great idea. The old syntax will remain valid until botogram 1.0.

Add ability to manage groups

The recent update to the Bots API introduced the ability to manage members of group chats: currently you can kick people from a group and unban them from a supergroup.

The API I want to use:

chat.ban(user_id)
chat.unban(user_id)
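
As a usage sketch only, here is a hypothetical moderation command built on the proposed API; reply_to_message and sender are assumed attribute names in this example:

@bot.command("ban")
def ban_command(chat, message, args):
    """Ban the user whose message you replied to"""
    # Assumes the command is sent as a reply to a message of the offending user
    if message.reply_to_message is not None:
        chat.ban(message.reply_to_message.sender.id)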

The syntax detector screws up with URLs

Currently, if you have underscores in URLs or email addresses, the syntax detector sees them as Markdown, and so Telegram doesn't detect the URL as a URL. The parser should be able to discard URLs and email addresses before checking the syntax.

This is also bad because there is currently no way to disable the detector, as in #27.
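
A rough sketch of the kind of pre-processing that could help, stripping URLs and email addresses before looking for Markdown markers (an assumed approach, not the project's actual parser):

import re

URL_RE = re.compile(r"https?://\S+|www\.\S+")
EMAIL_RE = re.compile(r"\S+@\S+\.\S+")

def looks_like_markdown(text):
    # Discard URLs and email addresses, then check for Markdown markers
    stripped = EMAIL_RE.sub("", URL_RE.sub("", text))
    return any(marker in stripped for marker in ("*", "_", "`", "["))

print(looks_like_markdown("see https://example.com/my_page"))  # False
print(looks_like_markdown("this is *bold*"))                    # True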

/help is currently broken

On current master when I try to invoke /help from my bot, I get this traceback:

Traceback (most recent call last):
  File "/home/serana/env/lib/python3.5/site-packages/botogram/runner/processes.py", line 54, in run
    self.loop()
  File "/home/serana/env/lib/python3.5/site-packages/botogram/runner/processes.py", line 158, in loop
    job.process(self.bots)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/runner/jobs.py", line 87, in process
    return self.func(bot, self.metadata)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/runner/jobs.py", line 96, in process_update
    bot.process(update)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/frozenbot.py", line 181, in process
    result = hook.call(self, update)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/hooks.py", line 45, in call
    return self._call(bot, update)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/hooks.py", line 182, in _call
    message=message, args=args)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/frozenbot.py", line 241, in _call
    return func(**kwargs)
  File "/home/serana/env/lib/python3.5/site-packages/botogram/defaults.py", line 45, in help_command
    commands = bot._get_commands()
  File "/home/serana/env/lib/python3.5/site-packages/botogram/frozenbot.py", line 216, in _get_commands
    return self._commands
AttributeError: 'FrozenBot' object has no attribute '_commands'

Bisect: 99f24a0 is the first bad commit

Request to share selective memories from main component to other components

Hi Pietro,

Might it be possible to allow access to certain memories set directly on the bot's main component from all or some other components? Such is the case when some number of components require the same shared resource like a DB connection. I suppose I could try sub-classing a component with a memory of a DB connection, but I'm not sure the base class' shared memory will translate to its sub-classes.

I think a builtin method that doesn't require sub-classing would be more useful and easier to use anyhow. Perhaps, if nothing else, a simple way to allow a component to access the main_component's shared memory. Thoughts?

Thank you,
Brad

Implement timers

Timers are a way to execute tasks repeatedly, at specified intervals. You can use them to automatically send messages to your users, clean things up, and more. Timers will be executed directly by the runner's workers.

@botogram.pass_bot
@bot.timer(3600)
def an_hour_passed(bot):
    user = 12345  # Your user ID
    bot.send(user, "BONG")

  • Implement basic support, not in the runner (a method which tells what to execute)
  • Implement timers in the runner
  • Document the whole thing

This issue tracks the progress of this feature.

Add support for sending media messages

Currently botogram lacks support for sending media messages via the high-level API. Methods which need to be implemented (a usage sketch follows this list):

  • send_photo
  • send_audio
  • send_voice
  • send_document
  • send_video
  • send_location
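
For illustration, a sketch of the intended high-level calls; the argument names follow the existing send_photo example from the message-editing issue above and are otherwise assumptions:

chat.send_photo("myphoto.png", caption="A caption")
chat.send_audio("song.mp3")
chat.send_location(41.8902, 12.4922)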

No way to disable the automatic syntax detector

Currently, if you don't provide a specific syntax when sending a message, the syntax detector kicks in and does its thing, but there is no way to prevent this if you want plain text. There should be a way to explicitly disable it.
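
One possible design, sketched here as an assumption rather than an existing parameter, is an explicit syntax argument on send:

# Hypothetical opt-out: force plain text so the detector is skipped
chat.send("some_snake_case and *asterisks*, sent verbatim", syntax="plain")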

Automatic syntax detector not working sometimes

If there is something before the part of the message with the syntax, botogram fails to recognize the syntax. For example:

  • *test* results in "test"
  • *test* something results in "test something"
  • a *test* results in a *test* (wrong!)
