
tschellenbach / stream-framework


Stream Framework is a Python library which allows you to build news feeds, activity streams and notification systems using Cassandra and/or Redis. The authors of Stream Framework also provide a cloud service for feed technology:

Home Page: https://getstream.io/

License: Other

Language: Python 100.00%
Topics: activity-stream, cassandra, feed, news, news-feed, big-data, redis, activity-feed

stream-framework's Introduction

Stream Framework


Activity Streams & Newsfeeds

Examples of what you can build

Stream Framework is a Python library which allows you to build activity streams & newsfeeds using Cassandra and/or Redis. If you're not using Python, have a look at Stream, which supports Node, Ruby, PHP, Python, Go, Scala, Java and REST.

Examples of what you can build are:

  • Activity streams such as seen on Github
  • A Twitter style newsfeed
  • A feed like Instagram/Pinterest
  • Facebook style newsfeeds
  • A notification system

(Feeds are also commonly called activity streams, activity feeds, or news streams.)

Stream

Build scalable newsfeeds and activity streams using getstream.io

Stream Framework's authors also offer a web service for building scalable newsfeeds & activity streams at Stream. It allows you to create your feeds by talking to a beautiful and easy-to-use REST API. Clients are available for Node, Ruby, PHP, Python, Go, Scala and Java. The Get Started page explains the API and concepts in a few clicks. It's a lot easier to use, free for up to 3 million feed updates, and saves you the hassle of maintaining Cassandra, Redis, Faye, RabbitMQ and Celery workers.

Background Articles

A lot has been written about the best approaches to building feed-based systems. Here's a collection of some of the talks.

Stream Framework

Installation

Installation through pip is recommended:

$ pip install stream-framework

By default stream-framework does not install the required dependencies for Redis and Cassandra:

Install stream-framework with Redis dependencies

$ pip install stream-framework[redis]

Install stream-framework with Cassandra dependencies

$ pip install stream-framework[cassandra]

Install stream-framework with both Redis and Cassandra dependencies

$ pip install stream-framework[redis,cassandra]

Authors & Contributors

Resources

Example application

We've included a Pinterest-like example application based on Stream Framework.

Tutorials

Using Stream Framework

This quick example will show you how to publish a "Pin" to all your followers. So let's create an activity for the item you just pinned.

import pytz
from django.utils.timezone import make_naive
from stream_framework.activity import Activity

# PinVerb is your custom Verb subclass, registered with stream-framework


def create_activity(pin):
    activity = Activity(
        pin.user_id,
        PinVerb,
        pin.id,
        pin.influencer_id,
        time=make_naive(pin.created_at, pytz.utc),
        extra_context=dict(item_id=pin.item_id)
    )
    return activity

Next up we want to start publishing this activity on several feeds. First of all, we want to insert it into your personal feed, and then into your followers' feeds. Let's start by defining these feeds.

from stream_framework.feeds.redis import RedisFeed


class PinFeed(RedisFeed):
    key_format = 'feed:normal:%(user_id)s'


class UserPinFeed(PinFeed):
    key_format = 'feed:user:%(user_id)s'

Writing to these feeds is very simple. For instance, to write to the feed of user 13 one would do:

feed = UserPinFeed(13)
feed.add(activity)

But we don't want to publish to just one user's feed; we want to publish to the feeds of all the users who follow you. This action is called a "fanout" and is abstracted away in the manager class. We need to subclass the Manager class and tell it how to figure out which users follow us.

from stream_framework.feed_managers.base import FanoutPriority, Manager


class PinManager(Manager):
    feed_classes = dict(
        normal=PinFeed,
    )
    user_feed_class = UserPinFeed

    def add_pin(self, pin):
        activity = pin.create_activity()
        # add_user_activity adds it to the user feed and starts the fanout
        self.add_user_activity(pin.user_id, activity)

    def get_user_follower_ids(self, user_id):
        # Follow is your own model tracking who follows whom
        ids = Follow.objects.filter(target=user_id).values_list('user_id', flat=True)
        return {FanoutPriority.HIGH: ids}

manager = PinManager()

Now that the manager class is set up, broadcasting a pin becomes as easy as:

manager.add_pin(pin)

Calling this method will insert the pin into your personal feed and into the feeds of all users who follow you. It does so by spawning many small tasks via Celery. In Django (or any other framework) you can now show the user's feed.

# django example

from django.contrib.auth.decorators import login_required
from django.shortcuts import render_to_response
from django.template import RequestContext


@login_required
def feed(request):
    '''
    Items pinned by the people you follow
    '''
    context = RequestContext(request)
    feed = manager.get_feeds(request.user.id)['normal']
    activities = list(feed[:25])
    context['activities'] = activities
    response = render_to_response('core/feed.html', context)
    return response

This example only briefly covers how Stream Framework works. The full explanation can be found in the documentation.

Features

Stream Framework uses Celery and Redis/Cassandra to build a system with heavy writes and extremely light reads. It features:

  • Asynchronous tasks (All the heavy lifting happens in the background, your users don't wait for it)
  • Reusable components (You will need to make tradeoffs based on your use cases, Stream Framework doesn't get in your way)
  • Full Cassandra and Redis support
  • The Cassandra storage uses the new CQL3 and Python-Driver packages, which give you access to the latest Cassandra features.
  • Built for the extremely performant Cassandra 2.1. Versions 2.2 and 3.3 also pass the test suite, but we have no production experience with them.

stream-framework's People

Contributors

adamn avatar ammsa avatar anislav avatar ashwinrajeev avatar dkingman avatar dmexs avatar ernestofgonzalez avatar ferhatelmas avatar gumuz avatar ivanchenkodmitry avatar izhan avatar jeltef avatar julienpalard avatar kenhoff avatar magnusknutas avatar orf avatar pterk avatar reneklacan avatar saltduck avatar tbarbugli avatar timgates42 avatar tschellenbach avatar


stream-framework's Issues

python-statsd 'module' object has no attribute 'Connection'

When I tried to run my Django project using Feedly (installed with pip v1.4.1) it reported an ImportError caused by statsd, so I installed statsd as well as python-statsd using pip. Now it gives me an "AttributeError: 'module' object has no attribute 'Connection'" on:

feedly/metrics/python_statsd.py", line 22, in __init__
    statsd.Connection.set_defaults(host=host, port=port)

What can I do to fix this? I'm using Django 1.6.1 and Python 2.7.4.

Thanks!
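This error commonly appears when both the `statsd` and `python-statsd` packages are installed: both provide a module named `statsd`, but only python-statsd exposes `statsd.Connection`. A possible cleanup, assuming python-statsd is the package Feedly expects:

```shell
# Both packages install a module named "statsd"; only python-statsd
# provides statsd.Connection, so remove the conflicting package first.
pip uninstall -y statsd
pip install python-statsd
# sanity check: this should no longer raise AttributeError
python -c "import statsd; statsd.Connection.set_defaults(host='localhost', port=8125)"
```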

UUIDs as object ids (object_id is too long?)

Hi

I've developed a feed using Feedly; so far everything is very nice :) The problem I have now is that I changed the IDs in my tests to string UUIDs and now all tests fail. I realized that Feedly expects either an integer or an object with an 'id' attribute, so I wrapped the string in an object that has an 'id' set to the int representation of the UUID. The problem is that now Feedly raises this error:

TypeError: Fatal: object_id / verb have too many digits !

Looking at the code, it checks that the id must be less than 10^10. What is the reason for this limit? Is there another way to use UUIDs?

Thanks!
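Since the library accepts either an integer or an object with an `id` attribute, one workaround (a sketch, not a library mechanism) is to keep a compact integer alias per UUID instead of using the 38-digit `int(uuid)` directly. The names here (`FeedObject`, the in-memory registry) are hypothetical; in real code the mapping would live in your database:

```python
import itertools
import uuid

# The serializer packs object_id into a fixed-width decimal field, hence
# the < 10**10 limit. Keep a small integer alias per UUID instead of the
# 38-digit int(uuid). This registry is an in-memory stand-in; persist the
# mapping in your database in real code.
_counter = itertools.count(1)
_uuid_to_int = {}


class FeedObject(object):
    """Hypothetical wrapper exposing an integer .id the serializer accepts."""

    def __init__(self, object_uuid):
        if object_uuid not in _uuid_to_int:
            _uuid_to_int[object_uuid] = next(_counter)
        self.id = _uuid_to_int[object_uuid]
        self.uuid = object_uuid


obj = FeedObject(uuid.uuid4())
assert obj.id < 10 ** 10  # small enough for the object_id field
```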

Change activity time

Hello again,

I have one small question regarding Feedly. I am using Feedly in the following way:

  • I have users which are grouped in a group
  • Users can share from group to another group they are part of

I have partitioned the IDs among the groups and users, so IDs from 1 to 100000000 are considered regular users and all IDs from 100000000 to 500000000 are considered groups.

When there is new content for group A, I put the activity into the user feed of group A's ID and fan out to all members of group A. When a user in group A shares the content to group B, I do the same: I put the activity into the user feed of group B and fan out to all users of group B.

The problem is that when this happens, the shared activity is not put at the top of group B's members' feeds but wherever its original time places it. Is there a preferred way to achieve this? I am afraid that if I create a completely new activity I would end up with duplicates: say user 1 is in both group A and group B; during the fanout of group B he would get a new activity for the original content, which is not correct.

Please advise, and thanks for the great piece of code, I really appreciate it.

Regards,

Jorge

NotificationFeed max_length

Hi,
could you please tell me what the max_length variable is for? Is it the maximum number of activities a user can have in his notification feed?

Thanks!

Preventing double notifications and marking notifications as unseen

I have the following use case: two parties are having a chat (target). The Feedly notification feed is used to notify the user when the other party (object) has sent (verb) a new chat message. The receiving party is offline, so we'll add a notification "new messages in this chat."

However, how can I prevent double notifications in the notification feed, so that:

  • if the target, object and verb are the same (though the time might differ)
  • the notification is not added to the notification feed again, but the latest (target, object, verb) notification is bumped up and marked as unseen?
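One way to get this behaviour is through the aggregator: if the group key is built from (verb, object, target) while ignoring time, a repeated notification merges into the existing aggregate instead of creating a new one. Below is a self-contained sketch of just the grouping idea; in real code this logic would live in a custom aggregator's group method, and `Activity` here is a stand-in, not Feedly's class:

```python
from collections import namedtuple

# Stand-in for the real activity object, just for illustration.
Activity = namedtuple('Activity', 'verb object_id target_id time')


def notification_group(activity):
    # Ignore activity.time on purpose: only (verb, object, target)
    # identity matters, so repeats collapse into one notification row.
    return '%s-%s-%s' % (activity.verb, activity.object_id, activity.target_id)


first = Activity('message', 42, 7, '2014-01-01T10:00')
repeat = Activity('message', 42, 7, '2014-01-01T11:00')  # same chat, later
assert notification_group(first) == notification_group(repeat)
```

Marking the merged aggregate as unseen would then be a matter of resetting its seen state whenever the group is updated.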

Custom AggregatedActivity not working

I have created a CustomAggregatedActivity that extends feedly.activity.AggregatedActivity:

from feedly.activity import AggregatedActivity

class CustomAggregatedActivity(AggregatedActivity):
    @property
    def abc(self):
        pass

And then I used it in my custom aggregator:

class CustomAggregator(BaseAggregator):
    aggregation_class = CustomAggregatedActivity

    def rank(self, aggregated_activities):
        pass  # rank logic here

    def group(self, activity):
        pass  # group logic here

Then I assigned the CustomAggregator in:

class AggregatedUserFeed(RedisAggregatedFeed):
    aggregator_class = CustomAggregator
    key_format = 'feed:aggregated:%(user_id)s'

and finally:

class Newsfeed(Feedly):
    feed_classes = dict(
        normal=NormalUserFeed,
        aggregated=AggregatedUserFeed
    )
    user_feed_class = UserFeed


newsfeed_stream = Newsfeed()

Now when I get the user's aggregated feed:

feed = newsfeed_stream.get_feeds(user_id)['aggregated']
aggregated_activities = list(feed[:10])
>>> aggregated_activities[0].abc

It says:

AttributeError: 'AggregatedActivity' object has no attribute 'abc'

Can you tell me why this happens?

Thanks!
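A likely cause, worth verifying against your Feedly version: the aggregated feed rebuilds stored data using its own aggregated_activity_class attribute, which still defaults to the base AggregatedActivity; setting aggregation_class on the aggregator only affects how activities are grouped on write. A sketch of the fix, reusing the class names from the snippets above:

```python
class AggregatedUserFeed(RedisAggregatedFeed):
    aggregator_class = CustomAggregator
    # Tell the feed (and thus the serializer) to rebuild aggregates with
    # your subclass instead of the base AggregatedActivity:
    aggregated_activity_class = CustomAggregatedActivity
    key_format = 'feed:aggregated:%(user_id)s'
```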

entity module is missing?

Hi

I tried Feedly but it is missing the "entity" module.

I've found no documentation about it on the internet.

Cheers

Timeline reversed

Not sure how or why, but I get the timeline of the user's normal feed reversed:

Below is the feed for the user with id 2; the first record has ID 1000000074, while the smallest is actually 1000000064.

redis 127.0.0.1:6379> zrevrange feed:normal:2 0 100

  1. "13907862970001000000074005"
  2. "13907862960001000000073005"
  3. "13907862940001000000072005"
  4. "13907862930001000000071005"
  5. "13907862920001000000070005"
  6. "13907862910001000000069005"
  7. "13907862900001000000068005"
  8. "13907862890001000000067005"
  9. "13907862880001000000066005"
  10. "13907862870001000000065005"
  11. "13907862850001000000064005"

I am using the pinterest example as a base, here is my feedly setup:

from feedly.aggregators.base import RecentVerbAggregator
from feedly.feeds.redis import RedisFeed
from feedly.feeds.aggregated_feed.redis import RedisAggregatedFeed


class MMSFeed(RedisFeed):
    key_format = "feed:normal:%(user_id)s"


class AggregatedMMSFeed(RedisAggregatedFeed):
    aggregator_class = RecentVerbAggregator
    key_format = "feed:aggregated:%(user_id)s"


class UserMMSFeed(MMSFeed):
    key_format = "feed:user:%(user_id)s"

And here:

from feedly.feed_managers.base import Feedly
from feedly.feed_managers.base import FanoutPriority
from gruppu.models import Content, Stream
from gruppu.mms_feed import AggregatedMMSFeed, MMSFeed, \
    UserMMSFeed


class MMSFeedly(Feedly):
    # this example has both a normal feed and an aggregated feed (more like
    # how facebook or wanelo uses feeds)
    feed_classes = dict(
        normal=MMSFeed,
        aggregated=AggregatedMMSFeed
    )
    user_feed_class = UserMMSFeed

    def add_mms(self, pin):
        activity = pin.create_activity()
        # add user activity adds it to the user feed, and starts the fanout
        self.add_user_activity(pin.origstream_id, activity)

    def remove_mms(self, pin, user):
        activity = pin.create_activity()
        if user == pin.origstream.streamOwner:
            # removes the pin from the user's followers feeds
            self.remove_user_activity(pin.origstream_id, activity)
        else:
            for feed_class in self.feed_classes.values():
                user_feed = feed_class(user.id)
                user_feed.remove(activity)

    def get_user_follower_ids(self, activity, user_id):
        author = activity.extra_context['owner_id']
        ids = list(Stream.objects.filter(pk=user_id).values_list('groupers', flat=True))
        ids.append(author)
        print ids
        return {FanoutPriority.HIGH: ids}

feedly = MMSFeedly()

Inconsistent serialization/deserialization (daylight saving bug?)

Hi,
Suddenly almost all my tests were failing in a strange way, and after a lot of debugging I think I discovered a bug in the serialization and deserialization of activities. The problem is that datetime_to_epoch is not the inverse of epoch_to_datetime for ALL dates. Check this out:

datetime_to_epoch(datetime(2014, 3, 20, 11, 43, 20, 0)) returns 1395337400
epoch_to_datetime(float(1395337400)) returns 2014-03-20 12:43:20

After trying some cases by hand, the problem seems to appear after 9 March 2014:

epoch_to_datetime(float(datetime_to_epoch(datetime(2014, 3, 8, 11, 43, 20, 0)))) returns 2014-03-08 11:43:20
epoch_to_datetime(float(datetime_to_epoch(datetime(2014, 3, 9, 11, 43, 20, 0)))) returns 2014-03-08 12:43:20

This behaviour is probably related to daylight saving time.
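The symptom matches converting through local time (e.g. `time.mktime` paired with `datetime.fromtimestamp`), which applies the host's DST rules; 9 March 2014 was the US DST switch. A sketch of a DST-safe pair that are exact inverses for naive datetimes (not Feedly's actual implementation):

```python
import calendar
from datetime import datetime


def datetime_to_epoch(dt):
    # calendar.timegm interprets the time tuple as UTC, ignoring local DST
    return calendar.timegm(dt.timetuple())


def epoch_to_datetime(epoch):
    # utcfromtimestamp is the exact inverse of timegm for naive datetimes
    return datetime.utcfromtimestamp(epoch)


dt = datetime(2014, 3, 9, 11, 43, 20)  # the date that broke above
assert epoch_to_datetime(datetime_to_epoch(dt)) == dt
```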

Force timestamp datetimes to be offset-naive or offset-aware

Python has a problematic history with timezones. You can have two kinds of datetime objects, with or without a timezone, and you cannot compare between the two. Your code will break in funny ways when the two kinds of datetime clash.

I recommend that Feedly take the approach where all timestamps are either offset-naive or offset-aware (e.g. datetime.datetime.now() vs. datetime.datetime.utcnow()) and validate this assumption for all inputs that can persist datetimes.

Below is an example where I set activity.time to a timezone-aware datetime. The activity is stored fine in the feed, but when you try to add another activity it breaks down.

  File "/Users/moo/code/foobar/venv/lib/python2.7/site-packages/feedly/feeds/aggregated_feed/base.py", line 83, in add_many
    current_activities, activities)
  File "/Users/moo/code/foobar/venv/lib/python2.7/site-packages/feedly/aggregators/base.py", line 81, in merge
    new_aggregated.append(activity)
  File "/Users/moo/code/foobar/venv/lib/python2.7/site-packages/feedly/activity.py", line 286, in append
    if self.updated_at is None or activity.time > self.updated_at:
TypeError: can't compare offset-naive and offset-aware datetimes

When updated_at (Feedly internal?) is instantiated it does not have timezone information, thus leading to this error.
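Until the library validates this itself, the application can enforce one convention at the boundary. A sketch that normalises everything to offset-naive UTC before it reaches the feed (the helper name is mine, not Feedly's):

```python
from datetime import datetime, timezone


def make_naive_utc(dt):
    """Normalise any datetime to offset-naive UTC so comparisons never mix."""
    if dt.tzinfo is not None:
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return dt


aware = datetime(2014, 2, 11, 12, 0, tzinfo=timezone.utc)
naive = datetime(2014, 2, 11, 12, 0)
# after normalisation the two representations compare cleanly
assert make_naive_utc(aware) == make_naive_utc(naive)
```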

Feedly and social graph questions

I have the following model for a social picture sharing project:

  • A group holds users
  • A user can be subscribed to multiple groups
  • A user can share items (pictures) from one group to another group

I am using Feedly to hold the feeds of images for users. I have implemented it so that groups are treated the same way as users: for a new picture, the activity user_id equals the id of the group (I had to partition the ids among the groups and real users to prevent collisions). get_user_follower_ids then returns all the users in the group.

The problem is when a subscriber decides to remove the picture. Consider the following scenario:

A picture is shared to group A, user 1 in group A shares it to group B, and user 2 in group B then shares it to group C.

When the original submitter of the picture decides to remove it from the site, I would have to traverse all the followers of every group the picture was shared to.

Is there a preferred way to deal with this? Tommaso suggested using a graph DB to get the followers.

Thanks,

Jorge

Wrong insertion into followers' feeds?

Hi!
I managed to get get_user_feed() to work properly, but when I try to fetch followers' feeds, something strange happens. I think it might be an issue connected to the fanout/insertion into Feedly. I've used the pin_feedly example as a starting point.

Here is some code:
models.py:

Test implementation of feedly:

class WishFeed(RedisFeed):
    key_format = 'feed:normal:%(user_id)s'


class UserWishFeed(WishFeed):
    key_format = 'feed:user:%(user_id)s'


class UserWishFeedly(Feedly):
    feed_classes = dict(
        normal=WishFeed,
    )
    user_feed_class = UserWishFeed

    def add_wish(self, user, activity):
        # add user activity adds it to the user feed, and starts the fanout
        self.add_user_activity(user.id, activity)

    def get_user_follower_ids(self, user_id):
        from email_auth.models import GiftItEndUser
        friends = Friend.objects.filter(Q(from_user=user_id) | Q(to_user=user_id))
        f_ids = []
        if friends.exists():
            for f in friends:
                if f.to_user.pk != user_id:
                    f_ids.append(f.to_user.pk)
                if f.from_user.pk != user_id:
                    f_ids.append(f.from_user.pk)
        ids = GiftItEndUser.objects.filter(id__in=f_ids).values_list('id', flat=True)
        return {FanoutPriority.HIGH: ids}

views.py:

@csrf_exempt
@api_view(['GET'])
@login_required
def friends_wish_feed(request, *args, **kwargs):
    feed = feedly.get_feeds(request.user.id)['normal']
    act_list = []
    for acts in feed[:25]:
        act_list.append(acts)
    json_data = json.dumps(act_list)
    return Response(json_data)

This is how I insert into the feeds:
views.py
...
activity = Activity(wishlist.user,WishVerb,in_prod.id)
feed = UserWishFeed(wishlist.user.id)
feed.add(activity)
feedly.add_wish(wishlist.user, activity)
...

from redis-cli after insertion i will get this:
127.0.0.1:6379> keys *

  1. "global:3"
  2. "global:4"
  3. "feed:user:7"
  4. "global:7"
  5. "global:9"

If I query key 3 ("feed:user:7"):
127.0.0.1:6379> zrange "feed:user:7" 0 1

  1. "13873587370000000000003005"
  2. "13873587410000000000001005"

127.0.0.1:6379> hgetall "global:3"

  1. "13873587470000000000006005"
  2. "7,5,6,0,1387358747,"

Seems like the insertion went fine, except that it did not use the correct key_format?
How can I fix this?

Why does it say "global:" instead of "user:normal:" like I defined in WishFeed(RedisFeed)?

The function friends_wish_feed() returns [] every time.

Help appreciated!! :)

Fanout operation fails

I get this error:

[2014-02-11 01:35:52,162: WARNING/MainProcess] celery@gruppuco ready.
[2014-02-11 21:05:36,083: ERROR/MainProcess] Task feedly.tasks.fanout_operation_hi_priority[4e340e7d-16d7-417d-9441-fc928d3c3d98] raised unexpected: TypeError("add_many() got an unexpected keyword argument 'batch_interface'",)
Traceback (most recent call last):
  File "/usr/site/gruppe/lib/python2.7/site-packages/celery/app/trace.py", line 238, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/site/gruppe/lib/python2.7/site-packages/celery/app/trace.py", line 416, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/site/gruppe/project/feedly/tasks.py", line 17, in fanout_operation_hi_priority
    return fanout_operation(feed_manager, feed_class, user_ids, operation, operation_kwargs)
  File "/usr/site/gruppe/lib/python2.7/site-packages/celery/local.py", line 167, in <lambda>
    __call__ = lambda x, *a, **kw: x._get_current_object()(*a, **kw)
  File "/usr/site/gruppe/lib/python2.7/site-packages/celery/app/trace.py", line 417, in __protected_call__
    return orig(self, *args, **kwargs)
  File "/usr/site/gruppe/lib/python2.7/site-packages/celery/app/task.py", line 419, in __call__
    return self.run(*args, **kwargs)
  File "/usr/site/gruppe/project/feedly/tasks.py", line 11, in fanout_operation
    feed_manager.fanout(user_ids, feed_class, operation, operation_kwargs)
  File "/usr/site/gruppe/project/feedly/feed_managers/base.py", line 341, in fanout
    operation(feed, **operation_kwargs)
  File "/usr/site/gruppe/project/feedly/feed_managers/base.py", line 23, in add_operation
    feed.add_many(activities, batch_interface=batch_interface, trim=trim)
TypeError: add_many() got an unexpected keyword argument 'batch_interface'

I tried one more time with another content and this time the fanout operation went without issues:

[2014-02-11 21:13:24,373: INFO/MainProcess] Received task: feedly.tasks.fanout_operation_hi_priority[3bb58fc6-35d4-4775-a8c0-75b3f1cbf121]
[2014-02-11 21:13:24,374: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0xa28c64c> (args:('feedly.tasks.fanout_operation_hi_priority', '3bb58fc6-35d4-4775-a8c0-75b3f1cbf121', [], {'user_ids': (2, 3), 'operation_kwargs': {'trim': True, 'activities': [Activity(messaged) 100000002 1000000149]}, 'feed_manager': <gruppu.mms_feedly.MMSFeedly object at 0xac7b20c>, 'operation': <function add_operation at 0xaaac56c>, 'feed_class': <class 'gruppu.mms_feed.AggregatedMMSFeed'>}, {'timelimit': (None, None), 'utc': True, u'is_eager': False, 'chord': None, u'group': None, 'args': [], 'retries': 0, u'delivery_info': {u'priority': None, u'redelivered': False, u'routing_key': u'celery', u'exchange': u'celery'}, 'expires': None, u'hostname': 'celery@gruppuco', 'task': 'feedly.tasks.fanout_operation_hi_priority', 'callbacks': None, u'correlation_id': u'3bb58fc6-35d4-4775-a8c0-75b3f1cbf121', 'errbacks': None, u'reply_to': u'692cb976-a501-3e2b-94cf-8b1fa40ebc8c', 'taskset': None, 'kwargs': {'user_ids': (2, 3), 'operation_kwargs': {'trim': True, 'activities': [Activity(messaged) 100000002 1000000149]}, 'feed_manager':... kwargs:{})
[2014-02-11 21:13:24,375: INFO/Worker-2] ============================== starting fanout ==============================
[2014-02-11 21:13:24,375: INFO/Worker-2] starting batch interface for feed <class 'gruppu.mms_feed.AggregatedMMSFeed'>, fanning out to 2 users
[2014-02-11 21:13:24,375: DEBUG/Worker-2] now handling fanout to user 2
[2014-02-11 21:13:24,375: DEBUG/Worker-2] running <gruppu.mms_feed.AggregatedMMSFeed object at 0xac5da4c>.add_many operation for 1 activities batch interface <redis.client.StrictPipeline object at 0xac5db0c> and trim True
[2014-02-11 21:13:24,376: DEBUG/MainProcess] Task accepted: feedly.tasks.fanout_operation_hi_priority[3bb58fc6-35d4-4775-a8c0-75b3f1cbf121] pid:29676
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13921593050001000000206005 from global:7
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13920982030001000000205005 from global:7
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13917558610001000000180005 from global:8
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13917595540001000000186005 from global:2
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13913356710001000000149005 from global:1
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13913360690001000000150005 from global:2
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13913369420001000000151005 from global:1
[2014-02-11 21:13:24,377: DEBUG/Worker-2] getting field 13913376780001000000157005 from global:8
[2014-02-11 21:13:24,378: INFO/Worker-2] reading 20 items took 0.00281310081482
[2014-02-11 21:13:24,379: INFO/MainProcess] Received task: feedly.tasks.fanout_operation_hi_priority[4821964b-c912-43c7-9b6b-19f6c25234f1]
[2014-02-11 21:13:24,380: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0xa28c64c> (args:('feedly.tasks.fanout_operation_hi_priority', '4821964b-c912-43c7-9b6b-19f6c25234f1', [], {'user_ids': (2, 3), 'operation_kwargs': {'trim': True, 'activities': [Activity(messaged) 100000002 1000000149]}, 'feed_manager': <gruppu.mms_feedly.MMSFeedly object at 0xac7bccc>, 'operation': <function add_operation at 0xaaac56c>, 'feed_class': <class 'gruppu.mms_feed.MMSFeed'>}, {'timelimit': (None, None), 'utc': True, u'is_eager': False, 'chord': None, u'group': None, 'args': [], 'retries': 0, u'delivery_info': {u'priority': None, u'redelivered': False, u'routing_key': u'celery', u'exchange': u'celery'}, 'expires': None, u'hostname': 'celery@gruppuco', 'task': 'feedly.tasks.fanout_operation_hi_priority', 'callbacks': None, u'correlation_id': u'4821964b-c912-43c7-9b6b-19f6c25234f1', 'errbacks': None, u'reply_to': u'692cb976-a501-3e2b-94cf-8b1fa40ebc8c', 'taskset': None, 'kwargs': {'user_ids': (2, 3), 'operation_kwargs': {'trim': True, 'activities': [Activity(messaged) 100000002 1000000149]}, 'feed_manager':... kwargs:{})
[2014-02-11 21:13:24,380: INFO/Worker-2] merge took 0.00186204910278
[2014-02-11 21:13:24,380: DEBUG/Worker-2] now updating from diff new: 0 changed: 0 deleted: 0
[2014-02-11 21:13:24,380: INFO/Worker-1] ============================== starting fanout ==============================
[2014-02-11 21:13:24,381: DEBUG/Worker-2] removed 0, added 0 items from feed <gruppu.mms_feed.AggregatedMMSFeed object at 0xac5da4c>
[2014-02-11 21:13:24,381: DEBUG/Worker-2] add many operation took 0.00526094436646 seconds
[2014-02-11 21:13:24,381: DEBUG/Worker-2] now handling fanout to user 3
[2014-02-11 21:13:24,381: INFO/Worker-1] starting batch interface for feed <class 'gruppu.mms_feed.MMSFeed'>, fanning out to 2 users
[2014-02-11 21:13:24,381: DEBUG/Worker-1] now handling fanout to user 2
[2014-02-11 21:13:24,381: DEBUG/Worker-2] running <gruppu.mms_feed.AggregatedMMSFeed object at 0xac2564c>.add_many operation for 1 activities batch interface <redis.client.StrictPipeline object at 0xac5db0c> and trim True
[2014-02-11 21:13:24,381: DEBUG/Worker-1] running <gruppu.mms_feed.MMSFeed object at 0xac5d7ec>.add_many operation for 1 activities batch interface <redis.client.StrictPipeline object at 0xac5d8cc> and trim True
[2014-02-11 21:13:24,381: DEBUG/Worker-1] adding to feed:normal:2 with score_value_chunk (13913356710001000000149005L, 13913356710001000000149005L)
[2014-02-11 21:13:24,381: DEBUG/Worker-2] getting field 13921593050001000000206005 from global:7
[2014-02-11 21:13:24,381: DEBUG/Worker-2] getting field 13920973950001000000203005 from global:8
[2014-02-11 21:13:24,381: DEBUG/Worker-2] getting field 13920978130001000000204005 from global:2
[2014-02-11 21:13:24,382: DEBUG/Worker-2] getting field 13918351650001000000187005 from global:3
[2014-02-11 21:13:24,382: DEBUG/Worker-2] getting field 13913369420001000000151005 from global:1
[2014-02-11 21:13:24,382: DEBUG/MainProcess] Task accepted: feedly.tasks.fanout_operation_hi_priority[4821964b-c912-43c7-9b6b-19f6c25234f1] pid:29675
[2014-02-11 21:13:24,382: DEBUG/Worker-1] add many operation took 0.000946998596191 seconds
[2014-02-11 21:13:24,382: DEBUG/Worker-1] now handling fanout to user 3
[2014-02-11 21:13:24,382: DEBUG/Worker-1] running <gruppu.mms_feed.MMSFeed object at 0xac5d54c>.add_many operation for 1 activities batch interface <redis.client.StrictPipeline object at 0xac5d8cc> and trim True
[2014-02-11 21:13:24,382: DEBUG/Worker-1] adding to feed:normal:3 with score_value_chunk (13913356710001000000149005L, 13913356710001000000149005L)
[2014-02-11 21:13:24,382: DEBUG/Worker-1] add many operation took 0.000346899032593 seconds
[2014-02-11 21:13:24,382: INFO/Worker-1] finished fanout for feed <class 'gruppu.mms_feed.MMSFeed'>
[2014-02-11 21:13:24,383: INFO/MainProcess] Task feedly.tasks.fanout_operation_hi_priority[4821964b-c912-43c7-9b6b-19f6c25234f1] succeeded in 0.00260832696222s: "2 user_ids, <class 'gruppu.mms_feed.MMSFeed'>, <function add_operation at 0xaaac56c> ({'trim': True, 'activities':...
[2014-02-11 21:13:24,383: INFO/Worker-2] reading 20 items took 0.00196599960327
[2014-02-11 21:13:24,383: INFO/Worker-2] merge took 0.000265836715698
[2014-02-11 21:13:24,383: DEBUG/Worker-2] now updating from diff new: 0 changed: 1 deleted: 0
[2014-02-11 21:13:24,383: DEBUG/Worker-2] removing value v35-2014-02-02;;1391336942.0;;1391336942.0;;-1;;-1;;13913369420001000000151005;;0 from feed:aggregated:3
[2014-02-11 21:13:24,384: DEBUG/Worker-2] adding to feed:aggregated:3 with score_value_chunk (1391336942L, 'v35-2014-02-02;;1391336942.0;;1391336942.0;;-1;;-1;;13913369420001000000151005;13913356710001000000149005;;0')
[2014-02-11 21:13:24,384: DEBUG/Worker-2] removed 1, added 1 items from feed <gruppu.mms_feed.AggregatedMMSFeed object at 0xac2564c>
[2014-02-11 21:13:24,384: DEBUG/Worker-2] add many operation took 0.00320100784302 seconds
[2014-02-11 21:13:24,384: INFO/Worker-2] finished fanout for feed <class 'gruppu.mms_feed.AggregatedMMSFeed'>
[2014-02-11 21:13:24,384: INFO/MainProcess] Task feedly.tasks.fanout_operation_hi_priority[3bb58fc6-35d4-4775-a8c0-75b3f1cbf121] succeeded in 0.00985549599864s: "2 user_ids, <class 'gruppu.mms_feed.AggregatedMMSFeed'>, <function add_operation at 0xaaac56c> ({'trim': True, 'activities':...

Maybe this is because of my notification feed:

class MyAggregator(BaseAggregator):
    '''
    Aggregates based on the same verb and same time period
    '''
    def rank(self, aggregated_activities):
        '''
        The ranking logic, for sorting aggregated activities
        '''
        aggregated_activities.sort(key=lambda a: a.updated_at, reverse=True)
        return aggregated_activities

    def get_group(self, activity):
        '''
        Returns a group based on the day and verb
        '''
        verb = activity.verb.id
        date = activity.time.date()
        group = '%s-%s' % (verb, date)
        return group

class MyNotificationFeed(RedisNotificationFeed):
    #: the key format determines where the data gets stored
    key_format = 'feed:notification:%(user_id)s'

    #: the aggregator controls how the activities get aggregated
    aggregator_class = MyAggregator

The notification feed's add_many doesn't have the batch_interface parameter.

    def add_many(self, activities):
        '''
        Similar to the AggregatedActivity.add_many
        The only difference is that it denormalizes a count of unseen activities
        '''

Kind of strange...
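A hedged sketch of how an override could accept and forward that kwarg. The classes below are stand-ins, not feedly's actual implementation, and the `batch_interface` parameter name is taken from the question rather than verified against the base class:

```python
# Illustrative only: dummy classes standing in for feedly's NotificationFeed,
# showing how an add_many override could accept and forward a batch_interface
# kwarg so callers can still batch their writes.

class BaseFeed:
    def add_many(self, activities, batch_interface=None, *args, **kwargs):
        # stand-in for the real storage write; reports whether a batch was used
        return ('stored', len(activities), batch_interface is not None)

class MyNotificationFeed(BaseFeed):
    def add_many(self, activities, *args, batch_interface=None, **kwargs):
        # denormalize the unseen count here, then forward the kwarg
        return super().add_many(activities, *args,
                                batch_interface=batch_interface, **kwargs)

feed = MyNotificationFeed()
print(feed.add_many(['a', 'b'], batch_interface=object()))
# → ('stored', 2, True)
```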

SerializationException: 7

Hello ,

I suddenly got this error and don't know how to fix it:

File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/feedly/serializers/aggregated_activity_serializer.py", line 104, in loads
raise SerializationException(msg)

SerializationException: 7

Is there some way to bypass this?

serialization id mismatch

Hi,

It seems like the serialization ids mismatch by the 'L' suffix, so what could be the cause? Is feedly hitting Redis twice? Here is the stepped trace of the run:

> get_user_feed(customer.user.id)[:10]
[Activity(joined) 7 5]
(Pdb) activity_list = self.activity_storage.get_many(activity_ids)
(Pdb) activity_ids
['13778137560000000000005005']
(Pdb) activity_data
{13778173560000000000005005L: Activity(joined) 7 5}
-> return [activity.get_hydrated(activity_data) for activity in activities]
KeyError: (13778137560000000000005005L,)

Feedly version:
feedly==0.9.3

Thanks

Redis+Celery+feedly setup issue?

Hi again!
I'm working on a django project (more info in issue #16).
I've got my Celery worker up and running, my Redis database seems ok, and most of my setup seems fine. The only problem is that when I try to "fanout" to new feeds, the tasks do not seem to reach the Celery worker (no tasks are added to the worker queue). I know this might be a broker/worker issue, but I haven't found any good indications of what the problem might be :( Everything works fine when CELERY_ALWAYS_EAGER == True, so I don't think there is a problem with my code.

For feedly to work properly with Celery and Redis, the things I need are a running Celery worker, a Redis database set up correctly, and a task broker (Redis or RabbitMQ), right? I think I lack an understanding of how the broker and worker work together with feedly.

Any tips are much appreciated :)
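For reference, the pieces listed above can be wired together with a settings sketch like the following. All values here are assumptions for illustration, not the poster's actual config; Redis serves as both broker and result backend, and feedly's fanout tasks must be importable by the worker process:

```python
# Minimal Celery-over-Redis settings sketch (adjust host/db to your setup).
BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ALWAYS_EAGER = False  # True runs tasks inline and hides broker issues

# Start the worker from the project root, e.g.:
#   celery -A yourproject worker -l info
```

If tasks still never reach the worker, checking the broker queue length (`redis-cli llen celery`) shows whether messages are being enqueued at all.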

Adding to feed:
activity = Activity(wishlist.user, WishVerb, in_prod.id)
feed = UserWishFeed(wishlist.user.id)
feed.insert_activity(activity)
feed.add(activity)
feedly.add_wish(wishlist.user, activity)

A view for fetching friends-feed :
@csrf_exempt
@api_view(['GET'])
@login_required
def friends_wish_feed(request, *args, **kwargs):
    user = GiftItEndUser.objects.get(pk=kwargs['pk'])
    if request.user.id != user.pk:
        content = {"message": "you do not have permission", "status": 0}
        json_data = json.dumps(content)
        return Response(json_data)

    feed = feedly.get_feeds(request.user.id)['normal']
    act_list = []
    activities = list(feed[:25])
    for act in activities:
        act_list.append({'user_id': act.actor_id, 'product_id': act.object_id})
    json_data = json.dumps(act_list)
    return Response(json_data)

Some celery settings:
FEEDLY_NYDUS_CONFIG = {
    'CONNECTIONS': {
        'redis': {
            'engine': 'nydus.db.backends.redis.Redis',
            'router': 'nydus.db.routers.redis.PrefixPartitionRouter',
            'hosts': {
                0: {'prefix': 'default', 'db': 0, 'host': '127.0.0.1', 'port': 6379},
            }
        },
    }
}

BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

Another sorting of feed not based on the time

Hello,

thanks for the great piece of code. I am using feedly to create a feed of content (links, photos, videos, etc.) and all is good. Now I would like to rank the feed and re-sort it based on the number of shares, upvotes, downvotes, etc. I think creating a new feed just for this top content would be fine.

What I have in mind is changing the following code so the scores reflect the ranking instead of the serialization id, as they do right now.

./storage/redis/timeline_storage.py:

def add_to_storage(self, key, activities, batch_interface=None):
    cache = self.get_cache(key)
    # turn it into key value pairs
    scores = map(long, activities.keys())
    score_value_pairs = zip(scores, activities.values())
    result = cache.add_many(score_value_pairs)
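One way to realize this is to derive the sorted-set score from engagement counters before handing the pairs to the cache. The sketch below is self-contained and hypothetical: the weights and the `shares`/`upvotes`/`downvotes` field names are assumptions, not feedly API:

```python
# Sketch: compute a Redis sorted-set score from engagement counters
# instead of the activity's serialization id. Weights and field names
# are illustrative assumptions.

def popularity_score(activity):
    return (activity['shares'] * 3
            + activity['upvotes']
            - activity['downvotes'])

activities = [
    {'id': 1, 'shares': 2, 'upvotes': 10, 'downvotes': 1},
    {'id': 2, 'shares': 5, 'upvotes': 3, 'downvotes': 0},
]

# the (score, value) pairs that add_to_storage would hand to add_many
score_value_pairs = [(popularity_score(a), a['id']) for a in activities]
print(sorted(score_value_pairs, reverse=True))
# → [(18, 2), (15, 1)]
```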

Not thread safe?

Just for testing I have set feed max_length=1000 and merge_max_length=1000. Why? Because duplicate aggregated activities are not an option for me in a newsfeed (on the Facebook newsfeed you will never see a duplicate activity). So I ran a program which reads existing user actions from the database and generates feeds using Celery (async tasks).

I was expecting that max_length == merge_max_length would never create a duplicate aggregated activity, but the program issued so many Celery tasks that in the end there were some duplicate aggregated activities. Why did that happen?

Why is there even a merge_max_length variable? I thought we were storing activities in Redis as key/value pairs where the key is a group, so why is there a need to traverse all aggregated activities to find where the new aggregated activity belongs? A new aggregated activity already has a group defined, so why not do something like we do with a Python dict.get(group)?
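The dict-style lookup described above can be sketched in plain Python. This is not feedly's storage (which keeps aggregates in a Redis sorted set that is not keyed by group, hence the scan), but it shows the O(1) idea:

```python
# Sketch of group-keyed aggregation with a plain dict: each new activity
# finds its aggregate in O(1) by group key instead of scanning existing
# aggregated activities.

def get_group(activity):
    return '%s-%s' % (activity['verb'], activity['date'])

aggregates = {}  # group -> list of activities

def add(activity):
    aggregates.setdefault(get_group(activity), []).append(activity)

add({'verb': 'comment', 'date': '2014-02-11', 'actor': 1})
add({'verb': 'comment', 'date': '2014-02-11', 'actor': 2})
print(len(aggregates['comment-2014-02-11']))  # → 2
```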

[REDIS] remove nydus

Remove nydus and suggest twemproxy as alternative for pipelining and sharding.
We probably need to add some support for hash_tag and add some docs and examples.

cassandra-driver dependency error while installing Feedly

This was the error message I got while doing pip install Feedly and doing pip install "feedly/github/repo"

Currently I have cloned the repo and in setup.py edited

install_requires = [
    'cqlengine==0.8.7',
    'redis>=2.8.0',
    'celery',
]

to

install_requires = [
    'redis>=2.8.0',
    'celery',
]

And installed via python setup.py install. I don't suppose there should be any problems here. Can you suggest any better alternative?

Feed subclassing

If you subclass the feed but leave the base storage classes in place, there is no clear error message (other than "this method is not implemented").

@tbarbugli what do you think?

Influencer in notification activities

I'm trying to wrap my head around this:

    feed = MyNotificationFeed(user_id)
    activity = Activity(
        love.user_id, LoveVerb, love.id, love.influencer_id,
        time=love.created_at, extra_context=dict(entity_id=self.entity_id)
    )
    feed.add(activity)

https://feedly.readthedocs.org/en/latest/notification_systems.html#what-is-a-notification-system

It looks like the Activity class does not have native support for influencer, or am I looking at the wrong class (feedly.activity.Activity)? Has the code changed since the example was written, or am I missing something?

Duplicate activity should update the AggregatedActivity timestamp and rearrange the activity order

Considering group of following activities:

Users C, B and A commented on an Object at 12:35 PM (A was first to comment, B second, and C third).

Now User A commented again on the Object (at 2 PM), but the contains method marked it as a duplicate, so it was skipped. The aggregated activity timestamp is still 12:35 PM, which is wrong; it should now be 2 PM, even though the activity was a duplicate. The AggregatedActivity.activities order should also be rearranged so that User A's activity is now the most recent in that group.

It's logical :P isn't it?
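The requested behaviour can be sketched with a stand-in class. This is not feedly's AggregatedActivity, just an illustration of refreshing the timestamp and reordering on a duplicate instead of skipping it:

```python
# Stand-in for AggregatedActivity: on a duplicate actor, refresh updated_at
# and move that actor's activity to the most-recent position.

class Aggregate:
    def __init__(self):
        self.activities = []   # oldest first, as (actor, time) tuples
        self.updated_at = None

    def add(self, actor, time):
        # drop any earlier activity by the same actor (the "duplicate")
        self.activities = [a for a in self.activities if a[0] != actor]
        self.activities.append((actor, time))
        self.updated_at = time

agg = Aggregate()
agg.add('A', '12:30')
agg.add('B', '12:32')
agg.add('C', '12:35')
agg.add('A', '14:00')  # duplicate actor: bumps timestamp and reorders
print(agg.updated_at, [a for a, _ in agg.activities])
# → 14:00 ['B', 'C', 'A']
```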

Docs on filtering

feed = feed.filter(activity_id__gte=100)

is currently quite hard to find in the docs

Feedly must be installed with pip --pre

pip install feedly
Downloading/unpacking feedly
  Downloading feedly-0.9.32.zip (83kB): 83kB downloaded
  Running setup.py egg_info for package feedly

    no previously-included directories found matching 'pinterest_example'
Downloading/unpacking cqlengine==0.8.71 (from feedly)
  Downloading 3a3127ae3b9cc363f7a01d19bbc46ca08460b8c5 (58kB): 58kB downloaded
  Running setup.py egg_info for package cqlengine

Downloading/unpacking redis>=2.8.0 (from feedly)
  Downloading redis-2.8.0.tar.gz (286kB): 286kB downloaded
  Running setup.py egg_info for package redis

Requirement already satisfied (use --upgrade to upgrade): celery in /home/omer/.virtualenvs/restapi/lib/python2.7/site-packages (from feedly)
Downloading/unpacking cassandra-driver (from cqlengine==0.8.71->feedly)
  Could not find a version that satisfies the requirement cassandra-driver (from cqlengine==0.8.71->feedly) (from versions: 1.0.0b5, 1.0.0b6, 1.0.0b7)
Cleaning up...
No distributions matching the version for cassandra-driver (from cqlengine==0.8.71->feedly)

Since cassandra-driver has a beta versioning scheme, feedly must be installed using pip install --pre feedly

aggregate

Dear,

I tested feedly.
Should the aggregated feed be per user or not?
(It doesn't look like it to me.)

Thanks a lot

Serialization Error / KeyError in get_hydrated (?)

I have an issue similar to the one in issue #8.

I'm trying to make a REST service with this; I use django-rest-framework.

Adding activities to Redis seems to work, but when I try to fetch them back I get this:

KeyError: 13869294460000000000005005L

It seems like the "L" is causing some sort of error?

function calls and stack trace:
wishlists/views.py

feedly = UserWishFeedly()

# function for adding activities:
activity = Activity(wishlist.user, WishVerb, in_prod.id)
feed = UserWishFeed(wishlist.user.id)
feed.add(activity)

# test function for getting the feed
def wish_feed(request, *args, **kwargs):
    feed = feedly.get_user_feed(request.user.id)
    activities = feed[:10]
    content = {"activities": activities}
    json_data = json.dumps(content)
    return Response(json_data)

File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 115, in get_response
response = callback(request, *callback_args, **callback_kwargs)

File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/csrf.py", line 77, in wrapped_view
return view_func(*args, **kwargs)

File "/usr/local/lib/python2.7/dist-packages/django/views/generic/base.py", line 68, in view
return self.dispatch(request, *args, **kwargs)

File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/csrf.py", line 77, in wrapped_view
return view_func(*args, **kwargs)

File "/usr/local/lib/python2.7/dist-packages/rest_framework/views.py", line 327, in dispatch
response = self.handle_exception(exc)

File "/usr/local/lib/python2.7/dist-packages/rest_framework/views.py", line 324, in dispatch
response = handler(request, *args, **kwargs)

File "/usr/local/lib/python2.7/dist-packages/rest_framework/decorators.py", line 49, in handler
return func(*args, **kwargs)

File "/usr/local/lib/python2.7/dist-packages/django/contrib/auth/decorators.py", line 25, in _wrapped_view
return view_func(request, *args, **kwargs)

File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/wishlists/views.py", line 437, in wish_feed
activities = feed[:10]

File "/usr/local/lib/python2.7/dist-packages/feedly/feeds/base.py", line 261, in __getitem__
start, bound)

File "/usr/local/lib/python2.7/dist-packages/feedly/feeds/base.py", line 304, in get_activity_slice
activities = self.hydrate_activities(activities)

File "/usr/local/lib/python2.7/dist-packages/feedly/feeds/base.py", line 285, in hydrate_activities
return [activity.get_hydrated(activity_data) for activity in activities]

File "/usr/local/lib/python2.7/dist-packages/feedly/activity.py", line 41, in get_hydrated
activity = activities[int(self.serialization_id)]

KeyError: 13869294460000000000005005L

If I try redis-cli I get:

127.0.0.1:6379> zrange "feed:user:6" 0 25

  1. "13869294460000000000005005"

I'm running Python 2.7.3 and Django 1.5.4.
I'm kinda stuck with this; hope someone can shed some light on it. Great project btw!!
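For what it's worth, the trailing "L" is just Python 2's repr for a long integer, not part of the key itself. The lookup fails when the string ids read back from Redis are not converted to integers before indexing the hydration dict. A minimal illustration (not feedly's actual fix, just the normalization idea):

```python
# The 'L' suffix is only Python 2's long repr; the KeyError happens when
# ids come back from Redis as strings while the hydration dict is keyed
# by integers. Normalizing with int() makes the lookup succeed.

raw_ids = ["13869294460000000000005005"]            # strings from zrange
activity_data = {13869294460000000000005005: "Activity(joined)"}

hydrated = [activity_data[int(i)] for i in raw_ids]
print(hydrated)  # → ['Activity(joined)']
```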

get_slice_from_storage - NotImplementedError()

Hi,
I'm trying to build a notification system and I want to use feedly for it.
I have created a simple demo to test how the library works. The use case looks like this:
love = Love()
feed = MyNotificationFeedly()
feed.add_love(love)

I have followed this example: https://feedly.readthedocs.org/en/latest/notification_systems.html

However, add_love() throws an exception raised in feedly/storage/base.py:270.

I have checked the code and it seems that get_slice_from_storage() is not implemented. I was just wondering how the notification system works on Fashiolista if such a simple demo is not working for me locally.
Could you please help me?

self.get_user_follower_ids expected a dict but tutorial says to use values_list

AttributeError                            Traceback (most recent call last)
<ipython-input-8-a737537230b1> in <module>()
----> 1 feed.add_user_activity(1, activity)

/home/omer/.virtualenvs/restapi/lib/python2.7/site-packages/feedly/feed_managers/base.pyc in add_user_activity(self, user_id, activity)
    139         operation_kwargs = dict(activities=[activity], trim=True)
    140 
--> 141         for priority_group, follower_ids in self.get_user_follower_ids(user_id=user_id).items():
    142             # create the fanout tasks
    143             for feed_class in self.feed_classes.values():

AttributeError: 'ValuesListQuerySet' object has no attribute 'items'
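The traceback shows why: add_user_activity iterates `.items()` on the return value of get_user_follower_ids, so the override must return a dict mapping a priority group to follower ids, not a bare ValuesListQuerySet. A sketch, where the 'HIGH' key stands in for feedly's priority constant:

```python
# Sketch: get_user_follower_ids must return a dict of
# priority group -> list of follower ids, because the manager
# iterates its .items().

def get_user_follower_ids(user_id):
    # in Django this would be something like:
    #   list(Follow.objects.filter(target=user_id)
    #                      .values_list('user_id', flat=True))
    follower_ids = [2, 3, 5]
    return {'HIGH': follower_ids}

for priority_group, follower_ids in get_user_follower_ids(1).items():
    print(priority_group, follower_ids)  # → HIGH [2, 3, 5]
```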

Can't find pinterest_example based on the instruction

Hi team, based on this instruction (from the root of the feedly project, run):

vagrant up
vagrant provision
vagrant ssh
cd pinterest_example
python manage.py runserver 0:8000

I can't seem to find the pinterest_example folder. Why?

How to group activity by objects

Hello,

I'm following this tutorial:
https://feedly.readthedocs.org/en/latest/notification_systems.html

I see this:
class MyAggregator(BaseAggregator):
    '''
    Aggregates based on the same verb and same time period
    '''
    def get_group(self, activity):
        '''
        Returns a group based on the day and verb
        '''
        verb = activity.verb.id
        date = activity.time.date()
        group = '%s-%s' % (verb, date)
        return group

but I don't know how to group the feed by its object instead of by verb and date.
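Grouping by the target object is just a different get_group key. A sketch using a dict as a stand-in for a feedly Activity (real code would read `activity.object_id` inside a BaseAggregator subclass):

```python
# Sketch: key the group on the activity's object instead of verb + day,
# so all activities about the same object aggregate together.

def get_group(activity):
    return 'object-%s' % activity['object_id']

a1 = {'object_id': 42, 'verb': 'comment'}
a2 = {'object_id': 42, 'verb': 'like'}
print(get_group(a1) == get_group(a2))  # → True
```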

Thank you for your help!

Dismiss activity

Hi!

Does Feedly support dismissing an activity? For example, if I use Feedly to build a notification system, the system somehow has to dismiss notifications that the user has already seen. Could you please show me an example of this action?

Thanks.

Documentation

First of all, thanks for this project. It looks awesome. I have been using django-activity-stream in one of my projects. Now I want to move to Feedly, but the documentation is not complete yet. When do you plan to complete the documentation? Thanks

hget vs hmget

Why not use hmget instead of hget in feedly.structures.hash.RedisHashCache.get_many()?

An Example for filtering the Aggregated Activities?

How can I filter the aggregated activities, assuming each aggregated activity has a custom attribute, e.g. active (boolean)?

feed = newsfeed_stream.get_feeds(user.id)['aggregated']
# filtering here ???
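Since feed.filter() targets activity ids rather than custom attributes, the simplest approach is to filter after fetching. A sketch where dicts stand in for AggregatedActivity objects carrying the custom flag:

```python
# Sketch: filter fetched aggregated activities on a custom attribute.
# Dicts stand in for AggregatedActivity objects.

aggregated = [
    {'group': 'a', 'active': True},
    {'group': 'b', 'active': False},
]
active_only = [agg for agg in aggregated if agg['active']]
print([agg['group'] for agg in active_only])  # → ['a']
```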

Could not find a version that satisfies the requirement cqlengine==0.8.71

I'm having some trouble installing feedly.

I'm running pip --version
pip 1.5 from /Library/Python/2.7/site-packages/pip-1.5-py2.7.egg (python 2.7)

pip install feedly --pre

gives me the following error:

Downloading feedly-0.9.32.zip (83kB): 83kB downloaded
Running setup.py (path:/private/var/folders/w2/zst4166d4nzgvpwhyk7qg66r0000gn/T/pip_build_tom/feedly/setup.py) egg_info for package feedly

no previously-included directories found matching 'pinterest_example'

Downloading/unpacking cqlengine==0.8.71 (from feedly)
Could not find a version that satisfies the requirement cqlengine==0.8.71 (from feedly) (from versions: 0.0.1-ALPHA, 0.0.2-ALPHA, 0.0.3-ALPHA, 0.0.4-ALPHA, 0.0.5, 0.0.6, 0.0.7, 0.0.8, 0.0.9, 0.1.1, 0.1.2, 0.1, 0.10.0, 0.2, 0.3.1, 0.3.2, 0.3.3, 0.3, 0.4.1, 0.4.10, 0.4.2, 0.4.3, 0.4.4, 0.4.5, 0.4.6, 0.4.7, 0.4.8, 0.4.9, 0.5.1, 0.5.2, 0.5, 0.6.0, 0.7.0, 0.7.1, 0.8.1, 0.8.2, 0.8.3, 0.8.4, 0.8.5, 0.8, 0.9.1, 0.9.2, 0.9)
Cleaning up...
No distributions matching the version for cqlengine==0.8.71 (from feedly)

Any help on this would be greatly appreciated.

get_slice_from_storage NotImplementedError

When I try to add an activity with feed.add(activity), it raises a NotImplementedError from this function:

def get_slice_from_storage(self, key, start, stop, filter_kwargs=None):
    '''
    :param key: the key at which the feed is stored
    :param start: start
    :param stop: stop
    :returns list: Returns a list with tuples of key,value pairs
    '''
    raise NotImplementedError()

Is there something I configured wrong?
