probot / probot
🤖 A framework for building GitHub Apps to automate and improve your workflow
Home Page: https://probot.github.io
License: ISC License
Probot failed to start using the default private-key filename style from GitHub. Probot successfully started after renaming the key. See the before and after below:
Jamies-MacBook-Pro-2:probot-freeze jbjonesjr$ npm start --private-key probot-freeze.2017-04-07.private-key.pem
> [email protected] start /Users/jbjonesjr/github/probot/probot-freeze
> probot run ./index.js "probot-freeze.2017-04-07.private-key.pem"
Missing GitHub Integration private key.
Use --private-key flag or set PRIVATE_KEY environment variable.
Jamies-MacBook-Pro-2:probot-freeze jbjonesjr$ npm start --private-key private-key.pem
> [email protected] start /Users/jbjonesjr/github/probot/probot-freeze
> probot run ./index.js "private-key.pem"
When a pull request is merged, add a line to CHANGELOG.md or HISTORY.md with the PR title and number; if the contributor is not a maintainer, give them props; and optionally categorize the updates based on tags.
@jekyllbot does this (with a slightly different behavior that I don't like), but the end result could look something like this:
## Head
### Minor
* Make the button red (#123, props @bkeepers)
### Major
* Button is now a switch (#456)
### Development
* Add Rubocop (#789)
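A sketch of how the entry line could be built. The handler wiring and the `maintainers` list are assumptions, not an implemented API:

```javascript
// Build the CHANGELOG line for a merged PR. `maintainers` is a hypothetical
// list of logins; contributors who aren't on it get props.
function changelogEntry(pr, maintainers) {
  const credit = maintainers.includes(pr.user.login) ? '' : `, props @${pr.user.login}`;
  return `* ${pr.title} (#${pr.number}${credit})`;
}

// A handler (hypothetical wiring) would prepend this under "## Head":
// robot.on('pull_request.closed', (event, context) => {
//   if (!event.payload.pull_request.merged) return;
//   const entry = changelogEntry(event.payload.pull_request, maintainers);
//   // ...fetch CHANGELOG.md, insert entry, and commit the change...
// });
```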
There is absolutely no error handling right now, and there are at least a few kinds of errors that are already common. Promises are heavily used already, so it should be pretty straightforward to hook in some kind of proper error handling.
Since people will be able to write their own behaviors as node modules (#4), this bot should be dead simple for anyone to deploy and add their behaviors to.
A Deploy to Heroku button would be ideal. There will also likely be some configuration depending on how the GitHub integration works (#5).
When there is a push with changes to the .probot file, it should validate the syntax of the change and call the status API with the result.
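As a sketch, the push handler only needs to map a parse result onto a commit status. `parse` here is a stand-in for whatever parser the .probot format ends up using, and the status context name is made up:

```javascript
// Turn a syntax check into a commit-status payload.
function statusForConfig(contents, parse) {
  try {
    parse(contents);
    return {state: 'success', context: 'probot/config', description: 'Valid syntax'};
  } catch (err) {
    return {state: 'failure', context: 'probot/config', description: err.message};
  }
}

// The push handler would then call something like:
// context.github.repos.createStatus(Object.assign(
//   {owner, repo, sha: payload.head_commit.id},
//   statusForConfig(contents, parse)
// ));
```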
PRobot currently only dispatches webhook events that have a repository in the payload. There are several events (like organization, membership, and team) that aren't associated with a repository.
@Migaro and I talked briefly about this, and one option for events without a repository is to look for a repo named something like probot-scripts in the organization, and evaluate the configuration from that repository.
Somewhat related to #15, and a behavior @jekyllbot does that really helps, is to automatically create a new release, based on the contents of the CHANGELOG, whenever a new tag (maybe matching a vN.N.N format) is pushed. You can see an example of it in Jekyll's Sitemap.
@sridharavinash suggested that it'd be nice to have probot install plugin-name, which will:
* run npm install --save probot-plugin-name
* add plugin-name to plugins in package.json
Configuring the GitHub Integration is the most tedious part of deploying a plugin right now. Ideally, plugins could declare what permissions are needed, and then there would be a command (e.g. probot setup) that configures a new Integration.
cc @tarebyte based on our previous discussion of this
Here is a list of actions that need to be implemented. If anyone wants to jump in, these are pretty easy to implement. Check out the docs for adding an action, or one of the merged PRs listed below, as an example.
Including configuration from another repository will be incredibly useful for a couple use cases:
The include feature should work for:
* .probot.yml

All of the documentation for available actions is manually crafted right now. It should be generated from source.
Something like how remark-lint generates docs/rules.md.
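A rough sketch of what generation could look like, assuming each action module exposed a `description` field (that annotation doesn't exist yet):

```javascript
// Render a docs page from action metadata, remark-lint style:
// a table of contents followed by one section per action.
function generateActionDocs(actions) {
  const names = Object.keys(actions).sort();
  const toc = names.map(name => `- [${name}](#${name})`).join('\n');
  const sections = names
    .map(name => `## ${name}\n\n${actions[name].description}`)
    .join('\n\n');
  return `# Actions\n\n${toc}\n\n${sections}\n`;
}
```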
Acknowledging contributions in a timely manner and setting expectations for response is one of the most important things you can do to keep a contributor engaged. Based on recent activity, a bot should be able to let contributors know when they can expect to receive a response. This could maybe be added to autoresponder, or be a separate behavior.
Configuration:
Thanks for contributing! Based on recent activity, expect a response in {{ time_to_response }}.
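One way to fill in `{{ time_to_response }}` would be to take the median of recent first-response times. This is a sketch, not part of any existing API:

```javascript
// Estimate time-to-response from recent first-response durations (in hours).
function expectedResponse(durationsInHours) {
  const sorted = durationsInHours.slice().sort((a, b) => a - b);
  const median = sorted[Math.floor(sorted.length / 2)];
  return median < 24
    ? `${Math.round(median)} hours`
    : `${Math.round(median / 24)} days`;
}
```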
Examples:
include and contents currently default to the repository that the event was triggered on, not the repository that the configuration was loaded from.
Given this config in a repo called this/repo:
include("other/repo:config.js");
And config.js in other/repo contains:
include('issue-triage.js');
on("issues.opened").comment(contents(".github/ISSUE_REPLY_TEMPLATE.md"));
This should look for issue-triage.js and .github/ISSUE_REPLY_TEMPLATE.md in other/repo, but both currently look in this/repo. include and contents should be relative to the repository they are being loaded from, not the repo that the event was triggered on.
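A sketch of the resolution rule: parse an explicit `owner/repo:path` target, and fall back to the repository the configuration was loaded from (function and field names are illustrative):

```javascript
// Resolve an include/contents target relative to the repo the configuration
// came from, not the repo the event fired on.
function resolveTarget(target, source) {
  const match = target.match(/^([^/]+)\/([^:]+):(.+)$/);
  if (match) {
    return {owner: match[1], repo: match[2], path: match[3]};
  }
  // Bare paths resolve against the source repository.
  return {owner: source.owner, repo: source.repo, path: target};
}
```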
The configuration syntax is pretty ugly. It'll work for some early prototyping, but it would be great to come up with something better.
Some kind of real grammar could be interesting. For example here's what we have now:
behaviors:
# Auto-respond to new issues and pull requests
- on:
- issues.opened
- pull_request.opened
then:
comment: "Thanks for your contribution! Expect a reply within 48 hours."
label: triage
# Auto-close new pull requests
- on: pull_request.opened
then:
comment: "Sorry @{{ user.login }}, pull requests are not accepted on this repository."
close: true
# Perform actions based on content of comments
- on: issue_comment.opened
when:
payload:
issue.body:
matches: /^@probot assign @(\w+)$/
then:
assign: {{ matches[0] }}
- on: issue_comment.opened
when:
payload:
issue.body:
matches: /^@probot label @(\w+)$/
then:
label: {{ matches[0] }}
would become:
on issues.opened
then comment("Thanks for your contribution! Expect a reply within 48 hours.")
and label("triage");
on pull_request.opened
then comment("Sorry @{{ user.login }}, pull requests are not accepted on this repository.")
and close;
on issue_comment.created
when comment.body matches(/^@probot assign @(\w+)$/)
then assign(matches[0])
on issue_comment.created
when comment.body matches(/^@probot label @(\w+)$/)
then label(matches[0])
I have no idea how it would work in practice though.
mention-bot, which @mentions potential reviewers based on recent git history, works as a node module and should be relatively straightforward to turn into a behavior.
Rules should be filterable by attributes of the payload.
Here are a few examples that were added in #13, which will likely change based on implementation:
# Misc examples
when:
payload:
sender.login: bkeepers
issue.title:
contains: "[WIP]"
issue.body:
matches: /^$/
issue.labels:
contains: "bug"
# Close issues with no body
- on:
- issues.opened
when:
payload:
body:
matches: /^$/
then:
comment: "Hey @{{ user.login }}, you didn't include a description of the problem, so we're closing this issue."
# Perform actions based on content of comments
- on: issue_comment.opened
when:
payload:
issue.body:
matches: /^@probot assign @(\w+)$/
then:
assign: {{ matches[0] }}
- on: issue_comment.opened
when:
payload:
issue.body:
matches: /^@probot label @(\w+)$/
then:
label: {{ matches[0] }}
# Close issues that don't use the issue template
- on: issues.opened
when:
- payload:
issue.body:
not:
contains: /### Prerequisites.*### Description.*### Steps to Reproduce.*### Versions/
then:
comment:
from_file: .github/MISSING_ISSUE_TEMPLATE_AUTOREPLY.md
label: insufficient-info
close: true
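A minimal sketch of how a `when.payload` block could be evaluated. The condition names (`contains`, `matches`, `not`) come from the examples above; the function itself is an assumption:

```javascript
// Evaluate a `when.payload` block against a webhook payload. Dotted keys
// walk nested fields; conditions support equality, contains, matches, not.
function payloadMatches(conditions, payload) {
  return Object.keys(conditions).every(key => {
    const value = key.split('.').reduce((obj, part) => obj && obj[part], payload);
    const cond = conditions[key];
    if (typeof cond !== 'object') return value === cond;
    if (cond.not) return !payloadMatches({[key]: cond.not}, payload);
    if ('contains' in cond) return String(value).includes(cond.contains);
    if ('matches' in cond) return new RegExp(cond.matches).test(String(value));
    return false;
  });
}
```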
I still think PRobot is a good idea, but I'm going to be honest and say that it really sucks right now. You can't do much with the current feature set, and implementing new features is hard and confusing.
After talking with @groundwater, I would like to take PRobot in a very different direction for a while. Feedback on this would be appreciated.
Thus far the aim has been to expose useful GitHub APIs through a jQuery-style API that is safe-ish to run on a single hosted instance. Eventually, I think that is the right idea, but right now, it's a premature optimization. It makes PRobot much less useful because people can only use it for things that have been implemented in this custom API.
So I'd like to remove the existing API right now, focus on building a good stand-alone bot framework that can make use of any existing node libraries, and then later figure out how to provide a simple API and run it on a hosted instance.
Here's basically the new proposed roadmap:
This will make it really easy to create new bots by abstracting the details of handling webhooks and building a GitHub Integration.
For example, here is an autoresponder module that comments on opened issues with the template in .github/AUTOREPLY_TEMPLATE.md:
// Export a function that takes `robot` as an argument
module.exports = function(robot) {
// `robot.on` will listen for any GitHub webhook events
robot.on('issues.opened', async function(event, context) {
// `context` extracts information from the event, which can be passed to
// GitHub API calls.
const options = context.repo({path: '.github/AUTOREPLY_TEMPLATE.md'});
// This returns:
// {owner: 'yourname', repo: 'yourrepo', path: '.github/AUTOREPLY_TEMPLATE.md'}
// `context.github` is an instance of the NodeJS wrapper for the GitHub API
// https://mikedeboer.github.io/node-github/
const data = await context.github.repos.getContent(options);
const template = new Buffer(data.content, 'base64').toString();
return context.github.issues.createComment(context.issue({body: template}));
});
}
This bot would be run locally with the probot command, which will start up a server and listen for GitHub webhooks:
$ probot autoresponder.js
Listening on http://localhost:5000
Here's a more sophisticated bot based on github-configurer, which will update GitHub settings for a repository whenever .github/config.yml is modified.
module.exports = robot => robot.on('push', push);
const configurer = require('github-configurer');
const FILE_NAME = '.github/config.yml';
async function push(event, context) {
const payload = event.payload;
const defaultBranch = payload.ref === 'refs/heads/' + payload.repository.default_branch;
const configModified = payload.commits.find(commit => {
return commit.added.includes(FILE_NAME) || commit.modified.includes(FILE_NAME);
});
if (defaultBranch && configModified) {
const options = {
owner: payload.repository.owner.name,
repo: payload.repository.name,
path: FILE_NAME
};
const data = await context.github.repos.getContent(options);
const content = new Buffer(data.content, 'base64').toString();
return configurer(context.github, options, content).update();
}
}
These examples can be published as behaviors in NPM modules (e.g. probot-autoresponder and probot-configurer) and deployed as stand-alone bots, or combined into one instance.
For example, if you wanted to deploy a bot for your project that included both of these behaviors, you could just create a new app that has them both listed as dependencies in package.json:
{
  "name": "my-probot",
  "private": true,
  "dependencies": {
    "probot-autoresponder": "~1.0",
    "probot-configurer": "~1.0"
  },
  "scripts": {
    "start": "probot"
  }
}
Running npm start on this app would start up a bot that includes both of these behaviors.
This approach creates a ton of flexibility. The core of probot will handle the GitHub Integration and provide some basic services. Behaviors will be relatively easy to implement and reuse, and more importantly they'll be free to experiment and explore new patterns.
Later we can explore the idea of providing a single hosted integration and allowing people to run their probot behaviors directly from their repository via a .probot.js file.
Finally, after there is some adoption and people are writing a lot of behaviors, I'd like to revisit the idea of exposing a friendlier and more expressive API that allows combining common actions in a chainable manner like we currently have:
on('issues.labeled')
.filter(event => event.payload.label.name == 'plugins')
.comment('Hey @jekyll/plugins, the `plugins` label was added');
Thoughts?
It would be nice if Probot updated my dependencies and lock file for me, so I didn't have to.
Our current implementation is https://github.com/wp-cli/wp-cli/pull/3994/files
It produces a pull request like wp-cli/wp-cli#3993
Other languages have services for this (e.g. Greenkeeper), but PHP does not :( Can't be too hard, can it?
Having access to the GitHub API via the Context object is necessary to implement more advanced filters, specifically things like "is the last person who commented a member of the organization?" or "is the author of the issue or PR a member of the organization?"
The best solution is probably to pass the Context object as the only parameter. This way examples can easily be changed from:
on('issues.opened')
.filter((event) => {
return !event.issue.body.match(/### Steps to Reproduce/)
|| event.issue.body.includes('- [ ]')
})
.comment(contents('.github/MISSING_ISSUE_TEMPLATE_AUTOREPLY.md'))
.label('insufficient-info')
.close();
to:
on('issues.opened')
.filter(({event}) => {
return !event.issue.body.match(/### Steps to Reproduce/)
|| event.issue.body.includes('- [ ]')
})
.comment(contents('.github/MISSING_ISSUE_TEMPLATE_AUTOREPLY.md'))
.label('insufficient-info')
.close();
Before the first release, there should be a clearly defined plugin API and easy way to create new behaviors as node packages. The plugin API should have convenience methods for performing common actions on the item that triggered the event (e.g. comment on a PR, update statuses, apply labels, etc.).
The naive example in the README right now is something like this:
robot.on('pull', function(event) {
robot.comment("Thanks for the pull request! We'll review it within 72 hours!");
});
The first behavior is looking something like this:
module.exports = {
webhook: 'issues', // name of webhook event
action: function(event, github) {
// event - the webhook event
// github - API client: https://mikedeboer.github.io/node-github/
}
};
I'm planning to implement a few other behaviors before figuring out what the API should look like.
It currently autoresponds on any issue action. It should probably be limited to the opened action. Eventually it should probably be configurable, and allow different templates for each action.
Figuring out what is going on when things break is a bit of a disaster right now. It would be great to have a verbose logging mode to be able to see what is going on in development.
/cc @seemakamath
The Atom team has over 190 public repositories. Manually keeping all of the standard labels, templates, Codes of Conduct, Contributing guides, etc in sync across all of them is a Sisyphean task.
It would be great if PRobot could handle this for us.
@jonico made a great suggestion to default to ignoring events triggered by the bot to avoid the possibility of infinite loops.
It would be the equivalent of adding a default filter on all workflows:
on('issues.opened')
// ignore events from the demo installation
.filter(event => event.payload.sender.login != 'probot-demo[bot]' )
// or just ignore all bots
.filter(event => event.payload.sender.type != 'Bot' )
.close();
I haven't thought much about configuration, but here are a few options:
One option is to have a configure function for setting defaults, and allow the on listener to take those options for overriding on specific workflows:
configure({ignoreOwnEvents: false});
// Uses default configuration
on('issue_comment.create').close();
// Override default configuration
on('issue_comment.create', {ignoreOwnEvents: true}).close();
Another approach could be like "middleware":
use(filter.ignoreOwnEvents);
// Default, which ignores the bot's own events
on('issue_comment.create').close();
// Disable middleware for a specific workflow
on('issue_comment.create', {"filter.ignoreOwnEvents": false}).close();
We want to make it super-simple to test Probot plugins. To that end, there are some things that we would like to provide:
Configuration:
Examples:
Currently @ is used to access payload values:
if @pull_request.title contains "WIP"
Since @ is used for @mentions on GitHub, I imagine there will be use cases later on where the grammar should handle user and team mentions.
if $pull_request.title contains "WIP"
if {{pull_request.title}} contains "WIP"
- This is ugly, but mustache syntax can already be used in string templates, so consistency might be nice.

When a label is applied, this behavior should look up watchers, which includes users or teams and the labels they are watching, and post a comment that @mentions them:
Configuration:
Hey {{ watchers }}, this was labeled {{ label }}.
Examples:
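A sketch of the behavior, assuming a watchers config that maps label names to users or teams (the config shape is made up):

```javascript
// Build the @mention comment for a `labeled` event from a watchers map.
function watcherComment(watchers, label, template) {
  const mentions = (watchers[label] || []).map(w => `@${w}`).join(' ');
  return template
    .replace('{{ watchers }}', mentions)
    .replace('{{ label }}', label);
}
```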
I hear from @janl and @gr2m that https://zeit.co/now is where it's at for deploying node applications.
In addition to centralizing deployment docs in #93, it should include some instructions for deploying to zeit.
Right now this depends on having a GITHUB_TOKEN environment variable, and every action is performed as that user.
This bot could be a really good candidate for Integrations. I started experimenting with implementing it as an integration, but integrations require storing state for each installation, and there's currently no persistence.
I punted on this in #46 because I couldn't decide on syntax. Here are a few options:
not(@issue.body contains "- [ ]")
a not prefix for each existing operator: @issue.body does not contain "foo", @sender.login is not "bkeepers", @issue.body does not match "regex"
Steps:
* Run script/server with a valid GITHUB_TOKEN passed.

Behavior:
localhost crashes with the following output:
> [email protected] start C:\Users\Serge\Projects\PRobot
> node server.js
Listening on http://localhost:3000
events.js:160
throw er; // Unhandled 'error' event
^
Error: X-Hub-Signature does not match blob signature
at hasError (C:\Users\Serge\Projects\PRobot\node_modules\github-webhook-handler\github-webhook-handler.js:46:17)
at BufferList._callback (C:\Users\Serge\Projects\PRobot\node_modules\github-webhook-handler\github-webhook-handler.js:77:16)
at BufferList.end (C:\Users\Serge\Projects\PRobot\node_modules\bl\bl.js:98:10)
at IncomingMessage.onend (_stream_readable.js:511:10)
at IncomingMessage.g (events.js:291:16)
at emitNone (events.js:86:13)
at IncomingMessage.emit (events.js:185:7)
at endReadableNT (_stream_readable.js:974:12)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickCallback (internal/process/next_tick.js:98:9)
As I am new to the API in general, I am likely missing an obvious step in setting up my test environment.
A bunch of the examples either have syntax errors or use features that are not implemented. Ideally those would be separated into examples that work and examples that are aspirational (bonus points for tests that check that each example is at least valid syntax).
I'm looking at a use case where I want to make some kind of web panel for managing plugins; in my mind this would be exposed on localhost:3000/panel. To do this I need access to the internal HTTP server so I can add a new listener on a given path.
Not sure if this is in scope for PRobot, but my overall goal is to provide another module, probot-swarm, which allows you to run multiple plugins in one instance, where each plugin can expose settings controlled by the web panel. That way things like response messages can be configured easily without rebuilding the plugin source.
Thanks to @paulcbetts' suggestion and @mcolyer's work, PRobot is now configured via JavaScript. Before investing too much in this direction, I'd like to gain confidence that this approach can safely run untrusted scripts.
For background on this project: this is a bot that anyone can install on their repository. It will read the .probot.js file in the repository to configure workflows that trigger in response to webhooks. Those workflows include actions on the repository, such as commenting, managing labels and assignees, or eventually anything you can do through the GitHub API.
The current implementation uses Node's vm API to create a new context. At the moment, the sandbox only exposes an on method, which creates a new Workflow. The functions on a workflow mostly just mutate data, which is used afterward to determine behavior.
A few questions to start:
cc @mcolyer @paulcbetts @zeke @nathansobo
This will make it easy to create a new plugin by cloning https://github.com/probot/plugin-template and filling in the config variables.
I really like how clean/simple https://github.com/vuejs/vue-cli is, which might be a good place to start.
Hey, I just noticed dotenv in this PR: #36
I've had good luck with dotenv-safe, which works just like dotenv but also checks that everything in your .env.example is present in your .env.
When deploying multiple plugins in one instance, it might be nice to just auto-discover plugins from node modules.
One idea for how this would work would be to find any plugins with the keyword probot-plugin in their package.json.
This could make #97 easier, or negate the need for it entirely.
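A sketch of the discovery step; `readPkg` is injected here for illustration, but in practice it would be something like `name => require(name + '/package.json')`:

```javascript
// Discover probot plugins among an app's dependencies, either by a
// `probot-` name prefix or a `probot-plugin` keyword in their package.json.
function discoverPlugins(pkg, readPkg) {
  return Object.keys(pkg.dependencies || {}).filter(name => {
    if (name.startsWith('probot-')) return true;
    const meta = readPkg(name);
    return Boolean(meta && (meta.keywords || []).includes('probot-plugin'));
  });
}
```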
Right now each of the plugins have their own deployment instructions. It would be ideal if they all just linked to one set of instructions for deploying a plugin in: https://github.com/probot/probot/blob/master/docs/deployment.md
The only difference between each plugin is the permissions it needs configured for GitHub Integrations.
I am facing a problem with the getContent method, which is supposed to fetch a .probot.yml file from my test repo:
github.repos.getContent() is fed the right owner and repo parameters from repository.full_name.split(), but once I create an issue, the bot crashes with the following log:
C:\Users\Serge\Projects\PRobot\lib\configuration.js:11
}).then(data => {
^
TypeError: Cannot read property 'then' of undefined
at Function.load (C:\Users\Serge\Projects\PRobot\lib\configuration.js:11:7)
at EventEmitter.webhook.on.event (C:\Users\Serge\Projects\PRobot\server.js:33:26)
at emitOne (events.js:96:13)
at EventEmitter.emit (events.js:188:7)
at BufferList._callback (C:\Users\Serge\Projects\PRobot\node_modules\github-webhook-handler\github-webhook-handler.js:98:15)
at BufferList.end (C:\Users\Serge\Projects\PRobot\node_modules\bl\bl.js:98:10)
at IncomingMessage.onend (_stream_readable.js:511:10)
at IncomingMessage.g (events.js:291:16)
at emitNone (events.js:86:13)
at IncomingMessage.emit (events.js:185:7)
I am not sure whether the problem is with the token I have generated and am feeding to the bot as an environment variable (the GitHub settings page says it was "last used: never"), or if the problem has something to do with the configuration.js file.
Keeping the content for actions in the repository will make it easier for maintainers to evolve them, will make the bot config easier to read, and will make it easier for behaviors to be reused.
At the moment, comment is the only action that can use this, but there may eventually be others.
Here's an example of how this could work:
- then:
comment:
from_file: ".github/REPLY_TEMPLATE.md"
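A sketch of how the comment action could resolve its body, fetching `from_file` through the contents API (the function shape is an assumption, not the implemented action):

```javascript
// Resolve a comment action's body: plain strings pass through, and
// `from_file` values are fetched from the repository.
async function resolveBody(comment, github, repo) {
  if (typeof comment === 'string') return comment;
  const data = await github.repos.getContent(
    Object.assign({}, repo, {path: comment.from_file})
  );
  return Buffer.from(data.content, 'base64').toString();
}
```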
Some of the most important automation is triggered not by activity, but by inactivity. This bot should have a way to do work at regular intervals or at some predetermined time in the future.
These would be behaviors that run at regular intervals and do routine maintenance. The only difference between these and the existing behaviors is that they're triggered by a timer and not an action.
Examples:
These are one-off actions that just need to happen at some predetermined time in the future.
Examples:
@probot remind me to check this out in 3 weeks
Know of any prior art in bots that have good patterns for these?
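As a sketch of what the core could offer behaviors, a small scheduler service would cover both cases (this API is entirely hypothetical):

```javascript
// A hypothetical scheduler service for behaviors: recurring sweeps and
// one-off reminders, with stop() for cleanup.
function createScheduler() {
  const timers = [];
  return {
    // e.g. every(dayInMs, sweepStaleIssues)
    every(intervalMs, task) {
      timers.push(setInterval(task, intervalMs));
    },
    // e.g. at(Date.now() + threeWeeksInMs, postReminder)
    at(timestamp, task) {
      timers.push(setTimeout(task, Math.max(0, timestamp - Date.now())));
    },
    stop() {
      timers.forEach(clearInterval);
    }
  };
}
```

A real version would need persistence so reminders survive restarts, which ties into the storage questions elsewhere in this tracker.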
Another behavior of @jekyllbot (that's relatively new): when an issue comes in and @mentions an affinity team, one of the team captains is randomly assigned the issue. They're obviously free to unassign or change assignment, but it creates a sense of distributed ownership of issues, based on where they are in the codebase.
I'm not sure if this is a bug or if it is expected behavior.
If you set PRIVATE_KEY in your environment, it had better be the key itself instead of the path to the key file, or SSL will bonk:
(node:44369) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 3): Error: error:0906D06C:PEM routines:PEM_read_bio:no start line
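One way to make both styles work is to detect PEM contents by their armor line and otherwise treat the value as a path. This is a sketch (`readFile` is injected for illustration; in practice it would be fs.readFileSync):

```javascript
// Accept either the key contents or a path to the key file in PRIVATE_KEY.
function resolvePrivateKey(value, readFile) {
  // PEM contents start with a "-----BEGIN ..." armor line; anything else
  // is assumed to be a filename.
  if (value.trim().startsWith('-----BEGIN')) return value;
  return readFile(value, 'utf8');
}
```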
Heroku supports it now and would allow us to use async / await
cc @lee-dohm
From pegjs-util:
This is a small utility class for the excellent PEG.js parser generator which wraps around PEG.js's central parse function and provides three distinct convenience features: Parser Tree Token Unrolling, Abstract Syntax Tree Node Generation and Cooked Error Reporting.
The grammar currently does token unrolling and AST generation manually, and I've already run into issues with error reporting for syntax errors.