
requestcontrol's Introduction


Request Control - Firefox extension

An extension to control HTTP requests. Provides a front-end for Firefox's webRequest.onBeforeRequest API for HTTP request management.

Requests can be controlled with the following rules:

  • Filter Rule

    Skip URL redirection and remove URL query parameters.

  • Redirect Rule

    Rewrite requests with support for Pattern Capturing to redirect based on the original request.

  • Secure Rule

    Upgrade non-secure (HTTP) requests to secure (HTTPS).

  • Block Rule

    Block requests before they are made.

  • Whitelist Rule

    Whitelist requests from other rules.
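
Rules are stored and exported as JSON; as an illustration, a filter rule that trims tracking parameters from any URL looks roughly like this (field names taken from the rule export shown in an issue report further below, so treat any field not shown there as an assumption):

```json
[
  {
    "pattern": { "allUrls": true },
    "action": "filter",
    "active": true,
    "paramsFilter": { "values": ["utm_source", "utm_medium"] }
  }
]
```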

Support

  • Report bugs
  • Suggest new features
  • Help to translate
  • Contribute

Development

Clone the repository and set up the development environment with npm:

git clone https://github.com/tumpio/requestcontrol.git
cd requestcontrol
npm install

Run in Firefox Nightly

npm start -- --firefox=nightly

Run unit tests and lint

npm test ; npm run lint

Build extension

npm run build

External Libraries

Request Control uses the following external libraries:

  • lit is licensed under the MIT license.
  • tags-input and its fork by @pirxpilot are licensed under the MIT license.
  • ionicons is licensed under the MIT license.
  • tldts is licensed under the MIT license.

License

This Source Code Form is subject to the terms of the Mozilla Public
License, v. 2.0. If a copy of the MPL was not distributed with this
file, You can obtain one at http://mozilla.org/MPL/2.0/.

requestcontrol's People

Contributors

arenal5, areyouloco, crystal-rainslide, gitoffthelawn, rusty-snake, salim-b, serverwentdown, strel, toothbrush, tumpio, yfdyh000, zocker1999net


requestcontrol's Issues

Double question mark "??" rewritten to single question mark, but it should not be

Expected behavior

Do nothing

Actual behavior

Strips one "?" where there are two "??"

Steps to reproduce the problem

Create rule:

[
  {
    "pattern": {
      "allUrls": true
    },
    "action": "filter",
    "active": true,
    "skipRedirectionFilter": true,
    "paramsFilter": {
      "values": [
        "xxx"
      ]
    }
  }
]

This rule should strip only parameter "xxx".

Visit https://www.aliexpress.com/item/Xiaomi-Mini-Router-2-4GHz-5GHz-Dual-Band-Max-1167Mbps-Support-Wifi-802-11ac-Xiaomi-Mi/32773978417.html

Every request that contains a double question mark ("??") is rewritten to a single question mark ("?"), which is wrong.
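
A plausible cause (an assumption, not confirmed from the extension's code): round-tripping the query string through URLSearchParams is lossy for non-standard URLs, because the serializer normalizes what it parsed, e.g. percent-encoding a stray "?":

```javascript
// Sketch: reserializing a non-standard query through URLSearchParams
// changes it - the extra "?" is percent-encoded on output.
const raw = "a??b"; // a query fragment containing a second "?"
const round = new URLSearchParams(raw).toString();
console.log(round); // "a%3F%3Fb=" - no longer the original "a??b"
```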

Trim params wildcard inconsistency

This had me confused for quite a while with parameters being trimmed that I didn't intend until I added debug logging manually to see what was going on. Basically when a filter rule's Trim URL Parameters value contains more than 1 asterisk, the first asterisk is replaced by the regex pattern .* but all the others are left alone.

Personally I'd prefer to just work with regex, but I understand the asterisk replacement is there so users don't have to know regex and to be somewhat consistent with the pattern matching for hosts/paths. It's a bit awkward that it's actually a regex except for the asterisk because it means you can't use an asterisk in any regex context other than .* (unless you abuse this bug and know that only the first asterisk works like that).

The obvious fix is to replace .replace("*", ".*") with .replace(/\*/g, ".*") so all asterisks are replaced, not just the first. A more complete solution might allow the user to deliberately indicate a particular parameter is a regex (eg /to_be_trim*ed/) whilst properly treating all others as literal strings, but that seems like a lot of work to get right and performant!
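
The difference is easy to demonstrate in plain JavaScript: String.prototype.replace with a string pattern substitutes only the first occurrence, while a /g regex replaces all of them:

```javascript
const value = "utm_*_and_ref_*";

// The behaviour described above: only the first "*" becomes ".*"
console.log(value.replace("*", ".*"));   // "utm_.*_and_ref_*"

// The proposed fix: a global regex converts every "*"
console.log(value.replace(/\*/g, ".*")); // "utm_.*_and_ref_.*"
```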

One guide to explain how to create new rules

Expected behavior

https://support.mozilla.org/en-US/kb/enable-drm
https://8pecxstudios.com/Forums/index.php
https://support.mozilla.org/en-US/kb/change-your-default-search-settings-firefox

Actual behavior

https://support.mozilla.org/en-US/kb/enable-drm?as=u&utm_source=inproduct
https://8pecxstudios.com/Forums/index.php?sid=3fec53ba003fec53ba00
https://support.mozilla.org/en-US/kb/change-your-default-search-settings-firefox?as=u&utm_source=inproduct

Hello,

It would be much easier and better if you could provide a real example. How should I proceed to remove the utm and sid parameters from those links?

Kind regards.
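
As a sketch of what such a filter rule does under the hood (not the extension's actual code; the helper below is hypothetical), trimming named or wildcard parameters can be done with the standard URL API:

```javascript
// Hypothetical helper: remove query parameters whose names match any of
// the given patterns ("*" acts as a wildcard, everything else is literal).
function trimParams(urlString, patterns) {
  const url = new URL(urlString);
  const regexes = patterns.map((p) => new RegExp(
    "^" + p.replace(/[.+?^${}()|[\]\\]/g, "\\$&").replace(/\*/g, ".*") + "$"
  ));
  for (const key of [...url.searchParams.keys()]) {
    if (regexes.some((re) => re.test(key))) {
      url.searchParams.delete(key);
    }
  }
  return url.toString();
}

console.log(trimParams(
  "https://support.mozilla.org/en-US/kb/enable-drm?as=u&utm_source=inproduct",
  ["utm_*"]
)); // https://support.mozilla.org/en-US/kb/enable-drm?as=u
```

Note that rebuilding the query this way also normalizes its encoding, which is harmless for standard URLs but relevant to some of the other issues reported below.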

Case Sensitive problem

Hi @tumpio
At least the "path" definition is case sensitive; I don't know yet about the other settings.
This is an unpleasant situation. Can you make it case insensitive, please?

Thank you, cheers

Unable to trim URL parameters on youtube?

How my trim filter is set up:

Pattern:
  scheme: http/https
  host: *.youtube.com *youtu.be
  path: *
Types: Document, Sub document
Action: Filter
Filter URL Redirection: Off
Trim URL Parameters: attribution_link feature app

Expected behavior

https://www.youtube.com/watch?v=yWtFGtIlzyQ&feature=em-uploademail should turn into https://www.youtube.com/watch?v=yWtFGtIlzyQ

Actual behavior

URL stays https://www.youtube.com/watch?v=yWtFGtIlzyQ&feature=em-uploademail

Steps to reproduce the problem

I should note that my other filters do work. This is the only one that doesn't seem to want to work.
Using Firefox 58b5

edit:

Hmm... Seems that none of my URL trimming rules work. I have utm_* and ref_* trimming for all URLs, but none of them get trimmed.

Might there be problems with some web apps?

So I'm trying to use a filter rule for Facebook with these settings

host: *.facebook.com
path: *
action: filter
filterURL: On
Trim URL Parameters: Trim all

The problem is that when I navigate to a page with a ref on Facebook, it seems not to be filtered (the navigation doesn't seem to trigger a page update event for the extension).

Ex. Entering someones profile via hovering over the their chat gives this link:
https://www.facebook.com/{UserProfile}?fref=hovercard&hc_location=chat

Target:
https://www.facebook.com/{UserProfile}

However, refreshing the page does do the filter so I know the rule is working.

Browser: FF 55
RC Version: 1.6.1

Thanks for this cool extension 😄

WebExtensions are inadequate to handle all http requests

provide visual indication of cleaned redirect

There seems to be no visual indication that the extension is working. For example, when clicking a URL from a Google search result, the redirect URL appears in the address bar before the destination URL loads, as if RC wasn't working at all.

I had been using Clean Links and I very much liked that it provided a visual indication when it 'cleaned' a link, by briefly changing the background color of the address bar and, optionally, by restyling the source link with CSS.

FR: Visualize how the rule will work

Feature request:
Example URL and result visualization - this is the most important feature your add-on lacks and badly needs.
When users write a new rule, they cannot know whether the rule is correct, whether it will match the example URL, or whether the result of the redirection will be the desired one.
How can they test all that?

Just take a look at how awesome it was done in Redirector:

It makes rule creation pretty much self-evident: the regex pattern gets tested on the fly as you type it, so you can see whether it matches the example URL and what the output URL is going to look like.

Protecting Request Control settings by password

Expected behavior

If some sites are blocked by rules, protecting the settings could prevent a curious user from re-configuring the rules to allow unwanted outgoing requests.

Actual behavior

No password is required.

For redirected top-level image pages, clicking Save Image still downloads the non-redirected image

Expected behavior

Please see the STR section.

Actual behavior

Steps to reproduce the problem

  • Open the following link: https://pbs.twimg.com/media/C_K3H21W0AAqbIp.jpg. Notice (by hovering over the tab title) that it is a 1200x877px image. Now right click the image > Save Image As. Notice that the downloaded image is 1200x877 too, as expected. Leave this tab open.
  • Now add the following rule to RequestControl:
    For easy copy-pasting: *.twimg.com media/* {href/(large|small|medium)/orig|/\.(png|jpg|jpeg|gif)$/.$1:orig|/name=[0-9]+x[0-9]+/name=orig}
  • Reload the tab opened in step 1. It should get redirected to a URL with :orig in the end, which is now a 1606x1174 image. Now right click > Save Image As. Notice that the suggested file name to save is the same as in step 1. Choose a new location or a new name to avoid overwriting. Notice that this downloaded image is still 1200x877, the same size as in step 1.
  • Now, while on the same tab as step 3, press ctrl+L to select the address bar and press enter. The :orig parameter remains and RequestControl doesn't have to redirect anything. Right click > Save Image As. The suggested file name for saving the image is different from steps 1 and 3, and the image saved now is 1606x1174, as expected.

Reproducible with RequestControl v1.8.1, on all channels of Firefox, and with other top-level images as well (not just from twimg.com).

Request Control can't be installed with CyberFox

Hello,

I just want to report that I ran into trouble installing the Request Control add-on (v1.2.2). I then discovered a few topics on the CyberFox Forum (like this one). To sum up, apparently the browser considers the file (the .xpi archive) corrupted. One suggested solution consists of modifying the manifest.json file, but after three attempts it doesn't seem to work, and no, sorry, I don't have any idea how to fix the problem.

And thank you for Request Control, because it seems to work fine with Firefox. Indeed, I was looking for a solution to avoid the redirect links (via the address "outgoing.prod.mozaws.net") on the Mozilla Add-ons website.

I hope you will find a way to fix the problem with CyberFox (x64_v50.1.0).
Kind regards.

Whitelist mode - filter requests by default

Hi!

Thank you for creating this very interesting add-on. I wonder if it's possible to add a "whitelist mode", so that instead of filtering requests for only selected sites, it would filter requests for all sites except a specified list?

copy cleaned link

It would be nice to be able to copy the cleaned link (after being processed by RC) to the clipboard.

Highlight Cleaned Links

Thank you for making this add-on. Currently I'm using Clean Links, but by the look of it, its support for e10s (Firefox) is far away.

I only briefly tested this add-on, and it seems to be working so far with e10s enabled.

The only thing I currently find this add-on/extension lacking is "visibility". With other add-ons, links that get "cleaned" are usually highlighted in a color to indicate it visually.

With JustRedirect, I have no idea whether it worked or not. It would be good if a "highlighting" feature were added, or some other way to indicate that the link you just clicked was "cleaned".

Thank you.

Filter rule Trim All Except... enhancement suggestion

Could you add, under the filter rule's "Trim All" option, an exception text box, so that everything is filtered except the parameters listed in the (currently non-existent) "except" box?
There are cases where I would like to trim everything except a few parameters.

Cheers

deviantart filter URL redirection trimming URL parameters

Trying to open the following link results in &oe=5A0F78B4 being trimmed even though the deviantart rule isn't set to trim anything, breaking the link. I've disabled all other rules to ensure they're not causing the problem.

http://www.deviantart.com/users/outgoing?https://scontent.ftpa1-1.fna.fbcdn.net/v/t1.0-9/19437615_10154946431942669_5896185388243732024_n.jpg?oh=f7eb69d10ee9217944c18955d3a631ad&oe=5A0F78B4

Firefox 54 on Linux Mint 17.1

Redirect does not work on addons.mozilla.org

Expected behavior

The redirect rule redirects from https://addons.mozilla.org/en-US/firefox/addon/requestcontrol/ to https://addons.mozilla.org/ru/firefox/addon/requestcontrol/.

Actual behavior

The redirect rule does not redirect from https://addons.mozilla.org/en-US/firefox/addon/requestcontrol/ to https://addons.mozilla.org/ru/firefox/addon/requestcontrol/.

Steps to reproduce the problem

  • Install Request Control.
  • Set Russian locale, Russian language.
  • Create redirect rule:
    ** Pattern:
    *** scheme: http/https
    *** host: addons.mozilla.org
    *** path: *
    ** Types: Document, Sub document, Image, Object, XMLHttpRequest, XML DTD, Font, Media, Imageset, Other
    ** Action: Redirect
    ** Redirect To: {origin}{pathname/^(\/)(?!(?:ru(?:-RU)??|firefox|addon|user)(?=$|[\/?&#]))[^\/?&#]+?((?=$|[\/?&#]).*?)$/$1ru$2}{search}{hash}
  • Open page https://addons.mozilla.org/en-US/firefox/addon/requestcontrol/
    The add-on indicates that it redirected, but no redirect happens.

On other domains it works.
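
The substitution itself is valid JavaScript regex and can be verified outside the extension; the AMO-specific failure is likely because Firefox does not dispatch webRequest events for a small set of restricted domains, addons.mozilla.org among them (an explanation inferred from Firefox's behavior, not confirmed in this report):

```javascript
// The pathname substitution from the rule, applied directly in JS:
const re = /^(\/)(?!(?:ru(?:-RU)??|firefox|addon|user)(?=$|[\/?&#]))[^\/?&#]+?((?=$|[\/?&#]).*?)$/;
const out = "/en-US/firefox/addon/requestcontrol/".replace(re, "$1ru$2");
console.log(out); // "/ru/firefox/addon/requestcontrol/"
```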

Filter breaks eBay view feedback pages

URLSearchParams.toString() causes http://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback&<other params> to be transformed into http://feedback.ebay.co.uk/ws/eBayISAPI.dll?ViewFeedback=&<other params>, which eBay doesn't understand and redirects to an error page. Potentially other parts of eBay are also broken, but this is the first instance I encountered.

I can't work around this with a redirect rule because redirects are applied before filters, so I'm stuck with whitelisting it (since the filter causing it is for <all_urls>). I'm not sure if there would be any side-effects to either processing redirects after filters or simply URLSearchParams.toString().replace(/=&/g, '&'), or if there's a better solution.
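
The behavior is easy to reproduce with the standard URLSearchParams API: the serializer always emits name=value pairs, so a bare, valueless parameter gains a trailing "=":

```javascript
// eBay's ViewFeedback flag has no value; reserialization appends "=".
const query = "ViewFeedback&userid=example";
console.log(new URLSearchParams(query).toString());
// "ViewFeedback=&userid=example"
```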

URL redirection not working properly with Gmail and Youtube?

Expected behavior

When clicking on a youtube link from Gmail (from an upload notification) the youtube page should load properly

Actual behavior

When clicking on a youtube link from Gmail (from an upload notification) you end up with a 404 Not Found

Steps to reproduce the problem

  • Go to Gmail
  • Click on an upload notification mail and open the video from the link
  • You get a 404 Not Found error

This seems to be caused by URLs not being decoded properly? The target URL contains text with %3D (the "=" sign), and it seems to be double-escaping characters, e.g. %253F (an escaped "?" sign).

Can someone confirm this issue for me? It just might be on my end. Disabling the Google filter solves the issue for me, though.

Additional info

https://www.google.com/url?hl=en-GB&q=https://github.com/tumpio/requestcontrol/issues/39%23issuecomment-320343846&source=gmail

Even this link gave issues; it should redirect to the comment from @Atavic.

Expected URL
https://github.com/tumpio/requestcontrol/issues/39#issuecomment-320343846
Actual URL
https://github.com/tumpio/requestcontrol/issues/39%23issuecomment-320343846
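
What's happening, as far as the URL encoding goes: the q parameter of Google's redirect holds a percent-encoded target URL, so it must be decoded exactly once before navigating; skipping the decode leaves %23 where the # should be:

```javascript
// Decoding the embedded target once restores the fragment separator.
const embedded = "https://github.com/tumpio/requestcontrol/issues/39%23issuecomment-320343846";
console.log(decodeURIComponent(embedded));
// "https://github.com/tumpio/requestcontrol/issues/39#issuecomment-320343846"
```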

More options for URL filtering/cleaning

Hello.
I was looking for Clean Links replacement and found this great extension. But I think it can be improved in terms of link cleaning. A few suggestions:

  • Global URL parameter filtering (for any URL, even without rules). At the moment parameters are only removed from requests that match a "Filter" rule. I tried to add a global filter rule with Pattern: Any URL, but it doesn't work;
  • Two global whitelists for hosts and paths or even better -> see the next paragraph;
  • Ability to use multiple hosts and paths in rules. And instead of 'Include subdomains' checkbox, just use wildcard *. Often redirection/affiliate links use identical schemes (?r=, ?url=), so creating separate rules for each host is excessive. Also one could create a whitelist rule with multiple hosts - no need for a separate global whitelist;
  • Wildcard support for URL parameters (like utm_*, ref_*). Not really necessary though.

Here is a concept: https://dl.dropboxusercontent.com/s/h131h58h6aoiqc4/requestcontrol2.png

Also a question: which regex flavor is used in this extension? JavaScript? I'm trying to create a redirection rule to get direct links to Dropbox, but I can't get it to work. Here is the rule: {href/.*www\.dropbox\.com(\/s\/.+\/.+)\?dl=\d/https://dl.dropboxusercontent.com$1}
Test link: https://www.dropbox.com/s/h131h58h6aoiqc4/requestcontrol2.png?dl=0
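
To the regex-flavor question: the extension runs in the browser, so patterns are JavaScript RegExp. The substitution above can be checked directly in plain JS, where it does produce the expected direct link, suggesting the problem lies in how the rule expression is parsed rather than in the regex itself:

```javascript
const link = "https://www.dropbox.com/s/h131h58h6aoiqc4/requestcontrol2.png?dl=0";
const direct = link.replace(
  /.*www\.dropbox\.com(\/s\/.+\/.+)\?dl=\d/,
  "https://dl.dropboxusercontent.com$1"
);
console.log(direct); // https://dl.dropboxusercontent.com/s/h131h58h6aoiqc4/requestcontrol2.png
```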

--
To do list for these:

  • Fix url parameter filtering for global rules.
  • Fix redirection url parsing from a query string (remove trailing query parameters).
  • Change global url parameter trimming to Filter rule specific.
  • Add a button to toggle redirection cleaning for "Filter" action.
  • Add wildcard support for URL parameters (like utm_*, ref_*).
  • Use only * instead of 'Include subdomains' checkbox.
  • Add support for including multiple match patterns for rules.

Note the rule priority issue.

"Cache" problem or?

To reproduce:
Create rules no.1 and no.3 from this issue: #22

Now visit https://www.kickstarter.com/, scroll a bit down, and you will see images of projects, where rule no. 3 is whitelisting them from rule no. 1. That's OK.

Now disable rule no. 3 and reload the page; you will see that those images are gone/blocked. That's OK, since rule no. 1 is preventing them from showing and the whitelisting rule no. 3 is disabled.

Now re-enable whitelisting rule no. 3, reload the page again, and see the result. RC shows that those images are whitelisted, but some are missing, and this persists whatever you do until Firefox restarts.

Oh god, I hope this makes some sense. If it's not clear what I mean, let me know. English is not my mother language.

[Feature] [Request] Comment or Description in Rules

This might be a redundant feature or not something that will be added, but I will give it a try.

Currently my rules list is quite small, so it is still easy to remember what each rule does and its purpose. But once it gets longer (or once I start using other people's rules via import/export), it will be hard to know the purpose of each rule.

So could a Comments field, or more specifically a Description/Name field, be added to accommodate this?

Thank you.

Non-standard URLs with multiple question marks are not processed

When deleting parameters from a URL, non-standard URLs (with multiple question marks) should also be processed.

Expected behavior

The filter rule redirects from http://example.test/path?parameter&utm_source&key=value?utm_medium=abc&parameter&utm_term?key=value&utm_medium=abc#utm_content=utm_campaign?utm_reader=utm_place&utm_reader=utm_place to http://example.test/path?parameter&key=value&parameter?key=value#utm_content=utm_campaign?utm_reader=utm_place&utm_reader=utm_place.

Actual behavior

The filter rule redirects from http://example.test/path?parameter&utm_source&key=value?utm_medium=abc&parameter&utm_term?key=value&utm_medium=abc#utm_content=utm_campaign?utm_reader=utm_place&utm_reader=utm_place to http://example.test/path?parameter&key=value?utm_medium1=abc&parameter&utm_term1?key=value#utm_content1=utm_campaign1?utm_reader1=utm_place1&utm_reader1=utm_place1.

Steps to reproduce the problem

  • Install Request Control.
  • Create filter rule:
    ** Pattern: Any URL
    ** Types: Any type
    ** Action: Filter
    ** Filter URL Redirection: Off
    ** Trim URL Parameters: utm_source, utm_medium, utm_term, utm_content, utm_campaign, utm_reader, utm_place
  • Open page http://example.test/path?parameter&utm_source&key=value?utm_medium=abc&parameter&utm_term?key=value&utm_medium=abc#utm_content=utm_campaign?utm_reader=utm_place&utm_reader=utm_place

On the Internet, there are pages with links formed this way.

In rules in regular expressions, the repetition construct {n,m} is not supported

Expected behavior

The redirect rule redirects from https://example.com/?1234 to https://example.com/.

Actual behavior

The redirect rule redirects from https://example.com/?1234 to https://example.com/?1234?(?=$|[?#])|([?&])\d{4,}?[?&])/$1}, then to https://example.com/?1234?(?=$|[??(?=$|[?#])|([?&])\d{4,}?[?&])/$1}#])|([?&])\d{4,}?[?&])/$1}, then to https://example.com/?1234?(?=$|[??(?=$|[??(?=$|[?#])|([?&])\d{4,}?[?&])/$1}#])|([?&])\d{4,}?[?&])/$1}#])|([?&])\d{4,}?[?&])/$1}…

Steps to reproduce the problem

  • Install Request Control.
  • Create redirect rule:
    ** Pattern:
    *** scheme: http/https
    *** host: example.com
    *** path: *
    ** Types: Document
    ** Action: Redirect
    ** Redirect To: {origin}{pathname}{search/(?:[?&]\d{4,}?(?=$|[?#])|([?&])\d{4,}?[?&])/$1}{hash}
  • Open page https://example.com/?1234
    The add-on redirects to the wrong page and keeps redirecting until stopped by the user.
    Escaping } does not work. Only rewriting the expression without {n,m} helps, but the expression becomes large and complex.
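
Until that is fixed, the only workaround noted above is to expand the quantifier by hand; the rewritten pattern is equivalent, which can be checked in plain JavaScript:

```javascript
// \d{4,} expanded without curly braces: four digits, then any more digits.
const withBraces = /^\d{4,}$/;
const expanded = /^\d\d\d\d\d*$/;

for (const s of ["123", "1234", "123456"]) {
  console.log(s, withBraces.test(s), expanded.test(s));
}
// "123" fails both; "1234" and "123456" match both
```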

FR: provide a way to reference the text captured by the * in the pattern

Feature request:
Please, provide a way to reference the text from the source URL, captured by the * in the pattern.

Say, I want to redirect this URL:

https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fvid.me%2Fe%2FiVeR9%3Fcard%3D1&amp;url=https%3A%2F%2Fvid.me%2FiVeR9&amp;image=https%3A%2F%2Fd1wst0behutosd.cloudfront.net%2Fvideos%2F16391778%2Fthumb.jpg%3Fv2r1500463663&amp;key=2aa3c4d5f3de4f5b9120b660ad850dc9&amp;type=text%2Fhtml&amp;schema=vid

into this:

https://vid.me/embedded/iVeR9?autoplay=1

How would I do that?

As you can see, the important string (FiVeR9) is only a PART (not the whole value) of the param 'src' (or 'url') in the query of the initial URL.
So I can't reference the param 'src', as it would also return the part I don't need.

If your add-on would save the text that got captured by the * in the pattern and users could reference it - that problem could be solved.
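
For reference, the extraction itself is straightforward with the URL API; the code below is only an illustration in plain JavaScript of what such a capture feature would compute, not an existing Request Control feature (the embedly URL is shortened to its src parameter):

```javascript
// Pull the vid.me id out of embedly's "src" parameter.
const page = new URL(
  "https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fvid.me%2Fe%2FiVeR9%3Fcard%3D1"
);
const src = new URL(page.searchParams.get("src")); // get() decodes the value
const id = src.pathname.split("/").pop();          // "iVeR9"
const target = `https://vid.me/embedded/${id}?autoplay=1`;
console.log(target); // https://vid.me/embedded/iVeR9?autoplay=1
```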

Privacy Policy

Hey @tumpio : )

One thing I am in the process of doing with the ghacks user.js wiki page for extensions is to add Privacy Policy links (like below). If an extension has one that respects privacy, awesome; if not, no such "badge" (and if it has one that stinks, it will not be recommended).

Would you consider creating a simple Privacy Policy wiki page or root md file? Cheers

Edit: or you can just add it to AMO (not sure how you apply that per release)

Options cannot be accessed in Fennec (Firefox mobile)

Reported by U64, March 24, 2017

The missing star is because of missing functionality in Fennec; you can't access the rules and such (but they do work, along with the cleaning option). Perhaps adding a button that lets us open the extension's settings (again, like uBlock Origin) is in order?

More information in location-bar icon

The Request Control icon appears in the location bar when a request has been filtered.† Clicking on it shows a popup with the same message as the tooltip, "Request was filtered." It'd be neat if the popup had some more detail. Specifically, it'd be neat to see what the original URL was, and the actual URL that was sent.

โบ So far, filtering is the only action I've experimented with, but I'm sure other actions could show useful content in the popup bubble too.

It would also be cool if clicking in the popup would open a new tab to the Request Control Rules page, ideally with the triggered filter pre-opened. That would be helpful for troubleshooting if a filter isn't working quite right.

only works per-tab?

from the description...

If the matched request URL contains another URL, the request is cancelled and the tab where the request was made is navigated to the contained URL.

If I understand this correctly, it seems that if, for example, you middle-click a Google search result to open the link in a new tab, RC won't clean the redirect - is this correct?

Regex repetition quantifier not supported in pattern captures

Regex repetition quantifiers {n[,n]} create unexpected results.

This doesn't work:
{pathname/l\.([a-zA-Z]{3,4})$/.$1}

While this works as expected:
{pathname/l\.([a-zA-Z][a-zA-Z][a-zA-Z][a-zA-Z]?)$/.$1}

Steps to reproduce the problem

  • Create a new rule with the following properties:
    Pattern: http/https :// i.imgur.com / *l.*
    Types: Any type
    Action: Redirect
    Redirect to: {origin}{pathname/l\.([a-zA-Z]{3,4})$/.$1}{search}{hash}

Example url:
https://i.imgur.com/cijC2a2l.jpg

Expected behavior

Redirect to:
https://i.imgur.com/cijC2a2.jpg

Actual behavior

Redirect (infinite) to:
https://i.imgur.com/cijC2a2l.jpg)$/.$1%7D
https://i.imgur.com/cijC2a2l.jpg)$/.$1%7D)$/.$1%7D
https://i.imgur.com/cijC2a2l.jpg)$/.$1%7D)$/.$1%7D)$/.$1%7D
...
and so on

(the infinite redirects are not part of this issue)
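
The regex itself works fine in JavaScript; what appears to break (an inference from the symptoms, not from the extension's source) is the capture-expression tokenizer: a naive scan that ends a {...} expression at the first "}" cuts the pattern short inside the {3,4} quantifier:

```javascript
// The pattern works as a plain JS replace:
console.log("cijC2a2l.jpg".replace(/l\.([a-zA-Z]{3,4})$/, ".$1")); // "cijC2a2.jpg"

// But a naive tokenizer that ends a {...} expression at the first "}"
// truncates the expression inside the quantifier:
const expr = "{pathname/l\\.([a-zA-Z]{3,4})$/.$1}";
console.log(expr.slice(1).split("}")[0]); // "pathname/l\.([a-zA-Z]{3,4"
```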

Not all whitelists are processed

Create the following rules... order of creation is important:
1.
Pattern: Any
Types: Image
Action: Filter
Filter URL: Off
Trim URL Parameters: Trim all

2.
Pattern scheme: http/https
Pattern host: *
Pattern path: test*,xxxxx* (actually just fill in something)
Types: Any type
Action: Whitelist

3.
Pattern scheme: http/https
Pattern host: ksr-ugc.imgix.net
Pattern path: *
Types: Image
Action: Whitelist

Now visit https://www.kickstarter.com/, scroll a bit down and you will see missing images, which were junked by rule 1.
But, since we have rule 3, those images should have been whitelisted.
If we disable rule 2, then rule 3 kicks in and the images are shown as they should be (OK, some are not; I believe because of cache or something... but that should be covered in another topic... if Firefox is restarted, then it works).
So I assume that when whitelist rule 2 is processed (whether it matches or not), whitelist rule 3 is not processed, but it should have been.

NOTE: I am using RC 1.7.0beta3
