tholian-network / stealth
:rocket: Stealth - Secure, Peer-to-Peer, Private and Automateable Web Browser/Scraper/Proxy
License: GNU General Public License v3.0
In order to get Stealth to work on the Pinephone, it's best to start with a PKGBUILD file for Stealth that can be used by Arch Linux users. This requires a `package/` folder and a `make.mjs`; running `make.mjs archlinux` should create the final `.tar.xz` package.

The SOCKS Protocol currently only supports a client-side connection. A really nice feature addition would be a `handle_socks` method in the Server that could allow usage via SOCKS proxy in the internal network - so that other Browsers could reuse Stealth as a SOCKS Proxy.
Currently, the featureset of the SOCKS protocol is a bit unclear. Authentication is pretty much out of the question (as SOCKS is unencrypted), therefore the Peers settings have to be respected: if a Peer is allowed to connect to the local machine, the same setting decides whether or not a SOCKS client is allowed to connect to it.
It might make sense to implement a `SOCKS.upgrade()` method to upgrade raw connections, so that handshakes are abstracted away in a similar manner as they are for `WS(S).upgrade()`.
As the SOCKS client has some buffers that need to be exchanged (a varying amount of data is transmitted depending on network flow), it might make sense to also abstract everything away using `once('data')` event listeners.
The SOCKS Protocol should use the same error format as the HTTP Protocol. If an error happens because of a misbehaving SOCKS client, use `socket-trust` or `socket-stability` as the error cause.
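As a hedged sketch of that convention, a SOCKS5 reply code (RFC 1928, byte 1 of the server reply) could be mapped onto the HTTP-style error format. The `toError()` helper and the reply-to-cause table below are assumptions for illustration; only the `socket-trust`/`socket-stability` cause names come from this issue:

```javascript
// Hypothetical mapping of SOCKS5 reply codes to error causes.
const SOCKS5_REPLY_CAUSES = {
	0x01: 'socket-stability', // general SOCKS server failure
	0x02: 'socket-trust',     // connection not allowed by ruleset
	0x03: 'socket-stability', // network unreachable
	0x04: 'socket-stability', // host unreachable
	0x05: 'socket-stability', // connection refused
	0x06: 'socket-stability', // TTL expired
	0x07: 'socket-trust',     // command not supported
	0x08: 'socket-trust'      // address type not supported
};

const toError = (reply) => {

	if (reply === 0x00) {
		return null; // 0x00 means the request succeeded
	}

	return {
		type:  'error',
		cause: SOCKS5_REPLY_CAUSES[reply] || 'socket-stability'
	};

};
```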
The Browser UI requires a Site Sidebar that features all controls that are relevant for the current structure of the website, which means it requires the HTML Optimizer and CSS Optimizer to be integrated first.
Afterwards the `Session/info()` service method should return the given resources of the current file. It might be useful to have this on-disk after each `cache.write()` call, as the files will not change once they are in the on-disk cache. If they are refreshed in the future, the `cache.remove()` call should also remove the meta information of the files.
- Implement the `inspect()` method; not sure whether it makes more sense in the `Optimizer` or the `Parser` for each file format. This has to be delegated on a per-URL basis, which means it has to be placed into a server-side top-level `Service` method.
- Make sure the `inspect/` folder changes once a `cache.write()` call is done.
- Integrate the `cache.remove()` method with the removal of the `meta/` url.

The new recursive-descent Parser is now implemented and somewhat stable in its featureset. In order to implement the first language, the CSS Parser has to be written from scratch in order to have a real AST that's not limited by a line-by-line parsing approach.
The problem with CSS's specifications is that they are very unclear with regard to syntax, as they heavily rely on a generic "non-ASCII ident token" which can be pretty much anything, including the 💩 emoji.
This issue tries to keep track of the relevant CSS specifications and their implications (to have a feature list of things that are not yet implemented).
- `@keyframes`
- `animation` properties

Reasons not to implement Media Queries Level 4 and 5: expressions inside the media feature(s) can be endlessly chained, which is very likely malicious to some extent. Therefore `>=`, `>`, `<`, `<=` as operators are not being implemented, and only the `and` and `or` syntax will be supported.
- `@media`
- `@supports`
- `and`, `or` and `only`. `only` as a keyword has no effect, and is only there to make legacy Browsers ignore the media query.
- An `@media` or `@supports` rule can contain both `style-rule` and `at-rule` entries inside them.

Values and Units Module Level 4
Numeric Data Types:
- `integer` data type
- `number` data type
- `dimension` values
- `percentage` values
- `ratio` data type

Length Data Type
Other Quantities

Other Data Types:
- `color` data type
- `image` data type
- `position` data type (2d positioning)

Functions:
- `attr()` function
- `@page` rule

The Browser UI Refactor that started with 40e9efc has the ultimate goal of having Widgets that can be instantiated on a per-data-structure basis. Each Widget represents the complete data structure via its `this.model = {}` property, whereas values and writable states are tracked in the `update` event of the Widget's element.
The goal behind this is to have no shared-across-iframe helpers that pollute the global scope and make interaction between the iframe and the window's parent a nightmare.
- Implement `browser/design/card/Host`
- Implement `browser/design/card/Mode`
- Implement `browser/design/card/Beacon`
- Implement `browser/design/card/Peer`
- Implement `browser/design/card/Redirect`
- Implement `browser/design/card/Session`
- Implement `browser/design/card/Tab`
- Implement `browser/design/card/Settings`
- Reimplement `stealth:welcome` Page
- Reimplement `stealth:fix-host` Page
- Reimplement `stealth:fix-mode` Page
- Reimplement `stealth:settings` Page
The Browser needs an overview of Sessions and their Requests (done by each Session).
Currently the cache and stash both have a local file path that's based on the URL's domain/host and path alone, which leads to conflicts when APIs (like a `/search.json`) or websites (like a `/search/results.html`) return content based on a URL's GET parameters.
In order to improve this, the Cache and Stash service both need integration for the URL's parsed query string.
Policies are strictly optional. If a Policy is available, it specifically whitelists all allowed query parameters. If no Policy is available, all query parameters are allowed. This should be respected in the Cache and Stash services.
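A minimal sketch of that whitelisting rule, assuming a policy's `query` property is an `&`-separated whitelist of parameter keys; the `filterQuery()` helper and that format are assumptions, not the actual Cache/Stash implementation:

```javascript
// Hypothetical sketch: with a policy, keep only whitelisted query
// parameters; without one, keep all of them.
const filterQuery = (query, policy) => {

	if (policy === null || policy === undefined) {
		return query; // no policy available: all query parameters are allowed
	}

	let allowed  = policy.query.split('&'); // assumed format, e.g. 'q&page'
	let filtered = {};

	for (let key of Object.keys(query)) {
		if (allowed.includes(key)) {
			filtered[key] = query[key];
		}
	}

	return filtered;

};
```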
- `Request` API

The Policy Service data structure has been changed. The UI Card needs fixes to reflect that.
Now, a single policy entry consists of `{domain, policies: [{path, query}]}`, which means that the UI Card has to integrate a custom `value()` method that dynamically changes the `article` element's contents.
- `value(value)` method
- `value()` method
- On `save` action, show a footer that allows adding an entry to the `policies[]` Array.

The DNSS Protocol needs to be implemented to support DNS over TLS via TCP. As the underlying architectures behind node.js' `net.Socket`, `dgram.Socket` and `tls.TLSSocket` differ heavily in their API usage, DNSS will probably include some redundancies, as the code from `DNS.receive()/send()/upgrade()` cannot be reused because DNS defaults to the UDP protocol.
Technically, as of RFC 7766, recursive DNS resolvers must support DNS via TCP.
Real-world analysis, however, has shown that all Web Browsers (old Opera, Chromium-based Browsers, Firefox-based Browsers, and Microsoft Edge) use only UDP to request details about a domain. If there are multiple questions asked about a single domain, they are all split into separate queries with incrementing identifiers, which kind of defeats the purpose.
The intent for the legacy (unencrypted) DNS protocol here is to emulate other Browsers' behaviour in order not to be identifiable as a Stealth Browser. As of today, not a single Browser supports DTLS sockets for its DNS requests.
However, due to the very buggy implementations that led to the DNS Connection refactor (caused by the lack of standardized, sane support for DNS over HTTPS), the DNSS Connection won't use DTLS and will instead go for DNS over TLS via TCP connections, which all tested DNS over TLS servers seem to support.
The internal `stealth:fix-request` Page requires the `Peer.proxy()` API to be working as expected and needs further integration with the `Cache.info()` API.
Download Assistant
- `Settings.query({peers:true})` request, then `Peer.proxy()` for the `cache.info()` service.
- If the `info` result is not `null`, render a table row for the peer with the Download button.
- Do a `cache.read()` service call via `Peer.proxy()` and fill the cache via `Cache.save()`.
- Make the `Cache.save()` method public for a peer with the remote address `127.x.x.x` or hostname `localhost`.

Currently, the debugging mode loses the content of `console.log()` calls. If a test fails and is being implemented or debugged, it is very hard to interact with the console API, as the Renderer's `render()` loop always overrides all contents of the terminal and uses `console.clear()` for its animations.
So, the `--debug=true` flag should implement a custom partial rendering mode that only renders the differences from the last review and last test to the current review and current test.
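A hypothetical sketch of what such a partial renderer could compute first: given the previous and current states of all reviews, return only the entries whose state changed, so only those lines need re-rendering. The `diffStates()` helper and its shape are assumptions, not the actual `render_partial()` signature:

```javascript
// Hypothetical diff helper: compare previous and current review states
// and collect only the indices that actually changed.
const diffStates = (prev, curr) => {

	let changed = [];

	for (let r = 0, rl = Math.max(prev.length, curr.length); r < rl; r++) {

		if (prev[r] !== curr[r]) {
			changed.push({
				index: r,
				from:  prev[r] !== undefined ? prev[r] : null,
				to:    curr[r] !== undefined ? curr[r] : null
			});
		}

	}

	return changed;

};
```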
- `render_partial(reviews, prev_state, curr_state)` method in the Renderer.
- Pass `console` as second parameter for each test instance, instead of the old `debug()` method which was kind of pointless.
- Track the `state` of the Review instance, whereas `null` represents that it wasn't executed and `wait` represents that it was executed but timed out.

The `request/*.mjs` Modules need reviews:
- `Blocker.check`
- `Downloader.check`
- `Downloader.download`
- `Optimizer.check`
- `Optimizer.optimize`
Additionally, the following Core Implementations need reviews, too:
User wants to understand functionality of buttons without reading a manual.
Rollover could trigger display of an alt info or info in the status bar.
Stealth also requires an HTML parser that should rewrite the contents based on the current `config` (aka Site Mode).
- `<script>` elements
- `<style>` via CSS Parser
- `<link rel="stylesheet">` via CSS Parser
- `<img>` based on `mode.image`
- `<audio>` based on `mode.audio`
- `<video>` based on `mode.video`
- `<object>` elements
- `:<tab id>,webview:` flags - otherwise the tab history will be messed up.

The `covert` cli command needs support for the `watch` action, which should observe filesystem changes using `fs.watchFile()` and then update and show the differences between the last run and the current run.
In order to present this nicely, something like a diff view (if the process.stdout.rows allow it) would be nice, whereas the left side should represent the old state and the right side should present the new state.
- `watch` action in `covert.mjs`.
- `Filesystem` abstraction in `covert/source/Filesystem.mjs`.
- `settings.render == 'watch'` type in the Renderer.

The Beacon Service returns all the beacons for each URL that matches the request. In order to remove complexity and duplication of algorithms, the idea behind the `payload` in the service is that the `path` attribute can actually contain a wildcard selector `*` in order to reflect multiple matching patterns.
UI-wise and Service-wise (while `save()` is called), it has to be made sure that only one `*` is contained in the `path` attribute.
{
"domain": "sub.domain.tld",
"path": "/news/world/articles*",
"beacons": [{
"label": "headline",
"select": [ "#article h1" ],
"mode": {
"text": true,
"image": false,
"audio": false,
"video": false,
"other": false
}
}, {
"label": "article",
"select": [ "#article p:nth-of-type(1)", "#article p:nth-of-type(3)" ],
"mode": {
"text": true,
"image": true,
"audio": false,
"video": false,
"other": false
}
}]
}
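The single-`*` wildcard matching implied by the `"/news/world/articles*"` path above could be sketched like this; `matchesPath()` is a hypothetical helper, not the actual Beacon Service implementation:

```javascript
// Hypothetical matcher for beacon paths that contain at most one '*'.
const matchesPath = (pattern, path) => {

	let index = pattern.indexOf('*');
	if (index === -1) {
		return pattern === path; // no wildcard: require an identical path
	}

	let prefix = pattern.substring(0, index);
	let suffix = pattern.substring(index + 1);

	// prefix and suffix must both match without overlapping
	return path.startsWith(prefix)
		&& path.endsWith(suffix)
		&& path.length >= prefix.length + suffix.length;

};
```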
Additionally, these service methods have to be created:
- `read({ domain, path })` that returns the found beacon for the identical domain and path attribute.
- `query({ domain, path })` that returns all matching beacons (an Array of beacons) for both wildcard-containing domains and paths, similar to how the Session service handles this.
- `save({ domain, path, payload: { beacons } })`
- `remove({ domain, path })` that removes the beacon for the identical domain and path attribute.

The Address Widget doesn't show the correct URL when the iframe load leads to a redirect.
In order to fix this, the Webview Backdrop must include a method that will re-set (override?) the Tab's URL and change the history accordingly. Currently, inside the `iframe`-based UI, it's not possible to detect a redirect otherwise.
On the server-side it's implemented correctly already, but this is a necessary hack that cannot be avoided.
In the `on('load', () => {})` callback:
- `this.window.location.href` must be parsed (as if it were an `src` attribute).
- Compare it against the Tab's `this.url` attribute and modify the Tab's history without triggering a `refresh` event on the Browser.

The `Browser` needs a better API to be scriptable in node.js. A `browser.download(url)` should abstract the setup of a client and do the request via the `Session/request()` service method.
- Implement `browser.download(url)` in `Browser.mjs`.
- Implement the `Session/download()` method.

The URL parser isn't failsafe at this point in time.
Similar to the upcoming HTML and CSS Parser, the URL Parser will need a `filter()` method that allows filtering out malicious parts of the URL, which means that the `query` parameter needs to be re-rendered.
At this point, when Stealth visits and requests a tracking-intensive website, these websites tend to embed their tracking features inside the URL parameters, directly inside the `href="..."` values. For example, Google will redirect to a static page with those rewritten server-side rendered URLs if you did not request a javascript at a later time.
I think the most failsafe way to do this is to introduce a `policies.json` file that describes URL patterns in a domain-specific way (similar to beacons and echoes), but I do not think it would make much sense to implement this in a Browser Page, as the concept of URL patterns (and seeing what is a tracking parameter versus what is e.g. a content hash) is too complex for the average user.
So it might make sense to apply a similar strategy as with the `blockers.json` file here, and generate the required metadata in this repository and ship it as a vendor profile (with pulled updates later).
- Implement the `URL.filter(url, policy)` method.
- Describe a format for `policies.json` that can reflect URL patterns, with `*foo`, `foo*` and `foo*bar` syntax.
- Allow incremental patterns and default overrides by implementing a sorting algorithm for policies, whereas the lowest matching name (e.g. `startsWith('/')`) has the lowest ranking.
- Implement a `Policies` Service that returns these on a domain-specific basis, similar to the `Blockers` Service.
- The `Policies` Settings are read-only, just as the `Blockers` Settings are, and they are incrementally loaded from the Vendor Profile for now (until an end-to-end UI/UX strategy is available).
- Wildcards (`*`) describe the startsWith/endsWith/includes-based pattern matching for the query property's `key` or the path's string.
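The `*foo`, `foo*` and `foo*bar` semantics described above could be sketched as follows; `matchesPattern()` is a hypothetical illustration of the proposed format, not the actual `URL.filter()` implementation:

```javascript
// Hypothetical pattern matcher for the proposed policies.json syntax:
// '*foo' -> endsWith, 'foo*' -> startsWith, 'foo*bar' -> prefix and suffix.
const matchesPattern = (pattern, value) => {

	if (pattern.startsWith('*')) {
		return value.endsWith(pattern.slice(1));
	} else if (pattern.endsWith('*')) {
		return value.startsWith(pattern.slice(0, -1));
	} else if (pattern.includes('*')) {
		let [ prefix, suffix ] = pattern.split('*');
		return value.startsWith(prefix) && value.endsWith(suffix);
	}

	return pattern === value;

};
```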
Stealth crashes after entering any URL or search query. Native URLs (stealth:
) work.
I tested this with Firefox 68 on Windows and Chrome on Android. Stealth was started in a Docker container.
Here are the logs:
(L) Stealth Service Command-Line Arguments:
(L)
{
"profile": "/profile",
"root": "/browser",
"debug": false
}
(I) Stealth Service started on http+ws://localhost:65432.
(I) Stealth Defaults loaded.
(L) > 379955 blockers, 0 filters, 0 hosts, 0 modes, 0 peers, 0 redirects.
(I) Stealth Profile loaded from "/profile".
(L) > 0 blockers, 0 filters, 0 hosts, 0 modes, 0 peers, 0 redirects.
file:///browser/stealth/source/Server.mjs:68
let session = this.stealth.init(null, request.headers);
^
ReferenceError: request is not defined
at Server.handle_request (file:///browser/stealth/source/Server.mjs:68:40)
at HTTP.receive (file:///browser/stealth/source/Server.mjs:342:22)
at Object.receive (file:///browser/stealth/source/protocol/HTTP.mjs:593:5)
at Socket.socket.once (file:///browser/stealth/source/Server.mjs:329:10)
at Object.onceWrapper (events.js:281:20)
at Socket.emit (events.js:193:13)
at addChunk (_stream_readable.js:295:12)
at readableAddChunk (_stream_readable.js:276:11)
at Socket.Readable.push (_stream_readable.js:231:10)
at TCP.onStreamRead (internal/stream_base_commons.js:154:17)
The current implementation of the DNS Protocol only supports fixed DNS over HTTPS requests. That was a good start, but in order to support more DNS servers out there (including DNS via TLS), it's necessary to implement an encoder and decoder inside the DNS Protocol that can be reused in an encrypted manner.
In order to have the same conventions, DNS via TLS will be named DNSS (as it's TCP-based anyway), and DNS itself will be UDP-based. The Multicast DNS Protocol will be named MDNS and will implement the service-based discovery aspects that allow finding local peers in the same NAT automatically.
This will further allow using and observing DNS exfiltration techniques in the local network in order to break out of NATs that block HTTPS requests.
In the next iteration, the `stealth/Server` will also handle DNS requests in order to use DNS as a network protocol to connect to the service and handle API calls (when `SRV` is used) and host requests (when `A`, `AAAA` etc. are being requested). How the Multicast DNS structure of the protocol will look is currently unclear and needs further research; though it might make sense to use the same structure as AirPrint, AirDrop etc. for the sake of conventions and future compatibility, so the announced service will probably be `stealth._tcp._local` and `stealth._udp._local`.
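The encoder/decoder mentioned above has to produce the DNS wire format; its core is the name encoding from RFC 1035 section 3.1, where each label is prefixed with its length and the name is terminated by a zero byte. A minimal sketch (the `encodeName()` helper is an assumption, not the actual protocol/DNS API):

```javascript
// Hypothetical RFC 1035 QNAME encoder: '<len>label<len>label...0x00'.
const encodeName = (domain) => {

	let bytes = [];

	for (let label of domain.split('.')) {

		if (label.length === 0 || label.length > 63) {
			return null; // each label must be 1-63 bytes long
		}

		bytes.push(label.length);

		for (let c = 0; c < label.length; c++) {
			bytes.push(label.charCodeAt(c));
		}

	}

	bytes.push(0x00); // the root label terminates the name

	return Buffer.from(bytes);

};
```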
protocol/DNS:
- Implement `upgrade()` to be able to host a custom DNS server.

protocol/DNSS:
- This is TODO when DNS has been implemented, and will use the DNS Protocol implementation similar to how HTTPS reuses the HTTP Connection.

DNS via HTTPS implementation:
- This implementation will be moved to the `server/Host` service, and the `DNS.resolve()` calls will be ported into a helper method as `resolve_host()`.
The console implementation is currently inside `stealth/source/console.mjs` and `browser/source/console.mjs`, whereas it should be contained in the base library as a polyfill for both node.js and the browser.
- Move `stealth/source/console.mjs` to `base/source/node/console.mjs`.
- Move `browser/source/console.mjs` to `base/source/browser/console.mjs`.
- Integrate `console.mjs` into `base/bin/base.sh`.
- Point `stealth/source` to the new `BASE.mjs` path.
- Point `browser/source` to the new `BASE.mjs` path.

The Multicast DNS Protocol should transparently implement DNS-based Service Discovery (DNS-SD.org) and use the `SRV` and `PTR` queries/responses workflow.
- `_stealth._wss.tholian.network` to connect to the Radar service to discover global peers.
- `_stealth._wss.tholian.local` to the Multicast DNS address (`224.0.0.1` or `ff02::fb`, respectively).
- If `wss` fails due to lack of a TLS certificate or socket-trust issues, fall back to `ws` and mark the connection as untrusted - which will require manual confirmation by the user.

The Echoes Service represents user-recorded actions and interactions with the Browser UI and Browser Websites. In order to reflect macro-like automation, this service has to implement specific actions that can be triggered either in order or after a given timeout or other interaction.
How this data structure will finally look is still a draft, but it will have a similar pattern-matching structure with wildcard-allowed domain and path attributes.
I have some suggestions for the Web App Manifest (currently `manifest.json`):
It should be renamed to `site.webmanifest`.
Although it is not required, the W3C recommends that the manifest use the `application/manifest+json` media type and the `.webmanifest` extension.
It should use `"display": "standalone"`.
"The application will look and feel like a standalone application. This can include the application having a different window, its own icon in the application launcher, etc. In this mode, the user agent will exclude UI elements for controlling navigation, but can include other UI elements such as a status bar." (from MDN)
Its `short_name` should probably be `Stealth` (with an uppercase first letter).
Some additional icons may be added. You can use Real Favicon Generator for this.
The old DNS Connection implementation had a centralized and easy-to-use `DNS.resolve()` method that was used by the Hosts Service. The new generic DNS transport implementations (which also focus on Multicast compatibility) have no support for a generic `resolve()` method, and therefore this method needs to be implemented in a different DNS Ronin interface.
The `stealth.server.RESOLVER` should implement the old `resolve()` method and offer a centralized, easy-to-use API that abstracts away all the Ronin's capabilities. As this DNS Ronin implementation should be a cross-protocol AND cross-server DNS Ronin, it will maintain a list of known-to-work DNS servers that support either DNS via HTTPS or DNS via TLS, and fall back to DNS via UDP only in the absolute worst case.

- Implement the `RESOLVER.resolve()` API.

The Stealth Webproxy and Webserver need support for 206 Partial Content requests, which would allow serving content from the `cache` in the profile as streams (e.g. video and other large files).
If a request contains a `bytes=start-end` Range header, serve it from the cache while preserving the correctly expected `payload.length`.
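A hedged sketch of the Range handling this implies: parse the `bytes=start-end` form (including the open-ended `bytes=start-` and suffix `bytes=-n` variants) and derive the slice bounds plus the `Content-Range` value for the 206 response. The `parseRange()` helper is an assumption, not the actual `stealth/packet/HTTP` implementation:

```javascript
// Hypothetical Range parser for serving 206 Partial Content from a
// cached payload of the given total length.
const parseRange = (value, length) => {

	let match = /^bytes=(\d*)-(\d*)$/.exec(value);
	if (match === null) {
		return null; // unsupported or malformed Range header
	}

	let start = match[1] !== '' ? parseInt(match[1], 10) : null;
	let end   = match[2] !== '' ? parseInt(match[2], 10) : null;

	if (start === null && end !== null) {
		// suffix form: 'bytes=-500' means the last 500 bytes
		start = Math.max(0, length - end);
		end   = length - 1;
	} else if (start !== null && end === null) {
		// open-ended form: 'bytes=500-' means from 500 to the end
		end = length - 1;
	}

	if (start === null || start > end || end >= length) {
		return null; // unsatisfiable range
	}

	return {
		start:  start,
		end:    end,
		length: end - start + 1,
		header: 'bytes ' + start + '-' + end + '/' + length
	};

};
```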
Verify that the `stealth/packet/HTTP` Parser integrates `packet.headers['@transfer']['range']` correctly.
Integrate 206 support into Webserver
Integrate 206 support into Webproxy
User wants to close a tab in the sidebar.
Context Menu or a little X on the side when hovering the element would do.
The CSS parser is now almost ready to be used in practice. In order to integrate it with the Browser UI, it needs the following features (that are still missing):
- Implement `attr()` support in `parse_value()`.
- Implement `calc()` support in `parse_value()`.
- Implement `hsl`/`hsla` color support in `parse_value()` and convert colors to `rgba` format.
- Implement a `CSS.isCSS()` method that verifies the CSS tree structure, similar to parser/IP's and parser/URL's behaviours.
Afterwards, the CSS parser requires `render()` integration, so that the CSS Optimizer can parse and render the filtered CSS files correctly:
- Migrate `NORMAL.mjs` to a parse/render signature for each property.
- Migrate `SHORTHAND.mjs` to a parse/render signature for each property.

When setting preferences on macOS, reloading the whole page (session?) results in your previous settings being reset.
The Browser Settings Card needs a lookup feature for `blockers`, if the `domain` matches one of the `settings.blockers`.
- Reuse the `Blocker` Card, but with read-only methods, as the private API doesn't allow creating a blocker.
- Integrate `Blocker.from()` into the Browser Settings Card.

`about` is an internal URI scheme implemented in various Web browsers to reveal internal state and built-in functions. It is an officially IANA-registered scheme, and is standardized.
Currently, Stealth only supports the `stealth` URI scheme. It would be good if it also supported the standardized `about` URI scheme. Those URIs should probably be translated into the appropriate `stealth` URIs.
A complete list of all `about` URIs is on Wikipedia.
Currently, Covert tests will fail in different network scenarios - as e.g. DNS, Request, Peer and Cache infrastructure heavily relies on external network communication.
In order to reproduce this more easily in the future, `covert` should support a `--network` flag that sets up the network connection before running the given reviews. Afterwards it should clean up all remains that are left over by `ip` and `tc`.
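One way to sketch such a flag is a table that maps scenario names to the `ip`/`tc` setup and teardown commands to spawn. The scenario names and netem parameters below are purely illustrative assumptions, not covert's actual flag values:

```javascript
// Hypothetical scenario table for a --network=<scenario> flag; each entry
// lists the shell commands to run before and after the reviews.
const NETWORK_SCENARIOS = {
	'offline': {
		setup:    [ 'ip link set dev lo down' ],
		teardown: [ 'ip link set dev lo up' ]
	},
	'slow-2g': {
		setup:    [ 'tc qdisc add dev lo root netem delay 800ms rate 50kbit' ],
		teardown: [ 'tc qdisc del dev lo root' ]
	},
	'lossy': {
		setup:    [ 'tc qdisc add dev lo root netem loss 10%' ],
		teardown: [ 'tc qdisc del dev lo root' ]
	}
};

const commandsFor = (flag) => {
	return NETWORK_SCENARIOS[flag] || null;
};
```

Keeping teardown commands next to their setup counterparts makes the "clean up all remains" step a simple reverse iteration, even if a review crashes midway.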
Stealth needs a new `stealth:tasks` Page that shows an overview of scheduled tasks that are regularly run, and a manual tasks overview. A task is a new data structure that should also be integrated as a `Task Service`, and should be able to be delegated via trusted Peers.
A task probably will consist of the following properties:
The `stealth:tasks` Page should at least contain these features:
Currently, the Browser's Client API requires a global `WebSocket` API to be available, which is only the case for the WebView/headless platform as of now.
In order to make the Browser available on the node.js side, the POLYFILLS.mjs should also include a `WebSocket` polyfill that implements the client-side WS13 protocol.
- Mirror the `WebSocket()` API usage to have a 1:1 replacement.
- Integrate it into the `base.mjs` file.
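The core of the client-side WS13 protocol mentioned above is the masking rule from RFC 6455 section 5.3: every client-to-server frame payload is XOR'ed with a 4-byte masking key. A minimal sketch (the `maskPayload()` helper is an assumption, not the actual polyfill code):

```javascript
// Hypothetical WS13 masking helper: XOR each payload byte with the
// masking key, cycling through the key's 4 bytes. Masking is its own
// inverse, so the same function also unmasks.
const maskPayload = (payload, key) => {

	let masked = Buffer.alloc(payload.length);

	for (let p = 0; p < payload.length; p++) {
		masked[p] = payload[p] ^ key[p % 4];
	}

	return masked;

};
```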
The `stealth:history` page needs to be implemented.
The idea for the `Stash` and `Cache` card widget is to offer a search functionality that displays results sorted by domain, and allows interacting with the cache (and stash) data that is stored in the Stealth Profile.
Currently, it is a bit unclear what the best way to implement this is. In order to have the same level of functionality per-domain AND per-URL, the metadata and external resources of a URL have to be stored in a database that is updated once a URL is re-downloaded or re-optimized.
So it would make sense to have a server-side service that keeps track of metadata of URLs, including all urls that are related to each other (given it is easy to implement and doesn't blow up complexity).
It might make sense here to differentiate between `Pages` and `Assets` in general. Pages are URLs that the user visited and browsed to specifically, whereas Assets are external resources that are necessary to display the Page, given the Site Mode. This would imply that a change of a Site Mode triggers a `refresh` call in the Browser UI and leads to the recursive metadata update of a Page.
In order to prevent a recursive update scenario that never ends, it might make sense to limit the dependency chain and set the first-level of the dependency graph to be a Page (aka text mode type) or a directly downloaded Asset (aka other mode type).
The Hosts Parser needs two methods:
- `/etc/hosts` file format.

The Browser's internal pages need to be migrated to use the new `ENVIRONMENT.mjs` file, which offers a cleaner way of parsing all flags and parameters. Currently, each settings or error page parses all its settings and URL parameters for itself, which is very redundant.
- `export parameters = ...`
- Migrate `browser/internal/*.html` to ENVIRONMENT.

The `stealth:fix-request` error page needs integration to download cached URLs from connected Peers - and in case no Peer is available - from the Web Archive.
- Use `https://web.archive.org/<original url>`, which will redirect the request to the latest known and scraped archive entry.
The RULE.mjs
file is created, but isn't integrated anyhow. In order to do so, the CSS.parse()
method in parser/CSS.mjs
needs to get support for proper body parsing and splitting. Currently, the parsing logic is line-based, do it instead needs to keep track of each nested {
and }
hierarchy.
The idea is that the `RULE.mjs` methods can get passed through their own `body` signature and delegate everything correctly to the NORMAL/SHORTHAND parsers. The issue here is that this currently relies heavily on the `CSS.parse()` method, and this is a broken concept that will never work out.
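The scope-based parsing this issue asks for can be sketched as a brace-depth scanner that ignores braces inside quoted strings, so declarations like `content: '}'` don't break rule splitting. The `splitRules()` helper is a hypothetical illustration, not the actual `CSS.parse()` implementation:

```javascript
// Hypothetical scope-based rule splitter: tracks nested '{'/'}' depth
// while skipping braces that occur inside quoted string values.
const splitRules = (css) => {

	let rules = [];
	let depth = 0;
	let quote = null;
	let start = 0;

	for (let c = 0; c < css.length; c++) {

		let chr = css[c];

		if (quote !== null) {
			if (chr === quote) quote = null; // string value ends
		} else if (chr === '"' || chr === '\'') {
			quote = chr;                     // string value begins
		} else if (chr === '{') {
			depth++;
		} else if (chr === '}') {
			depth--;
			if (depth === 0) {
				rules.push(css.substring(start, c + 1).trim());
				start = c + 1;
			}
		}

	}

	return rules;

};
```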
- Rename `RULE.mjs` into `CONDITION.mjs`, as this is the file containing the logical-condition parser.
- Refactor the `CSS.parse()` method into a delegation pattern using this concept.
- Refactor `CSS.parse()` into an AST structure that can support logical and nested logical conditions.
- Refactor `CSS.parse()` to use a scope-based parser that also respects properties that could have e.g. `content: '}'` in their values.

The `stealth:media` page needs to be implemented.
The idea behind the separate media page is that it includes a `Media` card with full width and height, and it allows displaying `image`, `audio` and `video` files from a given specific URL. That means it must have support for the `stealth:media?url=` parameter, and reuse the `ENVIRONMENT.flags.url` object.
- Implement a `browser-widget-image` widget, which accepts a `{source}` parameter.
- Implement a `browser-widget-audio` widget, which accepts a `{source}` parameter.
- Implement a `browser-card-video` widget, which accepts a `{source}` parameter.
- If `type` is `image`, display an `<img>` element.
- If `type` is `audio`, display an `<audio>` element.
- If `type` is `video`, display a `<video>` element.

Currently, it is unclear whether it's possible to support the `srcset` attributes that are only available for `<picture>` elements, because it would be nice to have the same API for audio/video elements, too, in order to be able to download video streams based on the configured connection bandwidth in the Browser settings.
Simply using a `<source media="...">` attribute here won't do the trick, as Blink has removed support for it while Gecko kept it. [1]
[1] http://thenewcode.com/820/Make-HTML5-Video-Adaptive-With-Inline-Media-Queries
Some Browsers, namely Safari Mobile, have no real debugging capabilities in the sense that you cannot easily show the console drawer and its output. As remote debugging in general is a real pain, including the non-working-but-sometimes-working iOS Debug Proxy, the general idea is to offer a Developer Console in the Interface Settings.
It might make sense to backport the design and mechanics of the Polyfillr Console as it's easily portable into a Web Component.
The general issue is that the Base Library by concept cannot contain Web Components; therefore the Developer Console has to be integrated as something like a `<browser-console>` element into the Browser's Design.
Also, the console must be integrated independently of the Browser's initialization, and additionally must be available as a listener for `window.onerror`, which means that it has to be `dispatch()`-ed before anything else, for both the `main` window and the `<browser-webview>`'s `iframe`.
The `stealth:search` Page needs an Online and Offline search integration.
For now, the following search engines seem promising when it comes to their APIs that do not require tokens and/or user-specific authentication information in order to use them:
`wiby.me` can be integrated with a simple JSON request to `https://wiby.me/json/?q=key%20words&o=15`, whereas the first result page doesn't need an `o=...` parameter. The results are returned in batches of 15.
`searx.me` (and all instances) actually has a very nice API that's documented well [1] and also allows JSON as a response format via `https://searx.xyz/search?q=key%20words&format=json`. The results are returned in pages and the `pageno` parameter accepts `1` or higher numbers. But if no results are returned, the JSON is basically an empty array; there's seemingly no way to find out whether or not page 1 includes all found results.
The searx integration might need something like an `engines` list that is a comma-separated parameter in the request URL. The list is pretty huge, but is also documented [2].
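A small sketch of building those request URLs from the parameters described above; `buildSearchURL()` is a hypothetical helper, and the pagination details (wiby offsets of 15, searx `pageno`) are taken from this issue text:

```javascript
// Hypothetical search URL builder for the two engines discussed above.
const buildSearchURL = (engine, query, page) => {

	let encoded = encodeURIComponent(query);

	if (engine === 'wiby') {
		// the first result page needs no o=... offset; later pages are offsets of 15
		return 'https://wiby.me/json/?q=' + encoded + (page > 1 ? '&o=' + ((page - 1) * 15) : '');
	} else if (engine === 'searx') {
		return 'https://searx.xyz/search?q=' + encoded + '&format=json' + (page > 1 ? '&pageno=' + page : '');
	}

	return null;

};
```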
The Web Archive API is currently totally unclear, because there seems to be only outdated information about it. This probably needs some investigation into the source code that's being used on web.archive.org.
[1] Search API
[2] Search Engines
The CSS Optimizer needs to be integrated and offer a default set of blocked CSS properties, rulesets and conditions. What is to be blocked shall be a simple BLACKLIST-like functional format that should be a list of property names and their functions that should filter out the CSS per-rule(?) or per-element(?).
Currently it is unclear how to implement a per-element parser without a DOM model on the stealth side, which is therefore dependent on the implementation of the HTML Optimizer that should come first.
Application crashes on startup occasionally.
Expected behaviour: no crashes.
Actual behaviour: sometimes crashes.
Steps to reproduce:
- Clone the repo
- Follow the quickstart guide
ChromeDriver 83.0.4103.116 (8f0c18b4dca9b6699eb629be0f51810c24fb6428-refs/branch-heads/4103@{#716})
Window manager: wayland sway version 1.5
The Redirect Service data structure has been changed. The UI Card needs fixes to reflect that.
Now, a single redirect entry consists of `{domain, redirects: [{path, query, location}]}`, which means that the UI Card has to integrate a custom `value()` method that dynamically changes the `article` element's contents.
- `value(value)` method
- `value()` method
- On `save` action, show a footer that allows adding an entry to the `redirects[]` Array.

As of 0d4e391, most parts have been migrated from the former bash scripts to ESM modules.
However, these are still unported (and kind of undecided whether they make sense in their current form):
`bin/generate-profile.sh` requires `wget` to download the external adblock filter lists. Maybe it's therefore too complex to port the profile generation completely to node.js.
- Migrate the Browser process spawns in `browser/bin/browser.sh` to `browser/bin/browser.mjs`.
- Create `covert/bin/covert.mjs` and implement the build workflow as well as possible.
Some things currently have external dependencies, e.g. `gcc` and `make` are required in `covert/bin/covert.sh` as of now. These possibly cannot be replaced with other alternatives, though they would have to be documented (or printed to stdout?) in case users want to run the node modules with the specific flags for which they are necessary.
Settings page font (or maybe the whole page) is black. It would be nice to have some readable text.
Steps to reproduce:
- Clone master
- Follow the quickstart guide
- Open settings
ChromeDriver 83.0.4103.116 (8f0c18b4dca9b6699eb629be0f51810c24fb6428-refs/branch-heads/4103@{#716})