artifacts track, FSE'18: https://2018.fseconference.org/home
researchart / fse18
The artifact consists of a package related to MT-ABC, a Multi-Track Automata-Based Counter.
The artifact is interesting and relevant for this venue.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the source code.
The artifact package includes the scripts (along with the steps) to run a set of experiments and generate the results.
The artifact is well documented and consistent, and it includes evidence of validation.
I therefore believe that this artifact package can be useful for the research community working in this field. I suggest a Functional badge.
Artifacts Upload for EVENTNAME (Paper PAPERID) TRACKNAME
Dear AUTHORS,
Your artifact was positively evaluated by the artifact-evaluation committee (AEC)
and your publication will receive certain badges as a result of this process.
In addition, you have the option to publish your artifact (in a citable form),
and receive another badge "Artifact Available" for this.
Please upload your artifact metadata (and your artifact, if to be published with ACM)
via the following link:
SUBMISSIONLINKARTIFACT
The deadline for the upload (metadata and artifact) is ONEWEEKFROMNOW.
Publication of artifact via ACM DL:
By submitting your artifact to the ACM DL, you ensure
long-term availability, open access, and immutability for your artifact,
and receive a DOI (digital object identifier) for the artifact,
such that your artifact can be cited as a publication.
It is not necessary to transfer copyright for the artifact to ACM:
ACM only requires permission to distribute the artifact in the ACM DL,
which is assigned as part of the publishing-rights agreement for the paper.
Please make sure that you clicked that option in your publishing-rights agreement.
If you did not allow ACM to publish your material, please contact us
(and we will enable a new publishing-rights form).
Publication of artifact via external provider:
If you have published your artifact already via Zenodo or FigShare
and you do not want to publish your artifact in the ACM Digital Library,
you still need to complete the metadata fields for the record in the ACM DL.
Please also open the URL above and complete our artifact-submission form.
GitHub or similar repositories are not sufficient to receive an "Artifact Available" badge.
If you currently host your artifact on GitHub (or similar), you can simply
download the release zip archive from GitHub and upload this file to us (for the ACM DL).
(Note that the zip file must contain a license and readme file.)
Or, even simpler, you can publish GitHub releases via Zenodo:
https://guides.github.com/activities/citable-code/
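As a sanity check before uploading, a short script can confirm that the release archive actually contains the required license and readme files. This is only an illustrative sketch, not part of the official submission procedure; the helper function name and archive layout below are hypothetical.

```python
# Sketch: verify that a release zip contains a LICENSE and a README file,
# as required for the ACM DL upload. Uses only the standard library.
# The function name and the sample archive are illustrative assumptions.
import io
import zipfile


def has_required_files(zip_bytes: bytes) -> bool:
    """Return True if the archive contains both a LICENSE and a README file."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        # Compare only the base file names, ignoring directories and case.
        names = [n.rsplit("/", 1)[-1].lower() for n in zf.namelist()]
        has_license = any(n.startswith("license") for n in names)
        has_readme = any(n.startswith("readme") for n in names)
        return has_license and has_readme


# Build a minimal in-memory archive to demonstrate the check:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("artifact/LICENSE", "MIT")
    zf.writestr("artifact/README.md", "usage notes")
print(has_required_files(buf.getvalue()))  # → True
```

In practice one would run such a check on the zip downloaded from the GitHub release page before uploading it via the submission form.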
Best regards,
Your Proceedings Team
Conference Publishing Consulting
Timestamp: 6/20/2018 3:54:23
Name: Felix Pauck
Email: [email protected]
ID of accepted FSE paper: 76
Title of accepted FSE paper: Do Android Taint Analysis Tools Keep their Promises?
Authors: Felix Pauck, Eric Bodden, Heike Wehrheim
Artifact description (200 words or less): ReproDroid is a framework which can be used to create, refine, and execute reproducible benchmarks for Android app analysis tools. In the paper, the framework has been used to check whether certain promises of analysis tools hold. To this end, six different analysis tools (Amandroid, DIALDroid, DidFail, DroidSafe, FlowDroid, IccTA) were evaluated on three different benchmarks (DroidBench, ICCBench, DIALDroidBench), as well as on some additional benchmark cases contributed along with ReproDroid. All results are described in the paper. Furthermore, the result data is publicly available and ready for download: https://FoelliX.github.io/ReproDroid
Badge applied for: Reusable
Why the artifact deserves the badge (per the criteria at https://conf.researchr.org/track/fse-2018/fse-2018-Artifacts):
- ReproDroid simplifies the benchmarking process for Android app analysis tools, making it useful for studies in this field.
- The new benchmark cases that are part of ReproDroid extend the existing and often-used benchmark DroidBench.
- The tools shipped with ReproDroid can be used in different contexts, e.g. to benchmark, run, or combine Android app analysis tools.
Additionally:
- Our artifact is available on GitHub: https://FoelliX.github.io/ReproDroid
- Detailed tutorials on how to use the involved tools can be obtained online: https://github.com/FoelliX/BREW/wiki
- The artifact may be used on the basis of a compiled executable or be compiled from its source code.
- The involved AQL-System (https://FoelliX.github.io/AQL-System) can be reused in other projects as a library, e.g. via Maven.
For these reasons we apply for the Reusable badge.
Read the "Submission Procedure": Yes
URL of the zip file prepared as per that procedure: https://uni-paderborn.sciebo.de/s/Oyek7jUOkydefeB
This artifact consists of a package related to a clone detector, Oreo, designed to find code clones in the Twilight Zone.
The artifact is very interesting and relevant for this track.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the source code made available by the authors on GitHub.
The artifact package includes the input data needed for running the tool.
The artifact package also provides the material used for the tool evaluation, including files from the precision and recall experiments; this data was used in the associated research paper.
Overall, I find that this artifact package is well presented, making it easy for the research community to reuse it for replication purposes. It can therefore make a nice contribution to the FSE’18 Artifacts Track. Consequently, I assign it a Reusable badge.
This artifact consists of material related to the research paper on Adversarial Symbolic Execution for Detecting Concurrency-related Cache Timing Leaks.
The artifact is interesting and relevant for this venue.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the tarball file made available by the authors on Google Drive.
The artifact package includes the benchmarks used in the research paper as well as the scripts needed to run these benchmarks.
The authors have demonstrated different alternatives for executing the approach.
The artifact is well documented and structured. In addition, it includes evidence of validation.
I therefore recommend a Functional badge.
This artifact consists of a package related to a research work on practical Ajax race detection for JavaScript web applications.
The artifact is very interesting and relevant for this track.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes all the instructions related to the installation of AjaxRacer.
The artifact package includes the procedure to view the results. It also provides ways to run the corresponding experiments.
Additionally, the artifact package presents a set of end-to-end tests with all the steps and commands to run the tests.
Finally, the authors provided a script to debug AjaxRacer.
The artifact is well documented and structured. It can make a great contribution to the FSE’18 Artifacts Track. I believe it deserves a Reusable badge.
This is a place for the PC chairs to jot down notes on what works, and what does not work, for artifacts.
Low numbers. Why:
Poor conformance to the format we offered for artifact submission.
Very cool: interaction with artifact authors is much more complex than supported by, say, EasyChair; debates on what to download, and what to keep at a central site.