
fse18's People

Contributors

ai4se, umangm, timm, danielguovt, defreez-ucd, zhhailon, saini, user2589, silverbullettt, anon-5teymg7q, wangying8052

Stargazers

Chi Li

Watchers

Christoffer Quist Adamsen, James Cloos, Xujie Si, Masud Rahman, Olga Baysal, Daniel Lehmann, Sebastian Baltes, Jiayi Wei, Farima, FoelliX

fse18's Issues

Review by anon-rev-fse18

The artifact consists of a package related to A Multi-Track Automata Based Counter, namely MT-ABC.
The artifact is interesting and relevant for this venue.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the source code.
The artifact package includes the scripts (along with the steps) to run a set of experiments and generate the results.
The artifact is well documented and consistent, and it includes evidence of validation.
I therefore believe that this artifact package can be useful for the research community working in this field. I suggest a Functional badge.

get acceptance text from acm publishing

Artifacts Upload for EVENTNAME (Paper PAPERID) TRACKNAME

Dear AUTHORS,

Your artifact was positively evaluated by the artifact-evaluation committee (AEC)
and your publication will receive certain badges as a result of this process.

In addition, you have the option to publish your artifact (in a citable form),
and receive another badge "Artifact Available" for this.

Please upload your artifact metadata (and your artifact, if to be published with ACM)
via the following link:
SUBMISSIONLINKARTIFACT

The deadline for the upload (metadata and artifact) is ONEWEEKFROMNOW.

Publication of artifact via ACM DL:
By submitting your artifact to the ACM DL, you ensure
long-term availability, open access, and immutability for your artifact,
and receive a DOI (digital object identifier) for the artifact,
such that your artifact can be cited as a publication.

It is not necessary to transfer copyright for the artifact to ACM:
ACM only requires permission to distribute the artifact in the ACM DL,
which is assigned as part of the publishing-rights agreement for the paper.
Please make sure that you clicked that option in your publishing-rights agreement.
If you did not allow ACM to publish your material, please contact us
(and we will enable a new publishing-rights form).

Publication of artifact via external provider:
If you have already published your artifact via Zenodo or FigShare
and you do not want to publish your artifact in the ACM Digital Library,
you still need to complete the metadata fields for the record in the ACM DL.
Please also open the URL above and complete our artifact-submission form.

GitHub or similar repositories are not sufficient to receive an "Artifact Available" badge.
If you currently host your artifact on GitHub (or similar), you can simply
download the release zip archive from GitHub and upload that file to us (for the ACM DL).
(Note that the zip file must contain a license and readme file.)
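
As a quick sanity check before uploading, a short script along the following lines can confirm that the archive actually contains a readme and a license. This is only a minimal sketch, not part of any official submission tooling; the file-name patterns are a guess at common conventions.

```python
import sys
import zipfile

def check_artifact_zip(zip_path):
    """Report whether a release zip contains a README and a LICENSE file."""
    with zipfile.ZipFile(zip_path) as zf:
        # Compare only the base file names, case-insensitively.
        names = [entry.split("/")[-1].lower() for entry in zf.namelist()]
    has_readme = any(name.startswith("readme") for name in names)
    has_license = any(name.startswith(("license", "licence")) for name in names)
    return has_readme, has_license

if __name__ == "__main__":
    # Usage (illustrative): python check_zip.py artifact.zip
    has_readme, has_license = check_artifact_zip(sys.argv[1])
    print(f"README present:  {has_readme}")
    print(f"LICENSE present: {has_license}")
```

Run it with the path to your release zip as its only argument.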

Or, even simpler, you can publish GitHub releases via Zenodo:
https://guides.github.com/activities/citable-code/
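
If you take the Zenodo route, the record's metadata can, to our understanding, be pre-filled from a `.zenodo.json` file in the repository root before you tag the release. The sketch below is purely illustrative: every value is a placeholder to be replaced with your paper's details, and the field names should be double-checked against Zenodo's current GitHub-integration documentation.

```json
{
  "title": "Artifact for: <your paper title>",
  "description": "Replication package accompanying our FSE 2018 paper.",
  "upload_type": "software",
  "license": "Apache-2.0",
  "creators": [
    { "name": "Lastname, Firstname", "affiliation": "Example University" }
  ],
  "keywords": ["FSE 2018", "artifact", "replication"]
}
```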

Best regards,

Your Proceedings Team
Conference Publishing Consulting

i've changed the form structure (added github id). is felix's entry still good?

Timestamp: 6/20/2018 3:54:23
Your first and last names: Felix Pauck
Your email address: [email protected]
ID of your accepted FSE paper: 76
Title of your accepted FSE paper: Do Android Taint Analysis Tools Keep their Promises?
List of authors of your accepted FSE paper: Felix Pauck, Eric Bodden, Heike Wehrheim
Describe your artifact(s) (200 words or less): ReproDroid is a framework which can be used to create, refine, and execute reproducible benchmarks for Android app analysis tools. In the paper, the framework has been used to check whether certain promises made for analysis tools hold. To this end, six different analysis tools (Amandroid, DIALDroid, DidFail, DroidSafe, FlowDroid, IccTA) were evaluated on three different benchmarks (DroidBench, ICCBench, DIALDroidBench) as well as on some additional benchmark cases contributed along with ReproDroid. All results are described in the paper. Furthermore, the result data is publicly available and ready for download: https://FoelliX.github.io/ReproDroid
What level of badge are you applying for? Reusable
Using the criteria defined in the "Call for Papers" section of https://conf.researchr.org/track/fse-2018/fse-2018-Artifacts, describe why you think your artifact deserves to be badged "functional", "reusable", "available", "replicated", or "reproduced":
- ReproDroid simplifies the benchmarking process for Android app analysis tools. Thereby, it is useful for studies in this field.
- The new benchmark cases that are part of ReproDroid extend the existing and often-used benchmark DroidBench.
- The tools shipped with ReproDroid can be used in different contexts, e.g. to benchmark, run, or combine Android app analysis tools.
Additionally:
- Our artifact is available on GitHub: https://FoelliX.github.io/ReproDroid
- Detailed tutorials on how to use the involved tools can be obtained online: https://github.com/FoelliX/BREW/wiki
- The artifact may be used on the basis of a compiled executable or be compiled from its source code.
- The involved AQL-System (https://FoelliX.github.io/AQL-System) can be reused in other projects as a library, e.g. via Maven.
For these reasons we apply for the Reusable badge.
Have you read the "Submission Procedure" at https://conf.researchr.org/track/fse-2018/fse-2018-Artifacts? Yes
URL of the zip file prepared as per that procedure: https://uni-paderborn.sciebo.de/s/Oyek7jUOkydefeB
Your github id: (blank)

Review by anon-rev-fse18

This artifact consists of a package related to a clone detector, Oreo, designed to find code clones in the Twilight zone.
The artifact is very interesting and relevant for this track.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the source code, made available by the authors on GitHub.
The artifact package includes the input data needed for running the tool.
The artifact package also provides the material used for the tool evaluation, including the files from the precision and recall experiments. The data provided were used in the associated research paper.
Overall, I find this artifact package well presented, making it easy for the research community to reuse it for replication purposes. It can therefore make a nice contribution to the FSE’18 Artifacts Track. Consequently, I assign it a Reusable badge.

Review by anon-rev-fse18

This artifact consists of material related to the research paper on Adversarial Symbolic Execution for Detecting Concurrency-related Cache Timing Leaks.
The artifact is interesting and relevant for this venue.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes the tarball made available by the authors on Google Drive.
The artifact package includes the benchmarks used in the research paper as well as the scripts needed to run these benchmarks.
The authors show different alternatives for executing the approach.
The artifact is well documented and structured. In addition, it involves evidence of validation.
I recommend a Functional badge.

Review by anon-rev-fse18

This artifact consists of a package related to research work on practical AJAX race detection for JavaScript web applications.
The artifact is very interesting and relevant for this track.
The artifact adheres to the requirements of the FSE’18 Artifacts Track.
The artifact package includes all the instructions related to the installation of AjaxRacer.
The artifact package includes the procedure to view the results. It also provides ways to run the corresponding experiments.
Additionally, the artifact package presents a set of end-to-end tests with all the steps and commands to run the tests.
Finally, the authors provided a script to debug AjaxRacer.
The artifact is well documented and structured. It can make a great contribution to the FSE’18 Artifacts Track. I believe it deserves a Reusable badge.

need to write a report for the conference

This is a place for the PC chairs to dribble out notes on what works, and does not work, for artifacts.

Random notes at submission time

Low numbers. Why:

  • Low community interest in artifacts?
  • Hard to distinguish from the tool track?

Poor conformance to the format we offered for artifact submission.

  • Was the format wrong? In terms of light to heavyweight, it was on the lightweight side.
  • Is the community still divided on the "right format"? The folks with a VM just ignored our notes and offered a VM. After a 3-minute download and a 4-minute "import to VirtualBox" session, we had their code working perfectly on a reviewer machine. Should we just demand VMs all the time?

Working in GitHub

Very cool. Interaction with artifact authors is much more complex than what is supported by, say, EasyChair: debates on what to download, and what to keep at the central site.

Writing artifacts

  • Keep it short: 11GB (unzipped)? Wild. Maybe artifacts should be "teasers": things we can download quickly, peek at, and taste before swallowing.
    • Note, some of the bigger downloads from Google Drive just timed out without downloading. The internet still needs debugging.
  • There is an art to authoring "cliff notes" (i.e. a short intro). We need some supporting text in the repo, e.g. https://github.com/researchart/fse18/blob/master/submissions/pattern-fuzzing/README.md. Note that these are the notes that a web surfer would use to decide whether they want to use the artifact or not.
