
nlp_paper_summaries's People

Contributors

amitness, dlozeve, e-tornike, omarsar, ricsinaruto, viktor2k, vrdn-23


nlp_paper_summaries's Issues

Expanding to other domains

@omarsar Thanks for starting this initiative.

I was wondering if we could expand this from nlp_paper_summaries to paper_summaries and group papers by domain: "NLP", "Computer Vision", etc. There is no existing platform that curates such paper summaries in one place, so paper_summaries could fill that gap. Just a thought.
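Purely as an illustration, grouping by domain could mean one folder per domain; the names below are hypothetical, not an agreed layout:

    paper_summaries/
      nlp/               (current contents of nlp_paper_summaries)
      computer_vision/
      speech/
      README.md          (top-level index across domains)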

Critique-of-Taylor-s-Law-for-Human-Linguistic-Sequences [To Publish]

Got permission from @jadevaibhav to publish the following article on dair.ai's main website.

Article:
https://github.com/jadevaibhav/Critique-of-Taylor-s-Law-for-Human-Linguistic-Sequences/blob/master/README.md

Tasks:

  • Migrate the article to our main website repo: https://github.com/dair-ai/dair-ai.github.io/tree/master/_posts
  • Proofread, review, and make any necessary changes
  • Add author's information
  • Publish on the main website (dair.ai)
  • Add the entry to the NLP Paper Summaries repo (let's figure out under which folder we can add it)
  • Share on social media and our Slack group

Author's information:

jadevaibhav:
  name: Vaibhav Jade
  github: jadevaibhav

Add your name to the contributors list

Please say hi and add your name below if you wish to contribute to this project. Make sure to link your GitHub account so that I can add you.

adding oral presentations?

Someone on Twitter suggested adding oral presentations of these papers where available. Your thoughts?

CONTRIBUTING.md

Currently, we use several different conventions for naming hyperlinks:

  • Title Summary Paper Source TL;DR
  • Title Summary 1 Source TL;DR
  • Title Medium Source TL;DR
  • Title Firstname Lastname Source TL;DR
  • Title Summary 1, Summary 2 Source TL;DR
  • Title Medium, Summary Source TL;DR

The question is, which of these would make the most sense for the repository at the moment?
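For concreteness, a table row under the first variant could look like the following; the column layout and links are placeholders, not actual repo entries:

    | Title | Summary | Paper Source | TL;DR |
    | ----- | ------- | ------------ | ----- |
    | Paper title | [Summary](summary-link) | [arXiv](paper-link) | One-line takeaway |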

looking for maintainers

This is going to require a huge effort to maintain due to the nature of the project. I have already received very positive feedback on this idea, and it would be nice to get volunteers to help maintain it. If you are interested, please email me at [email protected] or DM me on Twitter.

Summarize paper: Language Models are Few-Shot Learners

Language Models are Few-Shot Learners

https://arxiv.org/abs/2005.14165

Recent work has demonstrated substantial gains on many NLP tasks and benchmarks by pre-training on a large corpus of text followed by fine-tuning on a specific task. While typically task-agnostic in architecture, this method still requires task-specific fine-tuning datasets of thousands or tens of thousands of examples. By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do. Here we show that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even reaching competitiveness with prior state-of-the-art fine-tuning approaches. Specifically, we train GPT-3, an autoregressive language model with 175 billion parameters, 10x more than any previous non-sparse language model, and test its performance in the few-shot setting. For all tasks, GPT-3 is applied without any gradient updates or fine-tuning, with tasks and few-shot demonstrations specified purely via text interaction with the model. GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic. At the same time, we also identify some datasets where GPT-3's few-shot learning still struggles, as well as some datasets where GPT-3 faces methodological issues related to training on large web corpora. Finally, we find that GPT-3 can generate samples of news articles which human evaluators have difficulty distinguishing from articles written by humans. We discuss broader societal impacts of this finding and of GPT-3 in general.
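As a minimal sketch of the few-shot setup described above, where the task and demonstrations are specified purely as text and no gradient updates occur, assembling a prompt for the word-unscrambling task might look like this. The examples and the completion call are illustrative, not taken from the paper:

    # Build a few-shot prompt: the task description and demonstrations are
    # ordinary text, and the model is asked to complete the final line.
    demonstrations = [
        ("gaot", "goat"),    # illustrative unscrambling examples
        ("sakne", "snake"),
    ]
    query = "cheees"

    prompt = "Unscramble the word:\n"
    for scrambled, unscrambled in demonstrations:
        prompt += f"{scrambled} => {unscrambled}\n"
    prompt += f"{query} =>"

    print(prompt)
    # The prompt is then sent to the model as plain input; a call such as
    # model.generate(prompt) stands in here for any completion API.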

paper summary recipes

Hi all,

I have an idea to use the paper summaries here to help students. I notice that a lot of courses, both online and at universities, recommend papers to students, but this can be intimidating and even discouraging for some. What if we created "recipes" for students: a recommended journey through paper summaries to read before jumping into the corresponding papers themselves? Paper summaries are more approachable and friendly, and can guide students better before they dive into paper reading. This could be a nice addition as we keep expanding the list. Thoughts?
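Purely as a sketch, a recipe could be a short ordered reading list; the topic and entries below are hypothetical placeholders:

    Recipe: Transformers from scratch
      1. Read the summary of "Attention Is All You Need"
      2. Read the summary of "BERT"
      3. Then read the original papers in the same order.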
