While gradient boosting methods have been in vogue for a while, a newer generation of boosting algorithms can help you build better models. I recently gave a talk on two of them: XGBoost and LightGBM.
The presentation also shows why tuning can help you get better models, including a step-by-step example of how tuning helped me rank in the top 7% of the Titanic Kaggle competition. I will post the code soon (once I have added comments).
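To illustrate what that tuning process looks like, here is a minimal sketch of a cross-validated grid search over the knobs that usually matter most for boosted trees (number of trees, learning rate, tree depth). It uses scikit-learn's `GradientBoostingClassifier` on synthetic data purely as a stand-in, since XGBoost and LightGBM expose the same estimator API; the dataset, grid values, and split are illustrative assumptions, not my actual Titanic setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Toy binary-classification data standing in for the Titanic features
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The hyperparameters that typically move the needle for boosted trees
param_grid = {
    "n_estimators": [100, 300],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}

# 3-fold cross-validated search over the grid
search = GridSearchCV(
    GradientBoostingClassifier(random_state=42),
    param_grid,
    cv=3,
    scoring="accuracy",
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", round(search.best_estimator_.score(X_test, y_test), 3))
```

With XGBoost or LightGBM you would swap in `XGBClassifier` or `LGBMClassifier` and extend the grid with their extra regularization parameters; the search loop itself stays the same.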
Please feel free to reach out to me with any questions or feedback.