Home_Sales_Analysis

I used SparkSQL to determine key metrics about home sales data. I then used Spark to create temporary views, partition the data, cache and uncache a temporary table, and verify that the table had been uncached.
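A minimal sketch of the setup, assuming the data is read from a CSV file (the file path and DataFrame name below are placeholders, not confirmed by the repository):

```python
from pyspark.sql import SparkSession

# Start a Spark session for the analysis
spark = SparkSession.builder.appName("HomeSalesAnalysis").getOrCreate()

# Read the home sales data (placeholder path) and infer the schema
home_sales_df = spark.read.csv("home_sales_revised.csv", header=True, inferSchema=True)

# Register a temporary view so the data can be queried with SparkSQL
home_sales_df.createOrReplaceTempView("home_sales")
```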

  • I answered the following questions using SparkSQL (a sketch of the first query follows this list):

    1. What is the average price of a four-bedroom house sold in each year? Round the answer to two decimal places.
    2. What is the average price of a home, for each year it was built, that has three bedrooms and three bathrooms? Round the answer to two decimal places.
    3. What is the average price of a home, for each year it was built, that has three bedrooms, three bathrooms, two floors, and is greater than or equal to 2,000 square feet? Round the answer to two decimal places.
    4. What is the average price of a home per "view" rating, for view ratings with an average home price greater than or equal to $350,000? Determine the runtime for this query and round the answer to two decimal places.
  • I cached my temporary table home_sales.

  • I checked whether my temporary table was cached.
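A sketch of the caching step and the cache check, using standard Spark SQL and the catalog API:

```python
# Cache the home_sales temporary table in memory
spark.sql("CACHE TABLE home_sales")

# Confirm the table is cached (prints True once caching has taken effect)
print(spark.catalog.isCached("home_sales"))
```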

  • Using the uncached data, I ran the query that returns the view ratings with an average price greater than or equal to $350,000.

  • Using the cached data, I ran the same query, determined its runtime, and compared it to the uncached runtime (see the timing sketch below).
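A sketch of the view-rating query and the timing pattern used for the uncached/cached comparison; the view and price column names are assumptions:

```python
import time

# Average price per "view" rating, keeping only ratings whose
# average price is at least $350,000
view_query = """
    SELECT view,
           ROUND(AVG(price), 2) AS avg_price
    FROM home_sales
    GROUP BY view
    HAVING AVG(price) >= 350000
    ORDER BY view DESC
"""

# Run the query and report the wall-clock runtime
# (run once before caching and once after, then compare the two times)
start_time = time.time()
spark.sql(view_query).show()
print(f"--- {time.time() - start_time} seconds ---")
```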

  • I partitioned the formatted parquet home sales data by the "date_built" field.
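A sketch of the partitioned parquet write; the output path is a placeholder:

```python
# Write the data as parquet, partitioned by the year the home was built
home_sales_df.write.partitionBy("date_built").mode("overwrite").parquet("home_sales_partitioned")
```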

  • I created a temporary table for the parquet data.

  • I ran the same query against the parquet temporary table, determined its runtime, and compared it to the uncached runtime (sketched below).
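A sketch of reading the partitioned parquet data back, registering a temporary view, and timing the same query; the path and view name are placeholders:

```python
# Read the partitioned parquet data and register a temporary view
parquet_df = spark.read.parquet("home_sales_partitioned")
parquet_df.createOrReplaceTempView("home_sales_parquet")

# Re-run the view-rating query against the parquet view and time it
start_time = time.time()
spark.sql("""
    SELECT view,
           ROUND(AVG(price), 2) AS avg_price
    FROM home_sales_parquet
    GROUP BY view
    HAVING AVG(price) >= 350000
    ORDER BY view DESC
""").show()
print(f"--- {time.time() - start_time} seconds ---")
```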

  • I uncached the home_sales temporary table.

  • I verified with PySpark that the home_sales temporary table was no longer cached.
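A sketch of the uncache step and the verification:

```python
# Remove the home_sales temporary table from the cache
spark.sql("UNCACHE TABLE home_sales")

# Verify the table is no longer cached (prints False)
print(spark.catalog.isCached("home_sales"))
```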

Runtime

  1. Uncached runtime: 1.368062973022461 seconds
  2. Cached runtime: 0.4902327060699463 seconds
  3. Runtime with the partitioned parquet DataFrame: 0.9032614231109619 seconds

Conclusion

  • Based on the runtimes above, running the query against the cached table was the fastest option, followed by the partitioned parquet data; the uncached query was the slowest.
