Using SparkSQL, I determined key metrics about home sales data. I then used Spark to create temporary views, partition the data, cache and uncache a temporary table, and verify that the table was uncached.
python
time
sql
big-data
spark
jupyter-notebook
colab-notebook
machine-learning
findspark
sparkfiles
Updated Aug 28, 2023 - Jupyter Notebook