Scientific Methodology and Performance Evaluation for Computer Scientists

Reporting errors: Although I do my best, there may still be typos and broken links. This is GitHub, so please report anything you find so that I can improve it for others. :)

This website gathers the series of lectures on applied performance evaluation that I was invited to give on various occasions. I gave this course with Jean-Marc Vincent for several years (roughly from 2011 to 2015) in the second year of the Master of Science in Informatics at Grenoble, and more recently at the Federal University of Rio Grande do Sul, at ENS Lyon, and at ENS Rennes. Most sessions I gave are listed here.

This website also gathers a list of keynotes I gave on reproducible research. The most recent ones are:

Table of Contents

Course Objective and Organization

The aim of this course is to provide the fundamental basis of a sound scientific methodology for the performance evaluation of computer systems. The lectures emphasize the methodological aspects of measurement and the statistics needed to analyze computer systems. I first make the audience aware of the reproducibility issue for experiments and analyses, in particular in computer science. Then I present tools that help address the analysis problem and that can also prove useful for managing the experimental process through notebooks. The audience is given the basis of probability and statistics required to develop sound experiment designs. Unlike some other lectures, my goal is not to provide analysis recipes that people can readily apply, but to make people really understand a few simple tools so that they can dig deeper later on.

Originally, the course was organized in five very dense three-hour lectures:

  1. Reproducible research. A video of a similar presentation (actually a mixture of lectures 1 and 2) is available on canal-u (Part 1 and Part 2) and graal (Part 1 and Part 2).
  2. Data visualization/presentation.
  3. Introduction to probabilities/statistics.
  4. Linear regression.
  5. Design of Experiments.

I am progressively slicing it up into smaller chunks as follows:

  1. Epistemology
  2. Reproducible research
  3. Literate programming
  4. R crash course
  5. Descriptive statistics of univariate data
  6. Data presentation (checklist for good graphics: tableau, guidelines)
  7. Correlation and causation
  8. Introduction to probabilities/statistics (proof of the Central Limit Theorem, a few slides on the $\chi^2$ test)
  9. Linear regression
  10. Design of Experiments

All the examples given in this series of lectures use the R language, and the source is provided so that people can reuse them. The slides are composed with org-mode, beamer, and verbments.

As an exercise, I often ask the audience to work in small groups and to provide me with a clean and reproducible analysis of a simple scientific question of their choice. Alternatively, they can fork this small project, which is a toy experiment about measuring the performance of a simple parallel quicksort implementation:

One of your colleagues just implemented a multi-threaded version of the quicksort algorithm for multi-core machines. He is convinced his code can provide significant time savings but, unfortunately, he did not follow the performance evaluation lecture and would like your help to promote his code.

Students should fork it, play with it, possibly improve it (this is not hard, as the stub is purposely basic), and then send me their git URL so that I can comment on it.
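To give an idea of the expected starting point, a reproducible analysis of such an experiment could begin with a minimal R sketch along the following lines (the file name and column names are hypothetical and depend on how the runs are instrumented):

library(ggplot2)

# Hypothetical output of the quicksort runs: one line per run with the
# array size, the implementation used, and the measured time in seconds.
df <- read.csv("measurements.csv")   # e.g. columns: size, version, time

# Raw measurements plus the mean time per size and version
ggplot(df, aes(x = size, y = time, color = version)) +
  geom_point(alpha = 0.5) +
  stat_summary(fun = mean, geom = "line")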

In practice, I introduce the audience to the following tools:

  • R and ggplot2, which provide a standard, efficient, and flexible mechanism for data management and graph generation. Although R is quite cumbersome at first for computer scientists, it quickly proves to be an incredible asset compared to spreadsheets, gnuplot, or graphical libraries like matplotlib or TikZ.
  • knitr is a tool that lets you integrate R commands within a LaTeX or Markdown document. It allows you to fully automate data post-processing/analysis and figure generation, down to their integration into a report. Beyond the gains in ease of generation, page layout, and uniformity, such integration allows anyone to easily check what has been done during the analysis and possibly to improve the graphs or the analysis.
  • I explain how to use these tools with Rstudio, a multi-platform and easy-to-use IDE for R. For example, using R+Markdown (Rmd files) in Rstudio, it is extremely easy to export the output to Rpubs and hence make the results of your research available to others in no more than two clicks.
  • I also mention other alternatives such as org-mode with babel or the IPython notebook, which allow a day-to-day practice of reproducible research in a somewhat more fluent way than knitr, but I am probably not fully objective here. :)
  • I present the basis of probability and statistics and explain how to compute confidence intervals and how to perform linear regressions and analyses of variance with R (a minimal sketch follows this list).
  • I give an overview of the main classes of experiment designs and explain how to easily generate and analyze them using RcmdrPlugin.DoE from the Rcommander GUI.
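For instance, here is a minimal sketch of how a confidence interval and a simple linear regression can be computed in base R; the data are synthetic and made up purely for illustration:

set.seed(42)
df <- data.frame(size = rep(c(1e5, 2e5, 4e5, 8e5), each = 10))
df$time <- 1.2e-6 * df$size + rnorm(nrow(df), sd = 0.05)   # synthetic running times

# 95% confidence interval for the mean running time at one input size
t.test(df$time[df$size == 4e5])$conf.int

# Simple linear regression of running time against input size
reg <- lm(time ~ size, data = df)
summary(reg)    # coefficients, standard errors, R^2, ...
confint(reg)    # confidence intervals for the regression coefficients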

Using R

Installing R and Rstudio

Here is how to proceed on Debian-based distributions:

sudo apt-get install r-base r-cran-ggplot2 r-cran-reshape 

Make sure you have a recent (>= 3.2.0) version of R. For example, here is what I have on my machine:

R --version
R version 3.2.0 (2015-04-16) -- "Full of Ingredients"
Copyright (C) 2015 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under the terms of the
GNU General Public License versions 2 or 3.
For more information about these matters see
http://www.gnu.org/licenses/.

Rstudio and knitr are unfortunately not packaged in Debian, so the easiest way is to download the corresponding Debian package from the Rstudio webpage and then to install it manually (depending on when you do this, you will obviously want to adjust the version number).

wget http://download1.rstudio.org/rstudio-0.99.484-amd64.deb
sudo dpkg -i rstudio-0.99.484-amd64.deb
sudo apt-get -f install # to fix possibly missing dependencies

You will also need to install knitr. To this end, simply run R (or Rstudio) and use the following command:

install.packages("knitr")

If r-cran-ggplot2 or r-cran-reshape cannot be installed for some reason, you can also install them through R:

install.packages("ggplot2")
install.packages("reshape")

Producing documents

The easiest way to go is probably to use R+Markdown (Rmd files) in Rstudio and to export them via Rpubs to make whatever you want publicly available.
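For illustration, here is a minimal sketch of what such an Rmd file can look like (the title and chunk content are just placeholders):

---
title: "Toy experiment report"
output: html_document
---

Some explanatory text written in Markdown, followed by an R chunk whose
code, output, and figures end up in the rendered document.

```{r}
summary(cars)   # 'cars' is a small dataset shipped with R
plot(cars)
```

Knitting such a file in Rstudio produces the report, which can then be published to Rpubs as mentioned above.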

We can roughly distinguish between three kinds of documents:

  1. Lab notebook (with everything you try and that is meant mainly for yourself)
  2. Experimental report (selected results and explanations with enough details to discuss with your advisor)
  3. Result description (rather short, with only the main points, which could be embedded in an article)

We expect you to provide us with the last two and to make them publicly available so that others can comment on them.

Learning R

For a quick start, you may want to look at R for Beginners. A probably more entertaining way is to follow a good online lecture that introduces R and data analysis, such as this one: https://www.coursera.org/course/compdata.

A quite effective way is to use SWIRL, an interactive learning environment that will guide you through self-paced lessons.

install.packages("swirl")
library(swirl)
install_from_swirl("R Programming")
swirl()

In particular, I suggest following these lessons from R Programming (at most 10 minutes each):

1: Basic Building Blocks      2: Workspace and Files     
3: Sequences of Numbers       4: Vectors                 
5: Missing Values             6: Subsetting Vectors      
7: Matrices and Data Frames   8: Logic                   
9: Functions                 12: Looking at Data         

Finally, you may want to read this excellent tutorial on data frames (attach, with, rownames, dimnames, notions of scope…).
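As a tiny illustration of some of these notions (the data frame below is made up for the example):

df <- data.frame(algo = c("seq", "par2", "par4"), time = c(12.3, 7.1, 4.2))
rownames(df) <- df$algo       # name the rows after the algorithm
df["par4", "time"]            # index by row and column names
with(df, time / min(time))    # evaluate an expression in the data frame's scope
subset(df, time < 10)         # keep only the fast runs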

Learning ggplot2, plyr/dplyr, reshape/tidyr

All these packages have been developed by Hadley Wickham.

  • Although the package is called ggplot2, it provides the ggplot command. This package allows you to produce nice-looking and highly configurable graphics.
  • Old generation: plyr allows you to expressively compute aggregate statistics on your data frames, and reshape allows you to reshape your data frames if they are not in the right shape for ggplot2 or plyr.
  • New generation: dplyr is the new generation of plyr and allows you to expressively compute aggregate statistics on your data frames. tidyr is the new generation of reshape and allows you to reshape your data frames if they are not in the right shape for ggplot2 or dplyr. If you have a recent R installation, go for these new packages: their syntax is better and their implementation is much faster (see the short sketch after this list).
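Here is a minimal sketch of this typical dplyr + ggplot2 workflow, on synthetic data with made-up column names:

library(dplyr)
library(ggplot2)

# Synthetic measurements: one row per run
df <- data.frame(size    = rep(c(1e5, 2e5, 4e5), each = 10),
                 threads = rep(c(1, 2), times = 15),
                 time    = runif(30, 1, 10))

# Aggregate statistics per configuration with dplyr
stats <- df %>%
  group_by(size, threads) %>%
  summarise(mean_time = mean(time), sd_time = sd(time))

# Mean running time as a function of size, one line per number of threads
ggplot(stats, aes(x = size, y = mean_time, color = factor(threads))) +
  geom_point() + geom_line()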

I recently stumbled on this nice ggplot2 tutorial.

Hadley Wickham provides a nice tour of dplyr and a gentle introduction to tidyr. Here is a nice link on merging data frames.

The Rstudio team has designed a nice series of cheatsheets on R, in particular one on ggplot2 and one on R/Markdown/knitr.

References

  • R. Jain, The Art of Computer Systems Performance Analysis: Techniques for Experimental Design, Measurement, Simulation, and Modeling, Wiley-Interscience, New York, NY, April 1991. A new edition will be available in September 2015.

    This is an easy-to-read, self-contained book for practical performance evaluation. The numerous checklists make it a great book for engineers, and every experimental computer scientist should read it.

  • David J. Lilja, Measuring Computer Performance: A Practitioner’s Guide, Cambridge University Press 2005

    A short book suited for brief presentations. I follow a similar organization, but I really do not like the content of this book: I feel it provides very little insight into why the theory applies or not, and I also think it is too general and lacks practical examples. It may still be interesting for those who want a quick and broad presentation of the main concepts and “recipes” to apply.

  • Jean-Yves Le Boudec. Methods, practice and theory for the performance evaluation of computer and communication systems, 2006. EPFL electronic book.

    A very good book, with a much more theoretical treatment than the Jain. It goes much further on many aspects and I can only recommend it.

  • Douglas C. Montgomery, Design and Analysis of Experiments, 8th Edition. Wiley 2013.

    This is a good and thorough textbook on design of experiments. It is unfortunate that it relies on “exotic” software like JMP and Minitab instead of R…

  • Julian J. Faraway, Practical Regression and Anova using R, University of Bath, 2002.

    This book is derived from material that Prof. Faraway used in a Master-level class on statistics at the University of Michigan. It is mathematically involved but presents in detail how linear regression and ANOVA work and how they can be done with R. It works out many examples in detail and is very pleasant to read. A must-read if you want to understand this topic more thoroughly.

  • Peter Kosso, A Summary of Scientific Method, Springer, 2011. [hidden PDF that Google found on the webpage of a university in Macedonia]

    A nice short book summarizing the main steps of the scientific method and explaining why having a clear definition of it is not that simple. It illustrates these points with several nice historical examples that allow the reader to take some perspective on this epistemological question.

  • R. Nelson, Probability, Stochastic Processes and Queueing Theory: The Mathematics of Computer Performance Modeling, Springer-Verlag, 1995.

    For those who want to know more about queueing theory.
