Network Performance Framework: easy-to-use experiment manager with automated testing, result collection, and graphing
Updated Nov 15, 2024 · Python
Empirical Software Engineering journal (EMSE) open science and reproducible research initiative
⚗️ Aeromancy: A framework for performing reproducible AI and ML
awesome open list of pointers about open science for software and computational science
A simple Lucene framework to get started with Information Retrieval experiments on TREC documents
📊 Reproducible Benchmark for Everyone
Tool demonstrating how to build credit risk models
Print session information
Extra resources in the Collective Knowledge format for ARM's Workload Automation framework
HESML Java software library of ontology-based semantic similarity measures and information content models
Open solution to the Google AI Object Detection Challenge 🍁
Open solution to the Santander Value Prediction Challenge 🐠
Open solution to the Home Credit Default Risk challenge 🏡
Framework for Reproducible ExperimenTs
Architecture and overarching documentation for o2r microservices
Details for reproducing the experiments in our d-blink paper
Examples for Executable Research Compendia and compatible workspaces
Source code and data for the paper "Data Assimilation in Large Prandtl Rayleigh-Bénard Convection from Thermal Measurements" by A. Farhat, N. E. Glatt-Holtz, V. R. Martinez, S. A. McQuarrie, and J. P. Whitehead.
Paper list and implementations (code and results) for CNN-based single-image super-resolution.
Experimental artefacts to reproduce our results published at SoCC 2017