Replies: 8 comments 2 replies
-
I think that we should start with a simple refactor of
We can explore how to improve upon this once we've done some experimentation.
-
Here is a link to the fork containing the new NumPy Bayes opt chassis feature: optimization-chassis. I created a test based on the univariate regression tutorial, here: MuyGPyS/optimize/experiment/univariate_regression_with_opt_loop.ipynb. The test uses the new chassis, runs the Bayes optimizer once, and successfully reproduces the outcomes of the original tutorial. Next is to add support for multiple epochs/batches by probing all previous points. Can we brainstorm at our next meeting about how best to test the outcomes of multiple iterations and how to compute the optimal sigma squared?
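The planned multi-epoch behavior, replaying every previously evaluated point into each freshly built optimizer before taking a new step, can be sketched with a toy stand-in. The class below is illustrative only, not the MuyGPyS chassis or the bayes_opt API:

```python
import random


class ToyOptimizer:
    """Toy stand-in for a Bayesian optimizer; illustrative only."""

    def __init__(self, objective, bounds, seed=0):
        self.objective = objective
        self.bounds = bounds
        self.rng = random.Random(seed)
        self.history = []  # (params, score) pairs evaluated so far

    def probe(self, params):
        # evaluate a known point and record the result
        self.history.append((params, self.objective(**params)))

    def step(self):
        # toy stand-in for maximizing an acquisition function:
        # draw one random candidate inside the bounds and evaluate it
        params = {
            k: self.rng.uniform(lo, hi) for k, (lo, hi) in self.bounds.items()
        }
        self.probe(params)

    @property
    def best(self):
        return max(self.history, key=lambda pair: pair[1])


def objective(length_scale):
    # made-up smooth loss surrogate, peaked at length_scale = 1.0
    return -((length_scale - 1.0) ** 2)


history = []
for epoch in range(5):
    # fresh optimizer (and objective) each epoch, seeded with every
    # previously evaluated point before taking a new optimization step
    opt = ToyOptimizer(objective, {"length_scale": (0.1, 3.0)}, seed=epoch)
    for params, _ in history:
        opt.probe(params)
    opt.step()
    history = opt.history

best_params, best_score = opt.best
```

Each epoch the new instance inherits the full evaluation history, which is the behavior the multi-epoch loop needs.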
-
Ben, I have committed the baseline implementation of the optimization-chassis feature, and once I complete the unit test the fork should be ready for a pull request. Running the
-
@bwpriest I added the flake8 linter to my VS Code setup. I ignored E501 (line length) and E203 (whitespace before colon); the latter conflicts with the black formatter. The final set of errors appears to be associated with my changes to MuyGPyS/_src/optimize/chassis/__init__.py. I removed the violating imports that caused failures in the mpi, torch, and docs tests and added them directly to MuyGPyS/optimize/chassis.py. I reran my univariate notebook experiment successfully, and the Abseil unit test optimize.py reports OK. Can we run another develop test on the pull request to check for remaining errors? Thanks!
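A hypothetical flake8 configuration matching that setup might look like the fragment below; the repo's actual config file and settings may differ:

```ini
[flake8]
# E501 (line too long) is handled by the black formatter; E203
# (whitespace before ':') conflicts with black's slice formatting.
extend-ignore = E501, E203
```

Placing this in setup.cfg or .flake8 keeps flake8 and black from disagreeing for everyone running the linter, not just one editor.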
-
Optimization-loop experiments from last week, using a workflow similar to the "Anisotropic Metric Tutorial":
-
Min, here are the results from the experiment we brainstormed. I went back and refactored so the loop can use either a single initial probe point or all previous probe points, and the results improved. I would like to discuss the improvement, confirm that I have the correct treatments, and do a quick code review to make sure the workflow in the opt-loop chassis is correct.
Treatment 1 (baseline): brute-force workflow following the Anisotropic Metric Tutorial, using a single optimization step with a single initial probe point.
Treatment 2a (opt loop): each loop iteration creates a new optimizer instance, initializes it with a new objective function, and does not probe previous points.
Treatment 2b (opt loop): each loop iteration creates a new optimizer instance, initializes it with a new objective function, and probes previous points.
Treatment 2c (opt loop): each loop iteration reuses the optimizer instance, set
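A minimal control-flow sketch of the three opt-loop treatments. The `Optimizer` class and the candidate schedule are hypothetical stand-ins for the chassis code, not the real API:

```python
class Optimizer:
    """Hypothetical stand-in that just records its evaluations."""

    def __init__(self, objective):
        self.objective = objective
        self.evals = []  # (candidate, score) pairs seen by this instance

    def probe(self, x):
        self.evals.append((x, self.objective(x)))

    def step(self, x):
        # toy: the caller supplies the next candidate directly
        self.probe(x)


def objective(x):
    # made-up surrogate loss, peaked at x = 1.0
    return -((x - 1.0) ** 2)


candidates = [0.0, 0.5, 1.0, 2.0]

# Treatment 2a: fresh optimizer each iteration, no memory of past probes.
for x in candidates:
    opt_2a = Optimizer(objective)
    opt_2a.step(x)  # each instance only ever sees one point

# Treatment 2b: fresh optimizer each iteration, re-probe all past points.
seen = []
for x in candidates:
    opt_2b = Optimizer(objective)
    for past in seen:
        opt_2b.probe(past)  # replay history into the new instance
    opt_2b.step(x)
    seen.append(x)

# Treatment 2c: one optimizer instance reused across every iteration.
opt_2c = Optimizer(objective)
for x in candidates:
    opt_2c.step(x)
```

Note that 2b and 2c end up with the same evaluation history; they differ in whether the surrogate model's internal state is rebuilt from scratch (2b) or carried over (2c).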
-
Here's an idea for a brute force comparison:
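The comment's specific proposal is not preserved above. Purely as an illustration, a brute-force baseline could sweep a hyperparameter grid exhaustively and keep the best objective value for comparison against the opt-loop result; the objective below is a made-up surrogate, not the MuyGPyS loss:

```python
def objective(length_scale):
    # hypothetical surrogate loss, peaked at length_scale = 1.0
    return -((length_scale - 1.0) ** 2)


grid = [round(0.1 * i, 2) for i in range(1, 31)]  # 0.1, 0.2, ..., 3.0
brute_best = max(grid, key=objective)  # exhaustive sweep, no optimizer
```

The grid resolution bounds how close the brute-force answer can get, which makes it a useful yardstick for how many optimizer iterations buy the same accuracy.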
-
@bwpriest Below are the results after merging in the new 2D univariate sampler and the bug fix. I also increased the dataset size and ran on a compute node at UNM CARC to evaluate loop execution time:
-
We are beginning to explore models that change the structure of the data as we optimize, such as the anisotropic distortion and deep embedding kernels. We are also interested in exploring the effects of mini-batching. However, our current optimization chassis does not allow for this. Let's discuss how to generalize the optimization chassis, starting with the Bayesian optimization routine, to support this behavior.
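One way to sketch that direction: rebuild the objective from a fresh mini-batch on every iteration, so a model that re-embeds or re-deforms the data can do so per batch. Everything below (the names, the toy local search) is illustrative, not the MuyGPyS API:

```python
import random


def make_objective(batch):
    # In a real chassis this would rebuild the Bayes-opt target from
    # batched pairwise differences / kernel evaluations for the current
    # model structure; here it is a made-up surrogate peaked at the
    # batch mean.
    def objective(length_scale):
        return -sum((x - length_scale) ** 2 for x in batch) / len(batch)

    return objective


rng = random.Random(0)
data = [rng.gauss(1.0, 0.2) for _ in range(256)]  # synthetic "dataset"

estimate = 0.5  # initial hyperparameter guess
for _ in range(20):
    batch = rng.sample(data, 32)       # fresh mini-batch each iteration
    objective = make_objective(batch)  # objective rebuilt per batch
    # toy inner search: keep the best of three local candidates
    candidates = [estimate - 0.1, estimate, estimate + 0.1]
    estimate = max(candidates, key=objective)
```

The key structural change is that the objective is a per-iteration product of the loop rather than a fixed input to it, which is what both mini-batching and structure-changing models seem to require.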