Taylor Genetic Programming Pretrained Bayesian Symbolic Regression
For my final project in MIT Junior Lab (8.13), I combined methods from two recent research papers on symbolic regression. Taylor genetic programming, proposed by He et al. (arXiv:2205.09751), suggests that approximating the dataset with a Taylor polynomial can yield superior results in symbolic regression. In my own research I have seen the impact of pretraining large transformer models for jet analysis, and I am now exploring new pretraining methods for ML models. Here, I use the Taylor genetic programming approach to pretrain a Bayesian symbolic regression model, proposed by Jin et al. (arXiv:1910.08892).
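To make the Taylor genetic programming idea concrete, below is a minimal sketch (not the paper's implementation) of how one can approximate a dataset y = f(x) with a truncated Taylor polynomial fitted by least squares; the function name and the choice of expansion point are illustrative.

```python
# Hedged sketch: fit coefficients of a truncated Taylor expansion around x0
# to (x, y) samples by ordinary least squares.
import math
import numpy as np

def fit_taylor_poly(x, y, x0=0.0, order=4):
    """Fit c_k in sum_k c_k (x - x0)^k / k!; c_k approximates f^(k)(x0)."""
    # Design matrix whose k-th column is (x - x0)^k / k!
    cols = [(x - x0) ** k / math.factorial(k) for k in range(order + 1)]
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Example: recover the Taylor structure of f(x) = sin(x) near 0
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(x)
print(fit_taylor_poly(x, y))  # approximately [0, 1, 0, -1, 0]
```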
Specifically, I used the distribution of functions and input variables found in solutions from the Taylor genetic programming model to initialize the prior distribution over functions and input variables for the Bayesian symbolic regression model.
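The sketch below illustrates one way such a prior could be built; it is not the project's actual code, and the operator list, helper names, and use of sympy expressions are assumptions for the example. It tallies how often each operator appears in Taylor genetic programming solutions and normalizes the counts into a categorical prior over functions.

```python
# Hedged sketch: derive a prior over functions from Taylor-GP solutions.
from collections import Counter
import sympy as sp

OPERATORS = ["add", "mul", "sin", "cos", "exp", "log", "pow"]  # illustrative set

def operator_counts(exprs):
    """Count operator occurrences across a list of sympy expressions."""
    counts = Counter({op: 0 for op in OPERATORS})
    for expr in exprs:
        for node in sp.preorder_traversal(expr):
            name = type(node).__name__.lower()
            if name in counts:
                counts[name] += 1
    return counts

def prior_from_counts(counts, smoothing=1.0):
    """Normalize counts into a categorical prior, with additive smoothing."""
    total = sum(counts.values()) + smoothing * len(counts)
    return {op: (c + smoothing) / total for op, c in counts.items()}

# Example: hypothetical solutions returned by a Taylor-GP run, parsed into sympy
x = sp.symbols("x")
tgp_solutions = [sp.sin(x) + x**2, sp.exp(x) * sp.cos(x)]
print(prior_from_counts(operator_counts(tgp_solutions)))
```

The same counting idea extends to input variables: tallying how often each variable appears in the Taylor genetic programming solutions gives the initial prior weights over variables for the Bayesian model.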
My slides can be found here.
A recording of my talk is available here.