electric:Documentation generated on March 05, 2013 at 12:01 PM
fuerte:Documentation generated on December 26, 2012 at 03:34 PM
groovy:Documentation generated on October 06, 2014 at 12:09 AM
This package provides an implementation of Gaussian Process regression. It offers a simple interface for building a GP from input and output data; the GP can then estimate the output at any given input location. In addition, gradient-descent-based optimization of the hyperparameters is available.
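The package's own C++ API is not documented here, but the technique it implements can be illustrated independently. The following is a minimal sketch of Gaussian Process regression in Python/NumPy (all function names are hypothetical, not the package's API): a squared-exponential kernel, a fit step that factorizes the noisy covariance matrix, and a predict step that returns the posterior mean and variance at new inputs.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    sq = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * sq / length_scale**2)

def gp_fit(x_train, y_train, noise_var=1e-2, **kern):
    """Factorize K + sigma_n^2 I and precompute alpha = (K + sigma_n^2 I)^-1 y."""
    K = rbf_kernel(x_train, x_train, **kern) + noise_var * np.eye(len(x_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    return L, alpha

def gp_predict(x_test, x_train, L, alpha, **kern):
    """Posterior predictive mean and variance at the test inputs."""
    k_star = rbf_kernel(x_train, x_test, **kern)
    mean = k_star.T @ alpha
    v = np.linalg.solve(L, k_star)
    var = rbf_kernel(x_test, x_test, **kern).diagonal() - np.sum(v**2, axis=0)
    return mean, var
```

For example, fitting noisy samples of a smooth function and predicting at the training inputs reproduces the function closely, with small posterior variance where data is available. A Cholesky factorization is used above for numerical stability; the package itself reportedly relies on an LU factorization contributed for Boost.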
This library was implemented by Christian Plagemann, Jürgen Hess, Axel Rottmann and Jürgen Sturm at the Autonomous Intelligent Systems Lab. It contains code by Gunter Winkler and Konstantin Kutzkow that implements an LU factorization for Boost.
More details on Gaussian Process regression can be found in the freely available book "Gaussian Processes for Machine Learning" by Carl Edward Rasmussen and Chris Williams (MIT Press, 2006), available from http://www.gaussianprocess.org/gpml/chapters.
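The gradient-based hyperparameter optimization mentioned above can also be sketched independently of the package's API. The standard approach (covered in the book referenced above) is gradient ascent on the log marginal likelihood. The sketch below, a hypothetical illustration rather than the package's implementation, optimizes only the kernel length scale, with the signal variance fixed at 1 and the noise variance held constant for brevity.

```python
import numpy as np

def log_marginal_likelihood(x, y, length_scale, noise_var=1e-2):
    """Log marginal likelihood of 1-D data under an RBF-kernel GP,
    plus its analytic gradient with respect to the length scale."""
    sq = (x[:, None] - x[None, :]) ** 2
    K_se = np.exp(-0.5 * sq / length_scale**2)
    K = K_se + noise_var * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    lml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
           - 0.5 * len(x) * np.log(2 * np.pi))
    # Gradient: 0.5 * tr((alpha alpha^T - K^-1) dK/dl), with
    # dK/dl = K_se * sq / l^3 for the squared-exponential kernel.
    dK = K_se * sq / length_scale**3
    K_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(len(x))))
    grad = 0.5 * np.trace((np.outer(alpha, alpha) - K_inv) @ dK)
    return lml, grad

def optimize_length_scale(x, y, l0=0.5, lr=0.02, steps=100):
    """Plain gradient ascent in log(length-scale) space, so the
    length scale stays positive; clamped for numerical safety."""
    theta = np.log(l0)
    for _ in range(steps):
        l = np.exp(theta)
        _, g = log_marginal_likelihood(x, y, l)
        # Chain rule: d lml / d theta = g * l.
        theta = np.clip(theta + lr * g * l, -3.0, 3.0)
    return np.exp(theta)
```

Working in log space is a common trick for positive hyperparameters; a production implementation would typically optimize signal and noise variances jointly and use a line search or a quasi-Newton method instead of a fixed step size.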