  • Print publication year: 2011
  • Online publication date: September 2011

16 - Bayesian Gaussian process models for multi-sensor time series prediction

from V - Nonparametric models



Sensor networks have recently generated a great deal of research interest within the computer and physical sciences, and their use for the scientific monitoring of remote and hostile environments is increasingly commonplace. While early sensor networks were a simple evolution of existing automated data loggers that collected data for later offline analysis, more recent sensor networks typically make current data available through the Internet, and thus are increasingly being used for the real-time monitoring of environmental events such as floods or storms (see [10] for a review of such environmental sensor networks).

Using real-time sensor data in this manner presents many novel challenges. More significantly for us, however, many of the information processing tasks that would previously have been performed offline by the owner or single user of an environmental sensor network (such as detecting faulty sensors, fusing noisy measurements from several sensors, and deciding how frequently readings should be taken) must now be performed in real time on the mobile computers and PDAs carried by the system's many different users, who may have different goals and may be using the sensor readings for very different tasks. Importantly, it may also be necessary to use the trends and correlations observed in previous data to predict the value of environmental parameters into the future, or to predict the reading of a sensor that is temporarily unavailable (e.g. due to a network outage).
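This last point, filling in the reading of a temporarily unavailable sensor from correlations observed in earlier data, is precisely what Gaussian process regression provides. The following is a minimal sketch, not the chapter's own model: it assumes a squared-exponential covariance with illustrative hyperparameters (`length_scale`, `variance`, `noise` are all hypothetical choices) and uses a sine curve as a stand-in for real sensor measurements.

```python
import numpy as np

def sq_exp_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

# Hypothetical scenario: a sensor reported at t_obs, but a network
# outage left a gap at t_new that we wish to predict.
t_obs = np.array([0.0, 1.0, 2.0, 3.0, 5.0, 6.0])
y_obs = np.sin(t_obs)           # synthetic stand-in for real readings
t_new = np.array([4.0])         # the missing time step

noise = 1e-3                    # assumed observation noise variance
K = sq_exp_kernel(t_obs, t_obs) + noise * np.eye(len(t_obs))
K_s = sq_exp_kernel(t_new, t_obs)

# GP posterior mean and variance at the missing time step,
# computed via a Cholesky factorisation for numerical stability.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
mean = K_s @ alpha
v = np.linalg.solve(L, K_s.T)
var = sq_exp_kernel(t_new, t_new) - v.T @ v

print(mean, var)
```

The posterior variance quantifies how much the prediction should be trusted, which is what allows the downstream decisions mentioned above (e.g. how frequently readings should be taken) to be made in a principled way.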

References
[1] P. Abrahamsen. A review of Gaussian random fields and correlation functions. Technical Report 917, Norwegian Computing Center, Box 114, Blindern, N-0314 Oslo, Norway, 1997. 2nd edition.
[2] P. Boyle and M. Frean. Dependent Gaussian processes. In Advances in Neural Information Processing Systems 17, pages 217–224. The MIT Press, 2005.
[3] W. Chu and Z. Ghahramani. Gaussian processes for ordinal regression. Journal of Machine Learning Research, 6:1019–1041, 2005.
[4] N. A. C. Cressie. Statistics for Spatial Data. John Wiley & Sons, 1991.
[5] A. Deshpande, C. Guestrin, S. Madden, J. Hellerstein and W. Hong. Model-driven data acquisition in sensor networks. In Proceedings of the Thirtieth International Conference on Very Large Data Bases (VLDB 2004), pages 588–599, 2004.
[6] E. Ertin. Gaussian process models for censored sensor readings. In IEEE/SP 14th Workshop on Statistical Signal Processing (SSP '07), pages 665–669, 2007.
[7] M. Fuentes, A. Chaudhuri and D. H. Holland. Bayesian entropy for spatial sampling design of environmental data. Environmental and Ecological Statistics, 14:323–340, 2007.
[8] A. Genz. Numerical computation of multivariate normal probabilities. Journal of Computational and Graphical Statistics, 1(2):141–149, 1992.
[9] A. Girard, C. Rasmussen, J. Candela and R. Murray-Smith. Gaussian process priors with uncertain inputs – application to multiple-step ahead time series forecasting. In Advances in Neural Information Processing Systems 16. MIT Press, 2003.
[10] J. K. Hart and K. Martinez. Environmental Sensor Networks: A revolution in the earth system science? Earth-Science Reviews, 78:177–191, 2006.
[11] A. H. Jazwinski. Stochastic Processes and Filtering Theory. Academic Press, 1970.
[12] A. Kapoor and E. Horvitz. On discarding, caching, and recalling samples in active learning. In Uncertainty in Artificial Intelligence, 2007.
[13] A. Krause, C. Guestrin, A. Gupta and J. Kleinberg. Near-optimal sensor placements: maximizing information while minimizing communication cost. In Proceedings of the Fifth International Conference on Information Processing in Sensor Networks (IPSN '06), pages 2–10, Nashville, Tennessee, USA, 2006.
[14] S. M. Lee and S. J. Roberts. Multivariate time series forecasting in incomplete environments. Technical Report PARG-08-03, University of Oxford, December 2008. Available at ∼parg/publications.html.
[15] D. J. C. MacKay. Information-based objective functions for active data selection. Neural Computation, 4(4):590–604, 1992.
[16] D. J. C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2002.
[17] A. O'Hagan. Monte Carlo is fundamentally unsound. The Statistician, 36:247–249, 1987.
[18] A. O'Hagan. Bayes-Hermite quadrature. Journal of Statistical Planning and Inference, 29:245–260, 1991.
[19] M. Osborne and S. J. Roberts. Gaussian processes for prediction. Technical Report PARG-07-01, University of Oxford, September 2007. Available at ∼parg/publications.html.
[20] J. Pinheiro and D. Bates. Unconstrained parameterizations for variance-covariance matrices. Statistics and Computing, 6:289–296, 1996.
[21] C. E. Rasmussen and Z. Ghahramani. Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 15, pages 489–496. The MIT Press, 2003.
[22] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.
[23] M. J. Sasena. Flexibility and efficiency enhancements for constrained global design optimization with Kriging approximations. PhD thesis, University of Michigan, 2002.
[24] S. Seo, M. Wallat, T. Graepel and K. Obermayer. Gaussian process regression: active data selection and test point rejection. In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), volume 3, 2000.
[25] M. L. Stein. Space-time covariance functions. Journal of the American Statistical Association, 100(469):310–322, 2005.
[26] Y. W. Teh, M. Seeger and M. I. Jordan. Semiparametric latent factor models. In Proceedings of the Conference on Artificial Intelligence and Statistics, pages 333–340, 2005.
[27] The MathWorks. MATLAB R2007a, 2007.