Fundamentals of Machine Learning

Thomas P. Trappenberg

Print publication date: 2019

Print ISBN-13: 9780198828044

Published to Oxford Scholarship Online: January 2020

DOI: 10.1093/oso/9780198828044.001.0001



7 Probabilistic regression and Bayes nets

Thomas P. Trappenberg

Oxford University Press

This chapter revisits regression with the inclusion of uncertainty in the data, showing how modern probabilistic machine learning can be formulated. First, a simple stochastic generalization of the linear regression example introduces the formalism. This leads to the important maximum likelihood principle, on which learning will be based. The concept is then generalized to non-linear problems in higher dimensions, and the chapter relates this formulation to Bayes nets. The chapter ends with a discussion of how such a probabilistic approach relates to deep learning.
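As a minimal illustration of the maximum likelihood principle mentioned above (not code from the book), the following sketch fits a probabilistic linear regression y = w0 + w1·x + ε with Gaussian noise ε ~ N(0, σ²) to synthetic data. All names and values here (`true_w`, the noise level, the sample size) are illustrative assumptions; the key point is that for Gaussian noise the maximum likelihood estimate of the weights coincides with the least-squares solution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from an assumed linear model with Gaussian noise.
true_w = np.array([1.0, 2.0])            # intercept, slope (illustrative)
x = np.linspace(0.0, 1.0, 50)
y = true_w[0] + true_w[1] * x + rng.normal(0.0, 0.1, size=x.shape)

# Design matrix with a bias column.
X = np.column_stack([np.ones_like(x), x])

# For Gaussian noise, maximizing the log-likelihood in w is equivalent to
# minimizing the sum of squared errors, so the MLE solves the normal
# equations (X^T X) w = X^T y.
w_mle = np.linalg.solve(X.T @ X, X.T @ y)

# The MLE of the noise variance is the mean squared residual.
residuals = y - X @ w_mle
sigma2_mle = np.mean(residuals ** 2)

print(w_mle, sigma2_mle)
```

With enough data the recovered weights approach the generating ones, which is the sense in which learning here is based on maximizing the likelihood of the observed data under the model.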

Keywords: probabilistic models, Bayesian nets, maximum likelihood, MAP, regularization, causal models
