Fundamentals of Machine Learning

Thomas P. Trappenberg

Print publication date: 2019

Print ISBN-13: 9780198828044

Published to Oxford Scholarship Online: January 2020

DOI: 10.1093/oso/9780198828044.001.0001

Regression and optimization

Chapter: 5 Regression and optimization
Source: Fundamentals of Machine Learning
Author(s): Thomas P. Trappenberg
Publisher: Oxford University Press
DOI: 10.1093/oso/9780198828044.003.0005

This chapter returns to the more theoretical embedding of machine learning in regression. Previous chapters have shown that writing machine learning programs is easy with high-level languages and good machine learning libraries. However, applying such algorithms appropriately, and with superior performance, requires considerable experience and a deeper knowledge of the underlying ideas and algorithms. This chapter therefore takes a step back to consider basic regression in more detail, which in turn forms the foundation for the discussion of probabilistic models in the following chapters. This includes the important discussion of gradient descent as a learning algorithm.

Keywords: linear regression, non-linear regression, gradient descent, backpropagation, automatic differentiation
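
The abstract highlights gradient descent as the chapter's core learning algorithm. As a quick illustration, here is a minimal sketch, not taken from the book, of batch gradient descent fitting a straight line by minimizing the mean squared error; the synthetic data, learning rate, and iteration count are all illustrative assumptions.

    import numpy as np

    # Minimal sketch (illustrative, not the book's code): batch gradient
    # descent for linear regression y ~ w*x + b, minimizing the mean
    # squared error (MSE).
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=100)                 # synthetic inputs (assumed)
    y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(100)   # true line plus noise

    w, b = 0.0, 0.0   # initial parameters (slope, intercept)
    eta = 0.1         # learning rate (assumed value)

    for _ in range(500):
        y_hat = w * x + b
        err = y_hat - y
        # Gradients of MSE = mean(err**2) with respect to w and b
        grad_w = 2.0 * np.mean(err * x)
        grad_b = 2.0 * np.mean(err)
        w -= eta * grad_w
        b -= eta * grad_b

    print(w, b)  # should approach the generating values 2 and 1

In practice, libraries compute such gradients by automatic differentiation and backpropagation, the other techniques listed in the keywords above.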
