Itzhak Gilboa, Larry Samuelson, and David Schmeidler
- Published in print:
- 2015
- Published Online:
- May 2015
- ISBN:
- 9780198738022
- eISBN:
- 9780191801419
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198738022.001.0001
- Subject:
- Economics and Finance, Econometrics
The book describes formal models of reasoning that are aimed at capturing the way that economic agents and decision makers in general think about their environment and make predictions based on their past experience. The focus is on analogies (case-based reasoning) and general theories (rule-based reasoning), and on the interaction between them, as well as between them and Bayesian reasoning. A unified approach allows us to study the dynamics of inductive reasoning in terms of the mode of reasoning that is used to generate predictions.
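As a hedged illustration of the case-based mode of prediction described above (the notation here is illustrative, not necessarily the book's), past cases can score candidate outcomes by similarity-weighted frequency:

```latex
% Sketch, assuming a memory of past cases (x_i, y_i) and a similarity
% function s over problems; both are illustrative choices, not the book's
% formal apparatus. A new problem x ranks each candidate outcome y by the
% total similarity of the past problems in which y occurred:
\[
  S(y \mid x) \;=\; \sum_{i \,:\, y_i = y} s(x, x_i),
\]
% and the prediction is an outcome maximizing S(y | x). Varying how s and
% the case memory are generated is one way to think about moving between
% analogical, rule-based, and Bayesian modes of prediction.
```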
Franc Klaassen and Jan R. Magnus
- Published in print:
- 2014
- Published Online:
- April 2014
- ISBN:
- 9780199355952
- eISBN:
- 9780199395477
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199355952.001.0001
- Subject:
- Economics and Finance, Econometrics
The game of tennis raises many questions that are of interest to a statistician. Is it true that beginning to serve in a set gives an advantage? Are new balls an advantage? Is the seventh game in a set particularly important? Are top players more stable than other players? Do real champions win the big points? These, and many other questions, are formulated as ‘hypotheses’ and tested statistically. The book also discusses how the outcome of a match can be predicted (including while the match is in progress), which points are important and which are not, how to choose an optimal service strategy, and whether a ‘winning mood’ actually exists in tennis. Aimed at readers with some knowledge of mathematics and statistics, the book uses tennis (Wimbledon in particular) as a vehicle to illustrate the power and beauty of statistical reasoning.
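As a hedged sketch of how one such hypothesis could be tested (synthetic counts and a deliberately simplified test, not the authors' analysis or data):

```python
# Sketch: test H0 "beginning to serve in a set gives no advantage", i.e.
# the first server wins the set with probability 0.5. All numbers invented.
from scipy.stats import binomtest

sets_played = 500          # hypothetical number of sets observed
first_server_won = 270     # hypothetical sets won by the first server

result = binomtest(first_server_won, n=sets_played, p=0.5)
print(f"p-value = {result.pvalue:.4f}")
# A serious analysis would, as the book does, control for player quality
# rather than pooling raw set counts across matches.
```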
Tomas Björk
- Published in print:
- 2019
- Published Online:
- February 2020
- ISBN:
- 9780198851615
- eISBN:
- 9780191886218
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198851615.001.0001
- Subject:
- Economics and Finance, Econometrics
The fourth edition of this textbook on the pricing and hedging of financial derivatives, now also including dynamic equilibrium theory, continues to combine sound mathematical principles with economic applications. Concentrating on the probabilistic theory of continuous-time arbitrage pricing of financial derivatives, including stochastic optimal control theory and optimal stopping theory, the book is designed for graduate students in economics and mathematics, combining the necessary mathematical background with a solid economic focus. It includes a solved example for every new technique presented, contains numerous exercises, and suggests further reading in each chapter. Concepts and ideas are discussed not only from a mathematical point of view: the mathematical theory is consistently supplemented with intuitive economic arguments. In the substantially extended fourth edition, Tomas Björk has added completely new chapters on incomplete markets, treating such topics as the Esscher transform, the minimal martingale measure, f-divergences, optimal investment theory for incomplete markets, and good deal bounds. There is also an entirely new part presenting dynamic equilibrium theory, with several chapters on unit net supply endowment models and on the Cox–Ingersoll–Ross equilibrium factor model (including the CIR equilibrium interest rate model). Providing two full treatments of arbitrage theory, the classical delta-hedging approach and the modern martingale approach, the book is written so that the two can be studied independently of each other, giving the less mathematically oriented reader a self-contained introduction to arbitrage and equilibrium theory while allowing the more advanced student to see the full theory in action.
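For orientation, the two treatments of arbitrage theory mentioned above can be summarized in standard notation (a sketch, not the book's development):

```latex
% Delta-hedging approach: under the Black-Scholes model, the price F(t, s)
% of a claim with payoff Phi(S_T) solves the pricing PDE
\[
  \frac{\partial F}{\partial t} + r s \frac{\partial F}{\partial s}
  + \tfrac{1}{2}\sigma^{2} s^{2} \frac{\partial^{2} F}{\partial s^{2}} - r F = 0,
  \qquad F(T, s) = \Phi(s).
\]
% Martingale approach: the same price is a discounted expectation under a
% risk-neutral measure Q,
\[
  \Pi(t) \;=\; e^{-r(T-t)}\, \mathbb{E}^{Q}\!\bigl[\, \Phi(S_T) \mid \mathcal{F}_t \,\bigr].
\]
```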
Luc Bauwens, Michel Lubrano, and Jean-François Richard
- Published in print:
- 2000
- Published Online:
- September 2011
- ISBN:
- 9780198773122
- eISBN:
- 9780191695315
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198773122.001.0001
- Subject:
- Economics and Finance, Econometrics
This book provides up-to-date coverage of the last twenty years of advances in Bayesian inference in econometrics, with an emphasis on dynamic models. It shows how to treat Bayesian inference in non-linear models by integrating the more recent developments in simulation-based numerical integration techniques (such as Markov chain Monte Carlo methods) with the long-available analytical results of Bayesian inference for linear regression models. It thus covers a broad range of relatively recent models for economic time series, such as non-linear models, autoregressive conditional heteroskedastic regressions, and cointegrated vector autoregressive models. It also contains an extensive chapter on unit-root inference from the Bayesian viewpoint. Several examples illustrate the methods.
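As a minimal sketch of the simulation-based integration such methods rely on (a toy model with synthetic data, not one of the dynamic models treated in the book):

```python
# Random-walk Metropolis sampling of the posterior mean of a normal model
# with known unit variance and a flat prior. Everything here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.5, scale=1.0, size=100)    # synthetic data

def log_posterior(mu):
    # Flat prior, so the log posterior is the log likelihood up to a constant.
    return -0.5 * np.sum((y - mu) ** 2)

draws, mu = [], 0.0
for _ in range(5000):
    proposal = mu + rng.normal(scale=0.5)        # symmetric random-walk step
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal                             # accept the move
    draws.append(mu)

print("posterior mean ≈", np.mean(draws[1000:]))  # discard burn-in draws
```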
Anindya Banerjee, Juan J. Dolado, John W. Galbraith, and David Hendry
- Published in print:
- 1993
- Published Online:
- November 2003
- ISBN:
- 9780198288107
- eISBN:
- 9780191595899
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198288107.001.0001
- Subject:
- Economics and Finance, Econometrics
This book considers the econometric analysis of both stationary and non‐stationary processes, which may be linked by equilibrium relationships. It provides a wide‐ranging account of the main tools, techniques, models, concepts, and distributions involved in the modelling of integrated processes (i.e. those that accumulate the effects of past shocks). Since the focus is on equilibrium concepts, including co‐integration and error‐correction, the analysis begins with a discussion of the application of these concepts to stationary empirical models. Later chapters show how integrated processes can be reduced to this case by suitable transformations that take advantage of co‐integrating (equilibrium) relationships. The concepts of co‐integration and error‐correction models are shown to be fundamental in this modelling strategy. Practical modelling advice and empirical illustrations are provided.
Knowledge of econometrics, statistics, and matrix algebra at the level of a final‐year undergraduate or first‐year graduate course in econometrics is sufficient for most of the book. Other mathematical tools are described as they arise.
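In standard single-equation notation (a sketch of the central object, not the book's full multivariate treatment), the error-correction form looks like this:

```latex
% If y_t and x_t are I(1) but cointegrated with long-run relation y = beta x,
% a simple error-correction model is
\[
  \Delta y_t \;=\; \gamma\,\Delta x_t \;-\; \alpha \bigl( y_{t-1} - \beta x_{t-1} \bigr) \;+\; \varepsilon_t ,
  \qquad \alpha > 0,
\]
% so the stationary disequilibrium y_{t-1} - beta x_{t-1} is corrected at
% rate alpha, and every term in the equation is I(0).
```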
Lawrence R. Klein (ed.)
- Published in print:
- 1991
- Published Online:
- October 2011
- ISBN:
- 9780195057720
- eISBN:
- 9780199854967
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195057720.001.0001
- Subject:
- Economics and Finance, Econometrics
One of the most important, and most visible, things economists do is forecast what will happen in the economy. Each year, a number of different groups in the United States use their own econometric models to forecast what will happen to the economy in the coming year. Some economic forecasts are more accurate than others. This book consists of chapters comparing the different models now in use, organized topically rather than by model. The contributors include Roger Brimmer, Ray Fair, Bert Hickman, F. Gerard Adams, and Albert Ando. The editor provides an introduction to the volume.
Robert G. Chambers
- Published in print:
- 2021
- Published Online:
- December 2020
- ISBN:
- 9780190063016
- eISBN:
- 9780190063047
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780190063016.001.0001
- Subject:
- Economics and Finance, Econometrics, Microeconomics
This book uses concepts from optimization theory to develop an integrated analytic framework for treating consumer, producer, and market equilibrium analyses as special cases of a generic optimization problem. The same framework applies to both stochastic and non-stochastic decision settings, so that the latter is recognized as an (important) special case of the former. The analytic techniques are borrowed from convex analysis and variational analysis. Special emphasis is given to generalized notions of differentiability, conjugacy theory, and Fenchel's Duality Theorem. The book shows how virtually identical conjugate analyses form the basis for modeling economic behavior in each of the areas studied.
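As a pointer to the central machinery (standard convex-analysis notation; the economic identifications below are illustrative):

```latex
% The convex conjugate of a function f,
\[
  f^{*}(\mathbf{p}) \;=\; \sup_{\mathbf{x}} \bigl\{ \langle \mathbf{p}, \mathbf{x} \rangle - f(\mathbf{x}) \bigr\},
\]
% yields Fenchel's inequality f(x) + f*(p) >= <p, x>, with equality at an
% optimum (Fenchel's Duality Theorem sharpens this to a min-max identity).
% Illustratively, when f encodes a technology the transform produces
% profit- or cost-type functions; when f encodes preferences it produces
% expenditure- or indirect-utility-type objects, which is the sense in
% which "virtually identical conjugate analyses" recur.
```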
David F. Hendry
- Published in print:
- 1995
- Published Online:
- November 2003
- ISBN:
- 9780198283164
- eISBN:
- 9780191596384
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198283164.001.0001
- Subject:
- Economics and Finance, Econometrics
This systematic and integrated framework for econometric modelling is organized in terms of three levels of knowledge: probability, estimation, and modelling. All necessary concepts of econometrics (including exogeneity and encompassing), models, processes, estimators, and inference procedures (centred on maximum likelihood) are discussed with solved examples and exercises. Practical problems in empirical modelling, such as model discovery, evaluation, and data mining are addressed, and illustrated using the software system PcGive. Background analyses cover matrix algebra, probability theory, multiple regression, stationary and non‐stationary stochastic processes, asymptotic distribution theory, Monte Carlo methods, numerical optimization, and macro‐econometric models. The reader will master the theory and practice of modelling non‐stationary (cointegrated) economic time series, based on a rigorous theory of reduction.
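As a hedged sketch of the kind of Monte Carlo experiment covered in the background analyses (a toy AR(1) study, not an exercise from the book):

```python
# Monte Carlo sketch: small-sample downward bias of OLS in an AR(1) model.
import numpy as np

rng = np.random.default_rng(42)
T, reps, phi = 50, 2000, 0.9
estimates = []
for _ in range(reps):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = phi * y[t - 1] + rng.normal()      # simulate AR(1) path
    x, z = y[:-1], y[1:]
    estimates.append(np.dot(x, z) / np.dot(x, x))  # OLS slope, no intercept
print("mean estimate:", np.mean(estimates), "true phi:", phi)
# The mean estimate falls noticeably below 0.9, illustrating finite-sample bias.
```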
Stephen Bazen
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199576791
- eISBN:
- 9780191731136
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199576791.001.0001
- Subject:
- Economics and Finance, Econometrics
This book presents the standard statistical techniques used by labour economists. It emphasizes both the input and the output of empirical analysis and covers five major topics concerning econometric methods used in labour economics: regression and related methods, choice modelling, selectivity issues, duration analysis, and policy evaluation techniques. Each is presented in terms of model specification, possible estimation problems, diagnostic checking, and interpretation of the output. The book aims to guide practitioners in using the techniques and making sense of the results they produce. It covers methods that are considered ‘standard’ tools in labour economics but are often given only a brief and highly technical treatment in econometrics textbooks.
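As a hedged sketch of one tool in the policy-evaluation family (synthetic data and invented variable names, not an example from the book):

```python
# Difference-in-differences via OLS on simulated wages: the interaction
# coefficient recovers the policy effect (0.8 by construction here).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
treated = rng.integers(0, 2, n)      # treatment-group indicator
post = rng.integers(0, 2, n)         # post-policy period indicator
wage = 10 + 1.0*treated + 0.5*post + 0.8*treated*post + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([treated, post, treated * post]))
fit = sm.OLS(wage, X).fit()
print(fit.params)   # const, treated, post, interaction (≈ 0.8)
```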
Erik Biørn
- Published in print:
- 2016
- Published Online:
- December 2016
- ISBN:
- 9780198753445
- eISBN:
- 9780191815072
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198753445.001.0001
- Subject:
- Economics and Finance, Econometrics
Panel data is a data type increasingly used in research in economics, social sciences, and medicine. Its primary characteristic is that the data variation goes jointly over space (e.g. individuals, firms, countries) and time (e.g. years, months). Panel data allow examination of problems which cannot be handled by cross-section data or time-series data. Panel data analysis is a core field in modern econometrics and multivariate statistics, and studies based on such data occupy a growing part of the field in many other disciplines. The book is intended as a text for master’s/advanced undergraduate courses. It may also be useful for PhD students writing theses in empirical/applied economics and readers doing empirical work on their own. The book attempts to take the reader gradually from simple models and methods in scalar (simple vector) notation to more complex models in matrix notation. Compared to related texts, a distinctive feature is that relatively more attention is given to unbalanced panel data, the measurement error problem, random coefficient approaches, the interface between panel data and aggregation, and the interface between unbalanced panels and truncated and censored data sets. The 12 chapters are intended to be largely self-contained, although there is a natural progression. Most chapters contain commented examples based on genuine data, mainly taken from panel data applications to economics. Although the book, inter alia, through its use of examples, aims primarily at students of economics/econometrics, it may be useful also for readers in social sciences outside economics and in psychology and medicine, provided they have a sufficient background in statistics, notably basic regression analysis and elementary linear algebra.
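The workhorse specification behind that progression, in standard (not necessarily the book's) notation:

```latex
% One-way error-component regression for individual i in period t:
\[
  y_{it} \;=\; \mathbf{x}_{it}'\boldsymbol{\beta} \;+\; \alpha_i \;+\; u_{it},
  \qquad i = 1,\dots,N, \;\; t = 1,\dots,T_i ,
\]
% where alpha_i is an individual-specific (fixed or random) effect; letting
% the time dimension T_i vary across i gives the unbalanced panels to which
% the book devotes particular attention.
```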
David F. Hendry
- Published in print:
- 2000
- Published Online:
- November 2003
- ISBN:
- 9780198293545
- eISBN:
- 9780191596391
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198293542.001.0001
- Subject:
- Economics and Finance, Econometrics
This collection of published papers records the development of an approach to econometric modelling that has reached a highly successful stage. The methodology of modelling ‘observational data’, as opposed to experimental data, which can be replicated, is analysed to highlight the fundamental flaws in various approaches, and the possibilities of others. Criteria for model adequacy are formulated (congruence and encompassing), and alternative approaches to building empirical models are compared on their ability to deliver such models. A typology of models elucidates their properties, and a taxonomy of information sources clarifies testing. Estimation is summarized by an estimator generating equation. The value of exploring the development path is to reveal by attempted applications why many widely used approaches are inadequate. The outcome is to demonstrate the viability of a general‐to‐specific approach that commences from a specification deemed more than adequate to characterize the evidence, and simplifies to a parsimonious representation that captures the main factors. By artificial Monte Carlo simulations on experiments designed by others, the success of that approach is established, leading to automatic model selection by software that can outperform practitioners.
Niels Haldrup, Mika Meitz, and Pentti Saikkonen (eds)
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199679959
- eISBN:
- 9780191760136
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199679959.001.0001
- Subject:
- Economics and Finance, Econometrics
This book is a collection of 14 original research articles presented at the conference Nonlinear Time Series Econometrics, held in Ebeltoft, Denmark, in June 2012. The conference gathered several eminent time series econometricians to celebrate the work and outstanding career of Professor Timo Teräsvirta, one of the leading scholars in the field of nonlinear time series econometrics. The book is divided into four broad themes that all reflect Timo Teräsvirta’s work and methodology: testing for linearity and functional form, specification testing and estimation of nonlinear time series models in the form of smooth transition models, model selection and econometric methodology, and applications within the area of financial econometrics. All these research fields include contributions that represent the state of the art in econometrics, such as testing for neglected nonlinearity in neural network models, time-varying GARCH and smooth transition models, STAR models and common factors in volatility modeling, semi-automatic general-to-specific model selection for nonlinear dynamic models, high-dimensional data analysis for parametric and semi-parametric regression models with dependent data, commodity price modeling, financial analysts’ earnings forecasts based on an asymmetric loss function, local Gaussian correlation and dependence for asymmetric return dependence, and the use of bootstrap aggregation to improve forecast accuracy. Each chapter represents original scholarly work and reflects the intellectual impact that Timo Teräsvirta has had, and will continue to have, on the profession.
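For concreteness, the smooth transition models at the centre of several contributions have the following generic two-regime form (standard LSTAR notation, used here only as an illustration):

```latex
% Logistic smooth transition autoregression (LSTAR):
\[
  y_t \;=\; \boldsymbol{\phi}_1' \mathbf{w}_t
        \;+\; \boldsymbol{\phi}_2' \mathbf{w}_t \, G(s_t; \gamma, c)
        \;+\; \varepsilon_t ,
  \qquad
  G(s_t; \gamma, c) \;=\; \bigl( 1 + e^{-\gamma (s_t - c)} \bigr)^{-1},
\]
% where w_t = (1, y_{t-1}, ..., y_{t-p})', s_t is the transition variable,
% and the model moves smoothly between two regimes as G runs from 0 to 1.
```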
Claus Munk
- Published in print:
- 2013
- Published Online:
- May 2013
- ISBN:
- 9780199585496
- eISBN:
- 9780191751790
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199585496.001.0001
- Subject:
- Economics and Finance, Econometrics
“Financial Asset Pricing Theory” offers a comprehensive overview of the classic and the current research in theoretical asset pricing. Asset pricing is developed around the concept of a state-price deflator which relates the price of any asset to its future (risky) dividends and thus incorporates how to adjust for both time and risk in asset valuation. The willingness of any utility-maximizing investor to shift consumption over time defines a state-price deflator which provides a link between optimal consumption and asset prices that leads to the Consumption-based Capital Asset Pricing Model (CCAPM). A simple version of the CCAPM cannot explain various stylized asset pricing facts, but these asset pricing “puzzles” can be resolved by a number of recent extensions involving habit formation, recursive utility, multiple consumption goods, and long-run consumption risks. Other valuation techniques and modelling approaches (such as factor models, term structure models, risk-neutral valuation, and option pricing models) are explained and related to state-price deflators. The book will serve as a textbook for an advanced course in theoretical financial economics in a PhD or a quantitative Master of Science program. It will also be a useful reference book for researchers and finance professionals. The presentation in the book balances formal mathematical modelling and economic intuition and understanding. Both discrete-time and continuous-time models are covered. The necessary concepts and techniques concerning stochastic processes are carefully explained in a separate chapter so that only limited previous exposure to dynamic finance models is required.
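In standard notation (a sketch of the organizing concept, not the book's full treatment):

```latex
% A state-price deflator zeta prices any asset off its dividends,
\[
  P_t \;=\; \mathbb{E}_t\!\left[ \frac{\zeta_{t+1}}{\zeta_t}
        \bigl( P_{t+1} + D_{t+1} \bigr) \right],
\]
% and for an investor with time-additive utility u and subjective discount
% factor delta, optimality ties the deflator to consumption growth,
% zeta_{t+1}/zeta_t = delta u'(c_{t+1}) / u'(c_t), which is the link between
% optimal consumption and asset prices behind the CCAPM described above.
```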
Aman Ullah
- Published in print:
- 2004
- Published Online:
- August 2004
- ISBN:
- 9780198774471
- eISBN:
- 9780191601347
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198774478.001.0001
- Subject:
- Economics and Finance, Econometrics
This book presents a comprehensive and unified treatment of finite sample theory, and its application to estimators and test statistics used in various econometric models. Time series, cross section, and panel data models are considered. The results are explored for linear and nonlinear models, as well as models with normal and nonnormal errors. The book contains seven chapters. Chapter 1 presents an introduction to finite sample econometrics. Chapter 2 gives methods of obtaining the moments of econometric statistics. Chapter 3 provides methods for analysing distributions. Finite sample results for various econometric models are discussed in Chapters 4-7.
Steve McCorriston (ed.)
- Published in print:
- 2015
- Published Online:
- November 2015
- ISBN:
- 9780198732396
- eISBN:
- 9780191796685
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198732396.001.0001
- Subject:
- Economics and Finance, Macro- and Monetary Economics, Econometrics
This book addresses the important issue of food prices across EU Member States. Although recent attention has focused on events on world commodity markets following the spikes in world prices in 2007–08 and 2011, comparatively little attention has been focused on food price dynamics at the retail level. The contributions to this book address the characteristics of retail food price behaviour and the nature and drivers of price transmission across the EU. First, it reports the characteristics of retail food inflation across the EU and the extent to which it differs from non-food inflation. Second, given the different experience of food inflation across EU Member States, the book details the process of price transmission as shocks from upstream and world markets are passed through the food sector to the retail stage. Third, it addresses how price transmission is determined by various aspects of competition throughout the domestic food sector and how the nature of vertical contracting can determine the process. Finally, the book outlines the potential of high-frequency, product-specific scanner data to address price dynamics and adjustment issues and to measure food price inflation. The research reported here should be of interest to researchers on price transmission and competition issues in the EU and further afield, as well as to policymakers and stakeholders as they seek to make sense of and address regulation issues related to the food sector.
Duo Qin
- Published in print:
- 1997
- Published Online:
- November 2003
- ISBN:
- 9780198292876
- eISBN:
- 9780191596803
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0198292872.001.0001
- Subject:
- Economics and Finance, History of Economic Thought, Econometrics
This book traces the formation of econometric theory during the period 1930–1960. It focuses on how econometrics was formed from mathematical and scientific methods in order to analyse economic problems. The book deals with the advances that were achieved as well as the problems that arose in the course of the practice of econometrics as a discipline. Duo Qin examines the history of econometrics in terms of the basic issues in econometric modelling: the probability foundations, estimation, identification, testing, and model construction and specification. The book describes chronologically how these issues were formalized. Duo Qin argues that, while the probability revolution in econometrics in the early 1940s laid the basis for the systematization of econometric theory, it was actually an incomplete revolution, and its incompleteness underlay various problems and failures that occurred in applying the newly established theory to modelling practice. Model construction and hypothesis testing remained problematic because the basic problem of induction in econometrics was not properly formalized and solved. The book thus links early econometric history with many issues of interest to contemporary developments in econometrics. The story is told from the econometric perspective rather than the usual perspective of the history of economic thought (i.e. presenting the story according to different schools or economic issues), and this approach is clearly reflected in the organization of the chapters.
Amos Golan
- Published in print:
- 2017
- Published Online:
- November 2017
- ISBN:
- 9780199349524
- eISBN:
- 9780199349555
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780199349524.001.0001
- Subject:
- Economics and Finance, Econometrics
This book provides a framework for info-metrics—the science of modeling, inference, and reasoning under conditions of noisy and insufficient information. Info-metrics is an inherently interdisciplinary framework that emerged from the intersection of information theory, statistical inference, and decision-making under uncertainty. It allows us to process the available information with minimal reliance on assumptions that cannot be validated. This book focuses on unifying all information processing and model building within a single constrained optimization framework. It provides a complete framework for modeling and inference, rather than a problem-specific model. The framework evolves from the simple premise that our available information is often insufficient to provide a unique answer for decisions we wish to make. Each decision, or solution, is derived from the available input information along with a choice of inferential procedure. The book contains many multidisciplinary applications that demonstrate the simplicity and generality of the framework in real-world settings: These include initial diagnosis at an emergency room, optimal dose decisions, election forecasting, network and information aggregation, weather pattern analyses, portfolio allocation, inference of strategic behavior, incorporation of prior information, option pricing, and modeling an interacting social system. This book presents simple derivations of the key results that are necessary to understand and apply the fundamental concepts to a variety of problems. Derivations are often supported by graphical illustrations. The book is designed to be accessible for graduate students, researchers, and practitioners across the disciplines, requiring only basic quantitative skills and a little persistence.
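The simplest instance of such a constrained optimization framework is the classical maximum-entropy problem (standard notation; the book's framework generalizes well beyond it):

```latex
% Choose the distribution p that is maximally noncommittal subject to the
% observed moment information:
\[
  \max_{p}\; -\sum_{k} p_k \log p_k
  \quad \text{s.t.} \quad
  \sum_{k} p_k f_m(x_k) = \bar{f}_m \;\; (m = 1,\dots,M),
  \qquad \sum_{k} p_k = 1 ,
\]
% whose solution takes the exponential (Gibbs) form
% p_k ∝ exp( -sum_m lambda_m f_m(x_k) ) with Lagrange multipliers lambda_m.
```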
Anthony Garratt, Kevin Lee, M. Hashem Pesaran, and Yongcheol Shin
- Published in print:
- 2006
- Published Online:
- September 2006
- ISBN:
- 9780199296859
- eISBN:
- 9780191603853
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/0199296855.001.0001
- Subject:
- Economics and Finance, Econometrics
This book provides a comprehensive description of the state of the art in macroeconometric modelling and describes the ‘long-run structural modelling approach’ applied to the modelling of national economies in a global context. The first part of the book discusses the ways in which economic theory and econometric analysis can be brought together to construct a macroeconometric model, in which the long-run relationships are consistent with economic theory and where the short-run dynamics have an interpretation. The discussion considers theoretical as well as practical considerations involved in the model building process, and gives an overview of the econometric methods covering cointegrating VAR analysis and probability forecasting. The second part of the book is devoted to the practical detail of estimating a long-run structural macroeconometric model and is illustrated through various global and national examples, including a step-by-step description of the development of a model of the UK economy. The third part discusses the interpretation and use of long-run structural macroeconometric models, describing the use of the UK model along with illustrations of the modelling approach in investigating regional interdependencies in a global macroeconometric model and other specified issues in a global or national macroeconometric context. Throughout, the book emphasizes the use of macroeconometric modelling in the real world and provides sufficient detail, including discussion of data collection and computer programmes employed, for the techniques that are introduced to be replicated or applied in new contexts.
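In standard vector error-correction notation (a sketch of the econometric core, not the book's full system):

```latex
% Cointegrating VAR in VECM form for a vector z_t of I(1) variables:
\[
  \Delta \mathbf{z}_t \;=\; \boldsymbol{\alpha} \boldsymbol{\beta}' \mathbf{z}_{t-1}
  \;+\; \sum_{i=1}^{p-1} \boldsymbol{\Gamma}_i \, \Delta \mathbf{z}_{t-i}
  \;+\; \boldsymbol{\varepsilon}_t ,
\]
% where the columns of beta are the long-run relationships that the
% long-run structural modelling approach restricts using economic theory,
% and alpha holds the short-run adjustment coefficients.
```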
Duo Qin
- Published in print:
- 2013
- Published Online:
- September 2013
- ISBN:
- 9780199679348
- eISBN:
- 9780191758416
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199679348.001.0001
- Subject:
- Economics and Finance, Econometrics, History of Economic Thought
This book is a sequel to Qin's previous OUP volume, The Formation of Econometrics: A Historical Perspective, which traces the history roughly over the period 1930–1960. The present book focuses on the reformists’ movements, mainly during the 1970s and 1980s. After a background description of the formation and consolidation of the Cowles Commission (CC) paradigm, it traces and analyses the three major methodological attempts to resolve problems of model choice and specification within the CC paradigm. These attempts reoriented the focus of econometric research from internal questions (how to optimally estimate a priori given structural parameters) to external questions (how to choose, design, and specify models). Next, it examines various modelling issues and problems through two case studies: modelling the Phillips curve and modelling business cycles. The third part of the book examines in detail the development of three key aspects of model specification: structural parameters, error terms, and model selection and design procedures. The final chapter uses citation analyses to study the impact of the CC paradigm over three and a half decades (1970–2005). The citation statistics show that the impact has remained extensive and relatively strong in spite of certain weakening signs, implying that the reformative attempts have fallen short of causing a paradigm shift.
Sydney Afriat
- Published in print:
- 2014
- Published Online:
- April 2014
- ISBN:
- 9780199670581
- eISBN:
- 9780191773785
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199670581.001.0001
- Subject:
- Economics and Finance, Econometrics, Microeconomics
A theft amounting to £1 was a capital offence in 1260 and a judge in 1610 affirmed the law could not then be applied since £1 was no longer what it was. Such association of money with a date is well recognized for its importance in very many connections. Thus arises the need to know how to convert an amount at one date into the right amount at another date. In other words, a price index. The longstanding question concerning how such an index should be constructed is known as ‘The Index Number Problem’. The ordinary consumer price index or CPI represents a practical response to the need. The truth of a price index is an issue giving rise to extensive thought and theory to which an impressive number of economists have each contributed. However, there have been hold-ups at a basic level. The approach brings the subject into involvement with constructions on the basis of finite data, in particular of price indices, and of utility, already well known in a form usually referred to as ‘Afriat's Theorem’. But utility is subject to constant returns, also possibly approximate. Despite a general importance for economic life and decades of outstanding professional attention, there had been no resolution of the Index Number Problem, nor had there been a real idea what could be meant by such a resolution. However, the method now proposed does convey what could be meant, and it undoubtedly represents the resolution.
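For reference, the practical response mentioned above takes forms such as the Laspeyres and Paasche indices (standard formulas; the book's argument concerns what a true index beyond such fixed-basket formulae could mean):

```latex
% With prices p and quantities q observed at dates 0 and 1:
\[
  P_L \;=\; \frac{\sum_i p^{1}_i q^{0}_i}{\sum_i p^{0}_i q^{0}_i},
  \qquad
  P_P \;=\; \frac{\sum_i p^{1}_i q^{1}_i}{\sum_i p^{0}_i q^{1}_i},
\]
% and part of the Index Number Problem is that such formulae generally
% disagree, bounding rather than determining the conversion between money
% at the two dates.
```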