Heather Joshi

Heather Joshi is Professor of Economic Demography and Deputy Director of the Social Statistics Research Unit at City University, London.

  • Unequal Pay for Women and Men

    Unequal Pay for Women and Men

    Evidence from the British Birth Cohort Studies

    Heather Joshi and Pierella Paci

    The authors argue that no amount of training, maternity and parental leave, or child care provisions will change women's economic status if pay treatment remains unequal—if the market values men's time more than women's.

    For most of recorded history, men's pay has tended to be higher than women's. This both reflects and underpins gender roles, with men's authority more highly valued socially as well as economically. In Unequal Pay for Women and Men, Heather Joshi and Pierella Paci look at why gender pay inequality matters. They argue that no amount of training, maternity and parental leave, or child care provisions will change women's economic status if pay treatment remains unequal—if the market values men's time more than women's.

    The book is the result of an extensive study of the relative wages of British men and women between 1978 and 1991. Using two large and extremely detailed longitudinal data sets, one of women and men born in 1946, and the other of women and men born in 1958, the authors examine the evolution of the pay gap over time and evaluate the success of policies designed to establish equal pay.

    Although the book focuses mainly on Britain, the results are of interest to labor economists in other countries, as well as to researchers in other fields studying the changing role of women in the labor force.

    • Hardcover $11.75
    • Paperback $20.00

David Joselit

David Joselit is Professor and Chair in the Department of Art, Film, and Visual Studies at Harvard University. His most recent book, Heritage and Debt: Art in Globalization (MIT Press), received the 2021 Robert Motherwell Book Award.

  • Ambulance Chasers

    Abraham Adams and David Joselit

    A series of photographic diptychs that investigate the behavior of images and offer an account of American precarity.

    Ambulance Chasers offers a series of photographic diptychs by the artist Abraham Adams: on the left, the faces of personal injury lawyers photographed from roadside billboards; on the right, the landscapes they survey. The gesture is a double rotation: each photograph is imagined as the spectator of the other, and in each pairing, the exorbitant promises of the animated lawyers are deflated by their juxtaposition with an often featureless roadside landscape. The ambulance chasers smile, grin, grimace, scowl; their hair is neatly coiffed, slicked back, unnaturally dark. They gaze at country roads, busy highways, empty intersections, blue skies, building sites, and parking lots. They offer assistance—at a price. Adams's conceptual performance and art historian David Joselit's text tell a story of American precarity.

    Joselit's text unrolls alongside the photographs like a long, broken caption. Adams and Joselit conceived their collaboration as an investigation of the behavior and poetics of images—both in the world as billboards and in the book as reproductions—in a visual and textual language quite different from standard theoretical texts. In a long interview, they explore the project's aesthetic and historical concerns, focusing on its hybridization of typologies central to post–World War II photography—the conceptual catalogs best exemplified by the work of Bernd and Hilla Becher and their students, and the “anti-heroic” American landscape, as charted by artists ranging from Ed Ruscha to Lewis Baltz and Robert Adams.

    • Hardcover $49.95
  • Heritage and Debt

    Heritage and Debt

    Art in Globalization

    David Joselit

    How global contemporary art reanimates the past as a resource for the present, combating modern art's legacy of Eurocentrism.

    If European modernism was premised on the new—on surpassing the past, often by assigning it to the “traditional” societies of the Global South—global contemporary art reanimates the past as a resource for the present. In this account of what globalization means for contemporary art, David Joselit argues that the creative use of tradition by artists from around the world serves as a means of combatting modern art's legacy of Eurocentrism. Modernism claimed to live in the future and relegated the rest of the world to the past. Global contemporary art shatters this myth by reactivating various forms of heritage—from literati ink painting in China to Aboriginal painting in Australia—in order to propose new and different futures. Joselit analyzes not only how heritage becomes contemporary through the practice of individual artists but also how a cultural infrastructure of museums, biennials, and art fairs worldwide has emerged as a means of generating economic value, attracting capital and tourist dollars.

    Joselit traces three distinct forms of modernism that developed outside the West, in opposition to Euro-American modernism: postcolonial, socialist realism, and the underground. He argues that these modern genealogies are synchronized with one another and with Western modernism to produce global contemporary art. Joselit discusses curation and what he terms “the curatorial episteme,” which, through its acts of framing or curating, can become a means of recalibrating hierarchies of knowledge—and can contribute to the dual projects of decolonization and deimperialization.

    • Hardcover $40.00
  • Feedback

    Feedback

    Television against Democracy

    David Joselit

    In a world where politics is conducted through images, the tools of art history can be used to challenge the privatized antidemocratic sphere of American television.

    American television embodies a paradox: it is a privately owned and operated public communications network that most citizens are unable to participate in except as passive spectators. Television creates an image of community while preventing the formation of actual social ties because behind its simulated exchange of opinions lies a highly centralized corporate structure that is profoundly antidemocratic. In Feedback, David Joselit describes the privatized public sphere of television and recounts the tactics developed by artists and media activists in the 1960s and 1970s to break open its closed circuit.

    The figures whose work Joselit examines—among them Nam June Paik, Dan Graham, Joan Jonas, Abbie Hoffman, Andy Warhol, and Melvin Van Peebles—staged political interventions within television's closed circuit. Joselit identifies three kinds of image-events: feedback, which can be both disabling noise and rational response—as when Abbie Hoffman hijacked television time for the Yippies with flamboyant stunts directed to the media; the image-virus, which proliferates parasitically, invading, transforming, and even blocking systems—as in Nam June Paik's synthesized videotapes and installations; and the avatar, a quasi-fictional form of identity available to anyone, which can function as a political actor—as in Melvin Van Peebles's invention of Sweet Sweetback, an African-American hero who appealed to a broad audience and influenced styles of Black Power activism. These strategies, writes Joselit, remain valuable today in a world where the overlapping information circuits of television and the Internet offer different opportunities for democratic participation.

    In Feedback, Joselit analyzes such midcentury image-events using the procedures and categories of art history. The trope of figure/ground reversal, for instance, is used to assess acts of representation in a variety of media—including the medium of politics. In a televisual world, Joselit argues, where democracy is conducted through images, art history has the capacity to become a political science.

    • Hardcover $5.75
    • Paperback $19.95
  • Infinite Regress

    Infinite Regress

    Marcel Duchamp 1910-1941

    David Joselit

    In Infinite Regress, David Joselit considers the plurality of identities and practices within Duchamp's life and art between 1910 and 1941, conducting a synthetic reading of his early and middle career.

    There is not one Marcel Duchamp, but several. Within his oeuvre Duchamp practiced a variety of modernist idioms and invented an array of contradictory personas: artist and art dealer, conceptualist and craftsman, chess champion and dreamer, dandy and recluse. In Infinite Regress, David Joselit considers the plurality of identities and practices within Duchamp's life and art between 1910 and 1941, conducting a synthetic reading of his early and middle career. Taking into account underacknowledged works and focusing on the conjunction of the machine and the commodity in Duchamp's art, Joselit notes a consistent opposition between the material world and various forms of measurement, inscription, and quantification. Challenging conventional accounts, he describes the readymade strategy not merely as a rejection of painting, but as a means of producing new models of the modern self.

    • Paperback $35.00
  • Infinite Regress

    Infinite Regress

    Marcel Duchamp 1910–1941

    David Joselit

    This synthetic reading of Marcel Duchamp considers the enigmatic artist's multiple personas and artistic strategies.

    There is not one Marcel Duchamp, but several. Within his oeuvre Duchamp practiced a variety of modernist idioms and invented an array of contradictory personas: artist and art dealer, conceptualist and craftsman, chess champion and dreamer, dandy and recluse. In Infinite Regress, David Joselit considers the plurality of identities and practices within Duchamp's life and art between 1910 and 1941, conducting a synthetic reading of his early and middle career. Taking into account underacknowledged works and focusing on the conjunction of the machine and the commodity in Duchamp's art, Joselit notes a consistent opposition between the material world and various forms of measurement, inscription, and quantification. Challenging conventional accounts, he describes the readymade strategy not merely as a rejection of painting, but as a means of producing new models of the modern self.

    • Hardcover $60.00
  • Utopia Post Utopia

    Configurations of Nature and Culture in Recent Sculpture and Photography

    Fredric Jameson, Alice Jardine, Abigail Solomon-Godeau, Éric Michaud, Elisabeth Sussman, and David Joselit

    Much of the art and art theory of the 1980s has addressed the question Abigail Solomon-Godeau asks in her essay for this book: whether "the art object can carve a place for itself outside the determinations of the already-written, the already-seen, the sign." Utopia Post Utopia takes up the debate on this issue, which has crystallized around the theoretical opposition between nature and culture, or more specifically the analysis of a nature (human and otherwise) that is culturally produced. The book approaches the nature-culture opposition both from the point of view of the lingering nostalgia for an essential nature and from that of the aggressive replacement of "reality" with simulations of both the natural and man-made environment.

    It documents two shows: a sculptural installation conceived by Robert Gober including work by himself, Meg Webster, and Richard Prince; and an exhibition of photography by James Welling, Oliver Wasow, Dorit Cypis, Lorna Simpson, Jeff Wall, and Larry Johnson. In addition to Abigail Solomon-Godeau's contribution, essays by Fredric Jameson, Alice Jardine, Éric Michaud, Elisabeth Sussman, and David Joselit critically examine such issues as the problematic nature of utopian impulses in recent art (Jameson); the question of authenticity (Jardine); the shifting relationship between the represented and real worlds (Michaud); the phenomenon of collaboration and ensemble in recent art production (Sussman); and the meaning of photographic serialization and superimposition (Joselit). Distributed for the Institute of Contemporary Art, Boston, where Elisabeth Sussman is Chief Curator and David Joselit is Curator.

    • Paperback $13.95

Contributor

  • Painting beyond Itself

    Painting beyond Itself

    The Medium in the Post-Medium Condition

    Isabelle Graw and Ewa Lajer-Burcharth

    In response to recent developments in pictorial practice and critical discourse, Painting beyond Itself: The Medium in the Post-Medium Condition seeks new ways to approach and historicize the question of the medium. Reaching back to the earliest theoretical and institutional definitions of painting, this book—based on a conference at Harvard University in 2013—focuses on the changing role of materiality in establishing painting as the privileged practice, discourse, and institution of modernity. Myriad conceptions of the medium and its specificity are explored by an international group of scholars, critics, and artists. Painting beyond Itself is a forum for rich historical, theoretical, and practice-grounded conversation.

    Contributors

    Carol Armstrong, Benjamin H. D. Buchloh, Sabeth Buchmann, René Démoris, Isabelle Graw, David Joselit, Jutta Koether, Ewa Lajer-Burcharth, Jacqueline Lichtenstein, Julie Mehretu, Matt Saunders, Amy Sillman

    Institut für Kunstkritik Series

    • Paperback $19.95
  • Super Vision

    Super Vision

    Nicholas Baume

    Leading contemporary artists, including Bridget Riley, Jeff Koons, Mona Hatoum, Andreas Gursky, and Yoko Ono, explore the ecstatic and the threatening aspects of contemporary visual experience.

    New technology enables super vision—both superhuman visual powers and actual supervision by surveillance. In Super Vision, which accompanies the inaugural exhibit at the new Institute of Contemporary Art, Boston, a broad selection of important works in a variety of media expresses both the ecstatic and the threatening aspects of vision and reveals visual experience as a source of both pleasure and fear. These works reflect the digital era's profound shift in the nature of visuality itself—as computer graphics and imaging, digitization, and virtuality have transformed both the nature of representation and our relationship to it. Among the leading contemporary artists exploring the changing nature of contemporary visual experience in Super Vision are Bridget Riley, Anish Kapoor, and Gabriel Orozco, with works that bend, twist, and dissolve space, leaving us unsure of the boundaries between inside and outside, surface and depth, self and others. Other works by artists including Jeff Koons, Julie Mehretu, and Andreas Gursky, express aspects of virtuality—some explicitly, some more subtly—and explore the changes in the way we see and understand two-dimensional images. Vision in the twenty-first century is potentially everywhere, all the time; there is no way to escape it. Works by Sigmar Polke, Yoko Ono, Tony Oursler, Thomas Ruff, and others respond in complex ways to this disembodied and penetrating quality of vision. The many full-color images in Super Vision are accompanied by essays by exhibition curator Nicholas Baume, art historian David Joselit, and media theorist McKenzie Wark. Copublished with the Institute of Contemporary Art, Boston.

    • Hardcover $8.75

Dale W. Jorgenson

Dale W. Jorgenson is Samuel W. Morris University Professor of Economics at Harvard University.

  • Double Dividend

    Double Dividend

    Environmental Taxes and Fiscal Reform in the United States

    Dale W. Jorgenson, Richard J. Goettle, Mun S. Ho, and Peter J. Wilcoxen

    A rigorous and innovative approach for integrating environmental policies and fiscal reform for the U.S. economy.

    Energy utilization, especially from fossil fuels, creates hidden costs in the form of pollution and environmental damages. The costs are well documented but are hidden in the sense that they occur outside the market, are not reflected in market prices, and are not taken into account by energy users. Double Dividend presents a novel method for designing environmental taxes that correct market prices so that they reflect the true cost of energy. The resulting revenue can be used to reduce the burden of the overall tax system and improve the performance of the economy, creating the double dividend of the title.

    The authors simulate the impact of environmental taxes on the U.S. economy using their Intertemporal General Equilibrium Model (IGEM). This highly innovative model incorporates expectations about future prices and policies. The model is estimated econometrically from an extensive 50-year dataset to incorporate the heterogeneity of producers and consumers. This approach generates confidence intervals for the outcomes of changes in economic policies, a new feature for models used in analyzing energy and environmental policies. These outcomes include the welfare impacts on individual households, distinguished by demographic characteristics, and for society as a whole, decomposed between efficiency and equity.

    • Hardcover $84.00
  • Productivity, Volume 3

    Productivity, Volume 3

    Information Technology and the American Growth Resurgence

    Dale W. Jorgenson, Mun S. Ho, and Kevin Stiroh

    A study of information technology and economic growth since 1995 that tracks the American growth resurgence to its sources within individual industries.

    The American economy has experienced renewed growth since 1995, with this surge rooted in the development and deployment of information technology (IT). This book traces the American growth resurgence to its sources within individual industries, documents the critical role of IT, and shows how U.S. investment in IT has important parallels in other developed countries.

    In analyzing the experience in the United States, the authors identify four IT-producing industries, 17 IT-using industries, and 23 non-IT industries and show that the IT-producing and IT-using industries play a disproportionate role in the American growth resurgence. These industries account for only about 30 percent of US GDP but contributed half of the acceleration in economic growth. The study finds that differences in the relative importance of IT-producing industries in other G7 countries have contributed to wide disparities in the impact of IT on economic growth.

    Productivity, Volume 3 will be of special interest to analysts of the "new economy" and its remarkable persistence through periods of boom and recession.

    • Hardcover $50.00
  • Econometrics, Volume 3

    Econometrics, Volume 3

    Economic Growth in the Information Age

    Dale W. Jorgenson

    Studies of the relation between information technology and economic growth trends.

    The relentless decline in the prices of information technology (IT) has steadily enhanced the role of IT investment as a source of economic growth in the United States. Productivity growth in IT-producing industries has gradually risen in importance, and a productivity revival has taken place in the rest of the economy. In this book Dale Jorgenson shows that IT provides the foundation for the resurgence of American economic growth.

    Information technology rests in turn on the development and deployment of semiconductors—transistors, storage devices, and microprocessors. The semiconductor and IT industries are global in scope, with an elaborate international division of labor. This poses important questions about the American growth resurgence. For example, where is the evidence of the "new economy" in other leading industrialized nations? To address this question, Jorgenson compares the recent growth performance in the G7 countries—Canada, France, Germany, Italy, Japan, the United Kingdom, and the United States. Several important participants in the IT industries, such as South Korea, Malaysia, Singapore, and Taiwan, are newly industrializing economies. What does this portend for the future economic growth of developing countries? Jorgenson analyzes past and future growth trends in China and Taiwan to arrive at a fuller understanding of economic growth in the information age.

    • Hardcover $70.00
    • Paperback $35.00
  • Investment, Volume 3

    Investment, Volume 3

    Lifting the Burden: Tax Reform, the Cost of Capital, and U.S. Economic Growth

    Dale W. Jorgenson and Kun-Young Yun

    A presentation of the cost-of-capital approach for analyzing the economic impact of tax policy.

    This book presents a comprehensive treatment of the cost-of-capital approach for analyzing the economic impact of tax policy. This approach has provided an intellectual impetus for reforms of capital income taxation in the United States and around the world. The cost of capital and the marginal effective tax rate are combined with estimates of substitution possibilities by businesses and households in analyzing tax and spending programs. This makes it possible to evaluate tax reforms and changes in government spending. Studies of the economic impact of tax policies have taken two forms. First, the cost of capital has been incorporated into investment functions in macroeconomic models, which are used to model the short-run responses to tax policy changes. Second, the cost-of-capital approach has been integrated into applied general-equilibrium models used in evaluating the long-run economic effects of tax reforms.

    The cost-of-capital approach suggests two avenues for tax reform. One would retain the income tax base of the existing U.S. tax system, but would equalize tax burdens on all forms of assets as well as average and marginal tax rates on labor income. The other would substitute consumption for income as a tax base, while equating average and marginal tax rates on labor income.

    • Hardcover $15.75
    • Paperback $40.00
  • Econometrics, Volume 1

    Econometrics, Volume 1

    Econometric Modeling of Producer Behavior

    Dale W. Jorgenson

    New methodology for econometricians, based on the dual formulation of the theory of production in terms of prices.

    The objectives of econometric modeling of producer behavior are to determine the nature of substitution among inputs and outputs and of differences in technology, as well as the role of economies of scale in production. Recent advances in methodology, based on the dual formulation of the theory of production in terms of prices, have enabled econometricians to achieve these objectives more effectively. This volume summarizes the economic theory, the econometric methodology, and the empirical findings resulting from the new approach.

    • Hardcover $70.00
  • Growth, Volume 2

    Growth, Volume 2

    Energy, the Environment, and Economic Growth

    Dale W. Jorgenson

    Volume 1: Econometric General Equilibrium Modeling presents an econometric approach to general equilibrium modeling of the impact of economic policies. Earlier approaches were based on the "calibration" of general equilibrium models to a single data point. The obvious disadvantage of calibration is that it requires highly restrictive assumptions about technology and preferences, such as fixed input-output coefficients. These assumptions are contradicted by the massive evidence of energy conservation in response to higher world energy prices, beginning in 1973. The econometric approach to general equilibrium modeling successfully freed economic policy analysis from the straitjacket imposed by calibration.

    As a consequence of changes in energy prices and new environmental policies, a wealth of historical experience has accumulated over the past two decades. Interpreted within the framework of the neoclassical theory of economic growth, this experience provides essential guidelines for future policy formation. Volume 2: Energy, the Environment, and Economic Growth presents a new econometric general equilibrium model of the United States that captures the dynamic mechanisms underlying growth trends and responses to energy and environmental policies. Jorgenson uses the model to analyze the impacts of environmental regulations on US economic growth and tax policies for controlling US emissions of carbon dioxide.

    • Hardcover $70.00
    • Paperback $40.00
  • Growth, Volume 1

    Growth, Volume 1

    Econometric General Equilibrium Modeling

    Dale W. Jorgenson

    Volume 1: Econometric General Equilibrium Modeling presents an econometric approach to general equilibrium modeling of the impact of economic policies. Earlier approaches were based on the "calibration" of general equilibrium models to a single data point. The obvious disadvantage of calibration is that it requires highly restrictive assumptions about technology and preferences, such as fixed input-output coefficients. These assumptions are contradicted by the massive evidence of energy conservation in response to higher world energy prices, beginning in 1973. The econometric approach to general equilibrium modeling successfully freed economic policy analysis from the straitjacket imposed by calibration.

    As a consequence of changes in energy prices and new environmental policies, a wealth of historical experience has accumulated over the past two decades. Interpreted within the framework of the neoclassical theory of economic growth, this experience provides essential guidelines for future policy formation. Volume 2: Energy, the Environment, and Economic Growth presents a new econometric general equilibrium model of the United States that captures the dynamic mechanisms underlying growth trends and responses to energy and environmental policies. Jorgenson uses the model to analyze the impacts of environmental regulations on U.S. economic growth and tax policies for controlling U.S. emissions of carbon dioxide.

    • Hardcover $70.00
  • Welfare, Volume 1

    Welfare, Volume 1

    Aggregate Consumer Behavior

    Dale W. Jorgenson

    This volume presents a new approach to econometric modeling of aggregate consumer behavior. The approach has successfully extricated demand modeling from the highly restrictive framework provided for more than half a century by the model of a representative consumer. Like the representative consumer model that preceded it, the new approach rests on the theory of individual behavior. The centerpiece of the volume is an econometric model of demand obtained by aggregating over a population of utility-maximizing consumers.

    The essential innovation is to incorporate attributes of consumers reflecting heterogeneous preferences into a model of aggregate behavior. Heterogeneity is captured by allowing preferences to depend on the demographic characteristics of households. This model unifies the two principal streams of empirical research on consumer behavior by pooling aggregate time series with cross-section data for individual households and provides a new point of departure for future research.

    • Hardcover $70.00
  • Welfare, Volume 2

    Welfare, Volume 2

    Measuring Social Welfare

    Dale W. Jorgenson

    This volume presents an approach to the evaluation of economic policies through the econometric modeling of aggregate consumer behavior. While the preferences of individual consumers are revealed by their market choices, these preferences can be recovered only by econometric methods, not through the index numbers used in the official statistics. The richer and more robust methodology presented in this volume provides a fruitful point of departure for future policy evaluations.

    The econometric approach replaces ordinal measures of individual welfare that cannot be compared among individuals with cardinal measures that can. These are combined into an indicator of social welfare that reflects principles of horizontal and vertical equity. This approach unifies the measurement of poverty, inequality, and cost and standard of living. It extends the scope of normative economics to a broader range of issues in the evaluation of economic and social policies.

    • Hardcover $70.00
    • Paperback $35.00
  • Investment, Volume 1

    Investment, Volume 1

    Capital Theory and Investment Behavior

    Dale W. Jorgenson

    These studies of the cost of capital will inspire and guide policy-makers who share the goal of making the allocation of capital in a market economy more efficient.

    Volume 1 presents pioneering studies of the cost of capital as a determinant of investment expenditures. The cost of capital summarizes the future consequences of investment essential for current decisions. This concept has become an indispensable tool for studying the dynamics of investment behavior. Both macroeconometric models and intertemporal general equilibrium models have employed the cost of capital as a determinant of short- and long-term investment expenditures. (A simplified version of the standard user-cost formula is sketched after this entry.)

    • Hardcover $15.75
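
    For orientation only, here is a minimal sketch of the user cost of capital in its textbook Hall–Jorgenson form; the notation is standard rather than quoted from this volume. With q the asset price, r the required rate of return, δ the depreciation rate, u the corporate tax rate, z the present value of depreciation allowances per dollar invested, and k the investment tax credit, the rental price of capital services is approximately

    \[
    c \;=\; q\,(r + \delta)\,\frac{1 - k - u\,z}{1 - u},
    \]

    with a fuller version replacing r + δ by r + δ − \dot{q}/q to net out expected asset-price appreciation. Equating this rental price to the marginal product of capital is what links tax parameters to investment incentives in the studies described above.
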
  • Investment, Volume 2

    Investment, Volume 2

    Tax Policy and the Cost of Capital

    Dale W. Jorgenson

    These studies of the cost of capital will inspire and guide policy-makers who share the goal of making the allocation of capital in a market economy more efficient.

    Volume 2 is devoted to the cost of capital approach to tax policy. This approach has supplied an important intellectual impetus for reforms of capital income taxation in the United States and around the world. Widespread applications of the cost of capital and the closely related concept of the marginal effective tax rate are due to the fact that these concepts facilitate the representation of economically relevant features of complex tax statutes in a highly succinct form.

    • Hardcover $70.00
  • Productivity, Volume 1

    Productivity, Volume 1

    Postwar U.S. Economic Growth

    Dale W. Jorgenson

    These two volumes present empirical studies that have permanently altered professional debates over investment and productivity as sources of postwar economic growth in industrialized countries. The distinctive feature of investment is that returns can be internalized by the investor. The most straightforward application of this idea is to investments that create property rights, but these volumes broaden the meaning of capital formation to include investments in education and training.

    Postwar U.S. Economic Growth traces the outstanding postwar performance of the U.S. economy to investments in tangible assets and human capital. This volume provides the starting point for a new consensus on policies to generate growth by stimulating and rewarding investments. These policies will focus on returns that can be internalized by investors, ending the fruitless search for "spillovers" that can generate substantial growth without providing incentives for capital formation.

    • Hardcover $70.00
    • Paperback $35.00
  • Productivity, Volume 2

    Productivity, Volume 2

    International Comparisons of Economic Growth

    Dale W. Jorgenson

    These two volumes present empirical studies that have permanently altered professional debates over investment and productivity as sources of postwar economic growth in industrialized countries. The distinctive feature of investment is that returns can be internalized by the investor. The most straightforward application of this idea is to investments that create property rights, but these volumes broaden the meaning of capital formation to include investments in education and training.

    International Comparisons of Economic Growth focuses on comparisons among industrialized countries. Although Germany and Japan are often portrayed as economic adversaries of the U.S., postwar experiences in all three countries support policies that give high priority to stimulating and rewarding capital formation. In the Asian model of growth exemplified by Japan, investments in tangible assets and human capital are especially critical during periods of rapid growth.

    • Hardcover $70.00
    • Paperback $40.00
  • Technology and Capital Formation

    Dale W. Jorgenson and Ralph Landau

    The contributions in this book bring a wealth of detailed empirical data and an unusually wide range of perspectives—from universities, government, and business—to bear on the exploration of this important interrelationship; they focus, in particular, on the role of capital in the production process.

    Capital formation is the most important source of economic growth, and investment in new capital interacts in key ways with the diffusion of new technology. The contributions in this book bring a wealth of detailed empirical data and an unusually wide range of perspectives—from universities, government, and business—to bear on the exploration of this important interrelationship; they focus, in particular, on the role of capital in the production process. Grouped into three broad categories, they take up the rate of technological advance and investment in computers, the relative efficiency of new and old capital goods, and the translation of capital formation into productive inputs in the private and government sectors of the U.S. economy.

    Dale W. Jorgenson looks at previous research to explain the controversy that began in the 1960s regarding capital as a factor of production. Computer prices are examined extensively and in great detail in two important studies by Ellen Dulberger and Robert Gordon, while Jack Triplett discusses the economic and engineering literature on the subject. Empirical research by Charles Hulten, James Robertson, and Frank Wykoff disproves the hypothesis that deterioration in the efficiency of older capital goods as a result of the 1970s energy crisis explains the subsequent slowdown in production growth. Wykoff offers a particularly rich study of the depreciation of business leased automobiles.

    Other Contributors

    Paul Pieper on the state of construction price statistics; Michael Harper, Ernst Berndt, and David Wood on alternative approaches to measuring the rate of return; John Strong on the market value of debt claims in U.S. financial markets; Dianne and Laurits Christensen, Carl Degen, and Philip Schoech on the U.S. Postal Service; Michael Boskin, Marc Robinson, and John Roberts on estimating federal government capital and net investment; and Ralph Landau on the interrelationship of technology and capital formation.

    • Hardcover $60.00

Contributor

  • Clearer Skies Over China

    Clearer Skies Over China

    Reconciling Air Quality, Climate, and Economic Goals

    Chris P. Nielsen and Mun S. Ho

    A groundbreaking U.S.–Chinese inquiry into the effects of recent air pollution controls and prospective carbon taxes on China's economy and environment.

    China's carbon dioxide emissions now outstrip those of other countries and its domestic air quality is severely degraded, especially in urban areas. Its sheer size and its growing, fossil-fuel-powered economy mean that China's economic and environmental policy choices will have an outsized effect on the global environmental future. Over the last decade, China has pursued policies that target both fossil fuel use and atmospheric emissions, but these efforts have been substantially overwhelmed by the country's increasing energy demands. With a billion citizens still living on less than $4,000 per year, China's energy and environmental policies must be reconciled with the goals of maintaining economic growth and raising living standards.

    This book, a U.S.–Chinese collaboration of experts from Harvard and Tsinghua University, offers a groundbreaking integrated analysis of China's economy, emissions, air quality, public health, and agriculture. It first offers essential scientific context and accessible summaries of the book's policy findings; it then provides the underlying scientific and economic research. These studies suggest that China's recent sulfur controls achieved enormous environmental health benefits at unexpectedly low costs. They also indicate that judicious implementation of carbon taxes could reduce not only China's carbon emissions but also its air pollution more comprehensively than current single-pollutant policies, all at little cost to economic growth.

    • Hardcover $58.00

Michael I. Jordan

Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley, and recipient of the ACM/AAAI Allen Newell Award.

  • Graphical Models

    Graphical Models

    Foundations of Neural Computation

    Michael I. Jordan and Terrence J. Sejnowski

    This book exemplifies the interplay between the general formal framework of graphical models and the exploration of new algorithms and architectures. The selections range from foundational papers of historical importance to results at the cutting edge of research.

    Graphical models use graphs to represent and manipulate joint probability distributions. They have their roots in artificial intelligence, statistics, and neural networks. The clean mathematical formalism of the graphical models framework makes it possible to understand a wide variety of network-based approaches to computation, and in particular to understand many neural network algorithms and architectures as instances of a broader probabilistic methodology. It also makes it possible to identify novel features of neural network algorithms and architectures and to extend them to more general graphical models.

    This book exemplifies the interplay between the general formal framework of graphical models and the exploration of new algorithms and architectures. The selections range from foundational papers of historical importance to results at the cutting edge of research. (A toy factorization sketch follows this entry.)

    Contributors

    H. Attias, C. M. Bishop, B. J. Frey, Z. Ghahramani, D. Heckerman, G. E. Hinton, R. Hofmann, R. A. Jacobs, Michael I. Jordan, H. J. Kappen, A. Krogh, R. Neal, S. K. Riis, F. B. Rodríguez, L. K. Saul, Terrence J. Sejnowski, P. Smyth, M. E. Tipping, V. Tresp, Y. Weiss

    • Paperback $50.00
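
    As a rough illustration of the factorization idea described above (a sketch, not code drawn from the book), the example below builds a three-variable Bayesian network in plain Python and answers a query by summing out the unobserved variable. The variables, probabilities, and function names are invented for this illustration.

```python
# A toy Bayesian network: a graphical model factorizes the joint distribution
# into local conditionals, and queries are answered by summing out variables.
# All variables and probabilities here are invented for illustration.

from itertools import product

# P(rain), P(sprinkler | rain), P(wet | rain, sprinkler); all variables boolean.
p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: {True: 0.01, False: 0.99},   # given rain = True
               False: {True: 0.40, False: 0.60}}  # given rain = False
p_wet = {(True, True): 0.99, (True, False): 0.80,
         (False, True): 0.90, (False, False): 0.00}

def joint(rain, sprinkler, wet):
    """Joint probability via the factorization implied by the graph."""
    p = p_rain[rain] * p_sprinkler[rain][sprinkler]
    p_w = p_wet[(rain, sprinkler)]
    return p * (p_w if wet else 1.0 - p_w)

# Query P(rain = True | wet = True) by brute-force enumeration over sprinkler.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(f"P(rain | wet grass) = {num / den:.3f}")
```

    Exact enumeration like this scales exponentially with the number of variables; structured inference algorithms exploit the graph precisely to avoid it.
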
  • Learning in Graphical Models

    Learning in Graphical Models

    Michael I. Jordan

    Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering—uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.

    This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters—Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.

    • Paperback $80.00
  • Advances in Neural Information Processing Systems 10

    Advances in Neural Information Processing Systems 10

    Proceedings of the 1997 Conference

    Michael I. Jordan, Michael J. Kearns, and Sara A. Solla

    The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented.

    • Hardcover $20.75
  • Advances in Neural Information Processing Systems 9

    Advances in Neural Information Processing Systems 9

    Proceedings of The 1996 Conference

    Michael C. Mozer, Michael I. Jordan, and Thomas Petsche

    The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes neural networks and genetic algorithms, cognitive science, neuroscience and biology, computer science, AI, applied mathematics, physics, and many branches of engineering. Only about 30% of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. All of the papers presented appear in these proceedings.

    • Hardcover $20.75
  • Advances in Neural Information Processing Systems

    Advances in Neural Information Processing Systems

    Proceedings of the First 12 Conferences

    Michael I. Jordan, Yann LeCun, and Sara A. Solla

    The complete twelve-volume proceedings of the Neural Information Processing Systems conferences from 1988 to 1999 on CD-ROM.

    The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This CD-ROM contains the entire proceedings of the twelve Neural Information Processing Systems conferences from 1988 to 1999. The files are available in the DjVu image format developed by Yann LeCun and his group at AT&T Labs. The CD-ROM includes free browsers for all major platforms.

    • CD-ROM $75.00

Contributor

  • An Introduction to Lifted Probabilistic Inference

    An Introduction to Lifted Probabilistic Inference

    Guy Van den Broeck, Kristian Kersting, Sriraam Natarajan, and David Poole

    Recent advances in the area of lifted inference, which exploits the structure inherent in relational probabilistic models.

    Statistical relational AI (StaRAI) studies the integration of reasoning under uncertainty with reasoning about individuals and relations. The representations used are often called relational probabilistic models. Lifted inference is about how to exploit the structure inherent in relational probabilistic models, either in the way they are expressed or by extracting structure from observations. This book covers recent significant advances in the area of lifted inference, providing a unifying introduction to this very active field.

    After providing necessary background on probabilistic graphical models, relational probabilistic models, and learning inside these models, the book turns to lifted inference, first covering exact inference and then approximate inference. In addition, the book considers the theory of liftability and acting in relational domains, which allows the connection of learning and reasoning in relational domains.

    Contributors

    Babak Ahmadi, Hendrik Blockeel, Hung Bui, Yuqiao Chen, Arthur Choi, Jaesik Choi, Adnan Darwiche, Jesse Davis, Rodrigo de Salvo Braz, Pedro Domingos, Daan Fierens, Martin Grohe, Fabian Hadiji, Seyed Mehran Kazemi, Kristian Kersting, Roni Khardon, Angelika Kimmig, Jacek Kisyński, Daniel Lowd, Wannes Meert, Martin Mladenov, Raymond Mooney, Sriraam Natarajan, Mathias Niepert, David Poole, Scott Sanner, Pascal Schweitzer, Nima Taghipour, Guy Van den Broeck

    • Paperback $70.00
  • Log-Linear Models, Extensions, and Applications

    Log-Linear Models, Extensions, and Applications

    Aleksandr Aravkin, Anna Choromanska, Li Deng, Georg Heigold, Tony Jebara, Dimitri Kanevsky, and Stephen J. Wright

    Advances in training models with log-linear structures, with topics including variable selection, the geometry of neural nets, and applications.

    Log-linear models play a key role in modern big data and machine learning applications. From simple binary classification models through partition functions, conditional random fields, and neural nets, log-linear structure is closely related to performance in certain applications and influences fitting techniques used to train models. This volume covers recent advances in training models with log-linear structures, covering the underlying geometry, optimization techniques, and multiple applications. The first chapter shows readers the inner workings of machine learning, providing insights into the geometry of log-linear and neural net models. The other chapters range from introductory material to optimization techniques to involved use cases. The book, which grew out of a NIPS workshop, is suitable for graduate students doing research in machine learning, in particular deep learning, variable selection, and applications to speech recognition. The contributors come from academia and industry, allowing readers to view the field from both perspectives.

    Contributors

    Aleksandr Aravkin, Avishy Carmi, Guillermo A. Cecchi, Anna Choromanska, Li Deng, Xinwei Deng, Jean Honorio, Tony Jebara, Huijing Jiang, Dimitri Kanevsky, Brian Kingsbury, Fabrice Lambert, Aurélie C. Lozano, Daniel Moskovich, Yuriy S. Polyakov, Bhuvana Ramabhadran, Irina Rish, Dimitris Samaras, Tara N. Sainath, Hagen Soltau, Serge F. Timashev, Ewout van den Berg

    • Hardcover $75.00
  • Perturbations, Optimization, and Statistics

    Perturbations, Optimization, and Statistics

    Tamir Hazan, George Papandreou, and Daniel Tarlow

    A description of perturbation-based methods developed in machine learning to augment novel optimization methods with strong statistical guarantees.

    In nearly all machine learning, decisions must be made given current knowledge. Surprisingly, making what is believed to be the best decision is not always the best strategy, even in a supervised learning setting. An emerging body of work on learning under different rules applies perturbations to decision and learning procedures. These methods provide simple and highly efficient learning rules with improved theoretical guarantees. This book describes perturbation-based methods developed in machine learning to augment novel optimization methods with strong statistical guarantees, offering readers a state-of-the-art overview.

    Chapters address recent modeling ideas that have arisen within the perturbations framework, including Perturb & MAP, herding, and the use of neural networks to map generic noise to distributions over highly structured data. They describe new learning procedures for perturbation models, including an improved EM algorithm and a learning algorithm that aims to match moments of model samples to moments of data. They discuss the relation of perturbation models to their traditional counterparts, with one chapter showing that the perturbations viewpoint can lead to new algorithms in the traditional setting. And they consider perturbation-based regularization in neural networks, offering a more complete understanding of dropout and studying perturbations in the context of deep neural networks. (A toy Gumbel-perturbation sketch follows this entry.)

    • Hardcover $60.00
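
    To make the perturbation idea concrete (an illustration, not code from the book), the sketch below uses the Gumbel-max trick, the exact single-variable case behind Perturb & MAP: adding independent Gumbel noise to unnormalized log-potentials and taking an argmax draws exact samples from the corresponding Gibbs distribution. The potentials and sample counts are invented.

```python
# Gumbel-max sketch: perturb log-potentials with Gumbel noise, then maximize.
# For a single discrete variable this samples exactly from softmax(theta);
# for structured models the same recipe with a MAP solver gives approximate
# samples. The potentials below are made up for illustration.

import numpy as np

rng = np.random.default_rng(0)
theta = np.array([2.0, 1.0, 0.5, -1.0])          # unnormalized log-potentials

def perturb_and_max(theta, n_samples):
    """Sample states by perturbing potentials with Gumbel noise and maximizing."""
    gumbel = rng.gumbel(size=(n_samples, theta.size))
    return np.argmax(theta + gumbel, axis=1)

samples = perturb_and_max(theta, 100_000)
empirical = np.bincount(samples, minlength=theta.size) / samples.size
exact = np.exp(theta) / np.exp(theta).sum()      # the Gibbs distribution
print("empirical:", np.round(empirical, 3))
print("exact    :", np.round(exact, 3))
```

    For structured models, the argmax is replaced by a MAP solver and the noise is applied to local potentials, which turns an optimizer into an approximate sampler.
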
  • Advanced Structured Prediction

    Advanced Structured Prediction

    Sebastian Nowozin, Peter V. Gehler, Jeremy Jancsary, and Christoph H. Lampert

    An overview of recent work in the field of structured prediction, the building of predictive machine learning models for interrelated and dependent outputs.

    The goal of structured prediction is to build machine learning models that predict relational information that itself has structure, such as being composed of multiple interrelated parts. These models, which reflect prior knowledge, task-specific relations, and constraints, are used in fields including computer vision, speech recognition, natural language processing, and computational biology. They can carry out such tasks as predicting a natural language sentence, or segmenting an image into meaningful components.

    These models are expressive and powerful, but exact computation is often intractable. A broad research effort in recent years has aimed at designing structured prediction models and approximate inference and learning procedures that are computationally efficient. This volume offers an overview of this recent research in order to make the work accessible to a broader research community. The chapters, by leading researchers in the field, cover a range of topics, including research trends, the linear programming relaxation approach, innovations in probabilistic modeling, recent theoretical progress, and resource-aware learning.

    Contributors Jonas Behr, Yutian Chen, Fernando De La Torre, Justin Domke, Peter V. Gehler, Andrew E. Gelfand, Sébastien Giguère, Amir Globerson, Fred A. Hamprecht, Minh Hoai, Tommi Jaakkola, Jeremy Jancsary, Joseph Keshet, Marius Kloft, Vladimir Kolmogorov, Christoph H. Lampert, François Laviolette, Xinghua Lou, Mario Marchand, André F. T. Martins, Ofer Meshi, Sebastian Nowozin, George Papandreou, Daniel Průša, Gunnar Rätsch, Amélie Rolland, Bogdan Savchynskyy, Stefan Schmidt, Thomas Schoenemann, Gabriele Schweikert, Ben Taskar, Sinisa Todorovic, Max Welling, David Weiss, Thomáš Werner, Alan Yuille, Stanislav Živný

    • Hardcover $65.00
  • Practical Applications of Sparse Modeling

    Practical Applications of Sparse Modeling

    Irina Rish, Guillermo A. Cecchi, Aurelie Lozano, and Alexandru Niculescu-Mizil

    Key approaches in the rapidly developing area of sparse modeling, focusing on its application in fields including neuroscience, computational biology, and computer vision.

    Sparse modeling is a rapidly developing area at the intersection of statistical learning and signal processing, motivated by the age-old statistical problem of selecting a small number of predictive variables in high-dimensional datasets. This collection describes key approaches in sparse modeling, focusing on its applications in fields including neuroscience, computational biology, and computer vision.

    Sparse modeling methods can improve the interpretability of predictive models and aid efficient recovery of high-dimensional unobserved signals from a limited number of measurements. Yet despite significant advances in the field, a number of open issues remain when sparse modeling meets real-life applications. The book discusses a range of practical applications and state-of-the-art approaches for tackling the challenges presented by these applications. Topics considered include the choice of method in genomics applications; analysis of protein mass-spectrometry data; the stability of sparse models in brain imaging applications; sequential testing approaches; algorithmic aspects of sparse recovery; and learning sparse latent models.

    Contributors

    A. Vania Apkarian, Marwan Baliki, Melissa K. Carroll, Guillermo A. Cecchi, Volkan Cevher, Xi Chen, Nathan W. Churchill, Rémi Emonet, Rahul Garg, Zoubin Ghahramani, Lars Kai Hansen, Matthias Hein, Katherine Heller, Sina Jafarpour, Seyoung Kim, Mladen Kolar, Anastasios Kyrillidis, Seunghak Lee, Aurelie Lozano, Matthew L. Malloy, Pablo Meyer, Shakir Mohamed, Alexandru Niculescu-Mizil, Robert D. Nowak, Jean-Marc Odobez, Peter M. Rasmussen, Irina Rish, Saharon Rosset, Martin Slawski, Stephen C. Strother, Jagannadan Varadarajan, Eric P. Xing

    • Hardcover $61.00
  • Optimization for Machine Learning

    Optimization for Machine Learning

    Suvrit Sra, Sebastian Nowozin, and Stephen J. Wright

    An up-to-date account of the interplay between optimization and machine learning, accessible to students and researchers in both communities.

    The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields.

    Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community. (A small proximal-gradient sketch follows this entry.)

    • Hardcover $60.00
    • Paperback $50.00
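
    As a small, self-contained illustration of two of the themes named above, proximal methods and regularized optimization (a sketch on invented data, not an excerpt from the book), the code below runs the proximal gradient method (ISTA) on an l1-regularized least-squares problem. The soft-thresholding step is the proximal operator of the l1 norm.

```python
# Proximal gradient (ISTA) for:  minimize 0.5 * ||X w - y||^2 + lam * ||w||_1
# Data, step size, and regularization strength are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = rng.standard_normal(5)             # sparse ground truth
y = X @ w_true + 0.1 * rng.standard_normal(n)

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam = 5.0
step = 1.0 / np.linalg.norm(X, 2) ** 2          # 1 / Lipschitz constant of the gradient
w = np.zeros(d)
for _ in range(500):
    grad = X.T @ (X @ w - y)                    # gradient of the smooth part
    w = soft_threshold(w - step * grad, step * lam)

print("nonzeros recovered:", np.count_nonzero(np.abs(w) > 1e-6))
```

    Accelerated and stochastic variants build on the same two ingredients: a gradient step on the smooth part of the objective and a proximal step on the regularizer.
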
  • Dataset Shift in Machine Learning

    Dataset Shift in Machine Learning

    Joaquin Quiñonero-Candela, Masashi Sugiyama, Anton Schwaighofer, and Neil D. Lawrence

    An overview of recent efforts in the machine learning community to deal with dataset and covariate shift, which occurs when test and training inputs and outputs have different distributions.

    Dataset shift is a common problem in predictive modeling that occurs when the joint distribution of inputs and outputs differs between training and test stages. Covariate shift, a particular case of dataset shift, occurs when only the input distribution changes. Dataset shift is present in most practical applications, for reasons ranging from the bias introduced by experimental design to the irreproducibility of the testing conditions at training time. (An example is email spam filtering, which may fail to recognize spam that differs in form from the spam the automatic filter has been built on.) Despite this, and despite the attention given to the apparently similar problems of semi-supervised learning and active learning, dataset shift received relatively little attention in the machine learning community until recently. This volume offers an overview of current efforts to deal with dataset and covariate shift. The chapters offer a mathematical and philosophical introduction to the problem; place dataset shift in relation to transfer learning, transduction, local learning, active learning, and semi-supervised learning; provide theoretical views of dataset and covariate shift (including decision-theoretic and Bayesian perspectives); and present algorithms for covariate shift. (A toy importance-weighting sketch follows this entry.)

    Contributors

    Shai Ben-David, Steffen Bickel, Karsten Borgwardt, Michael Brückner, David Corfield, Amir Globerson, Arthur Gretton, Lars Kai Hansen, Matthias Hein, Jiayuan Huang, Choon Hui Teo, Takafumi Kanamori, Klaus-Robert Müller, Sam Roweis, Neil Rubens, Tobias Scheffer, Marcel Schmittfull, Bernhard Schölkopf, Hidetoshi Shimodaira, Alex Smola, Amos Storkey, Masashi Sugiyama

    • Hardcover $45.00
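
    To illustrate the covariate-shift setting defined above (a toy sketch with invented data and densities, not an example from the book), the code below fits a linear model on training inputs drawn from one distribution, evaluates it on test inputs drawn from another, and then corrects the fit by weighting each training example by the density ratio p_test(x)/p_train(x). Here both densities are known Gaussians, so the ratio is exact; estimating it from samples is one of the problems the covariate-shift literature addresses.

```python
# Covariate shift toy example: the input distribution changes between training
# and test, the input-to-output relation does not. Importance weighting by the
# density ratio corrects the training objective. Everything here is invented.

import numpy as np

rng = np.random.default_rng(0)

def f(x):                        # fixed input-to-output relation
    return np.sin(2.0 * x)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Training inputs ~ N(-1, 1), test inputs ~ N(1.5, 0.5); same noisy f(x).
x_tr = rng.normal(-1.0, 1.0, 300)
y_tr = f(x_tr) + 0.1 * rng.normal(size=x_tr.size)
x_te = rng.normal(1.5, 0.5, 300)
y_te = f(x_te) + 0.1 * rng.normal(size=x_te.size)

def fit_line(x, y, w=None):
    """Weighted least-squares fit of y ~ a*x + b via the normal equations."""
    w = np.ones_like(x) if w is None else w
    A = np.stack([x, np.ones_like(x)], axis=1)
    Aw = A * w[:, None]
    coef, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ y, rcond=None)
    return coef

weights = gauss_pdf(x_tr, 1.5, 0.5) / gauss_pdf(x_tr, -1.0, 1.0)

for name, coef in [("unweighted", fit_line(x_tr, y_tr)),
                   ("importance-weighted", fit_line(x_tr, y_tr, weights))]:
    pred = coef[0] * x_te + coef[1]
    print(f"{name:>20s} test MSE: {np.mean((pred - y_te) ** 2):.3f}")
```
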
  • Learning Machine Translation

    Learning Machine Translation

    Cyril Goutte, Nicola Cancedda, Marc Dymetman, and George Foster

    The Internet gives us access to a wealth of information in languages we don't understand. The investigation of automated or semi-automated approaches to translation has become a thriving research field with enormous commercial potential. This volume investigates how Machine Learning techniques can improve Statistical Machine Translation, currently at the forefront of research in the field. The book looks first at enabling technologies—technologies that solve problems that are not Machine Translation proper but are linked closely to the development of a Machine Translation system. These include the acquisition of bilingual sentence-aligned data from comparable corpora, automatic construction of multilingual name dictionaries, and word alignment. The book then presents new or improved statistical Machine Translation techniques, including a discriminative training framework for leveraging syntactic information, the use of semi-supervised and kernel-based learning methods, and the combination of multiple Machine Translation outputs in order to improve overall translation quality.

    Contributors Srinivas Bangalore, Nicola Cancedda, Josep M. Crego, Marc Dymetman, Jakob Elming, George Foster, Jesús Giménez, Cyril Goutte, Nizar Habash, Gholamreza Haffari, Patrick Haffner, Hitoshi Isahara, Stephan Kanthak, Alexandre Klementiev, Gregor Leusch, Pierre Mahé, Lluís Màrquez, Evgeny Matusov, I. Dan Melamed, Ion Muslea, Hermann Ney, Bruno Pouliquen, Dan Roth, Anoop Sarkar, John Shawe-Taylor, Ralf Steinberger, Joseph Turian, Nicola Ueffing, Masao Utiyama, Zhuoran Wang, Benjamin Wellington, Kenji Yamada

    • Hardcover $45.00
  • Large-Scale Kernel Machines

    Large-Scale Kernel Machines

    Léon Bottou, Olivier Chapelle, Dennis DeCoste, and Jason Weston

    Solutions for learning from large scale datasets, including kernel learning algorithms that scale linearly with the volume of the data and experiments carried out on realistically large datasets.

    Pervasive and networked computers have dramatically reduced the cost of collecting and distributing large datasets. In this context, machine learning algorithms that scale poorly could simply become irrelevant. We need learning algorithms that scale linearly with the volume of the data while maintaining enough statistical efficiency to outperform algorithms that simply process a random subset of the data. This volume offers researchers and engineers practical solutions for learning from large scale datasets, with detailed descriptions of algorithms and experiments carried out on realistically large datasets. At the same time it offers researchers information that can address the relative lack of theoretical grounding for many useful algorithms. After a detailed description of state-of-the-art support vector machine technology, an introduction to the essential concepts discussed in the volume, and a comparison of primal and dual optimization techniques, the book progresses from well-understood techniques to more novel and controversial approaches. Many contributors have made their code and data available online for further experimentation. Topics covered include fast implementations of known algorithms, approximations that are amenable to theoretical guarantees, and algorithms that perform well in practice but are difficult to analyze theoretically.
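
    To illustrate what "scaling linearly with the volume of the data" can look like in the primal, the sketch below trains a linear SVM by stochastic subgradient descent on the regularized hinge loss, in the spirit of Pegasos-style methods. It is an illustrative baseline, not the book's code; the data, step-size schedule, and regularization constant are assumptions.

    ```python
    import numpy as np

    def sgd_linear_svm(X, y, lam=0.01, epochs=5, seed=0):
        """Stochastic subgradient descent on the regularized hinge loss
        (Pegasos-style); each epoch is a single pass over the data."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, t = np.zeros(d), 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)                 # decreasing step size
                margin = y[i] * (w @ X[i])
                w *= (1.0 - eta * lam)                # shrinkage from the L2 regularizer
                if margin < 1.0:                      # hinge-loss subgradient
                    w += eta * y[i] * X[i]
        return w

    # Illustrative linearly separable-ish data with label noise.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 5))
    y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0]) + 0.1 * rng.normal(size=1000))
    w = sgd_linear_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X @ w) == y))
    ```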

    Contributors Léon Bottou, Yoshua Bengio, Stéphane Canu, Eric Cosatto, Olivier Chapelle, Ronan Collobert, Dennis DeCoste, Ramani Duraiswami, Igor Durdanovic, Hans-Peter Graf, Arthur Gretton, Patrick Haffner, Stefanie Jegelka, Stephan Kanthak, S. Sathiya Keerthi, Yann LeCun, Chih-Jen Lin, Gaëlle Loosli, Joaquin Quiñonero-Candela, Carl Edward Rasmussen, Gunnar Rätsch, Vikas Chandrakant Raykar, Konrad Rieck, Vikas Sindhwani, Fabian Sinz, Sören Sonnenburg, Jason Weston, Christopher K. I. Williams, Elad Yom-Tov

    • Hardcover $50.00
  • Predicting Structured Data

    Predicting Structured Data

    Gökhan BakIr, Thomas Hofmann, Bernhard Schölkopf, Alexander J. Smola, Ben Taskar, and S.V.N Vishwanathan

    State-of-the-art algorithms and theory in a novel domain of machine learning, prediction when the output has structure.

    Machine learning develops intelligent computer systems that are able to generalize from previously seen examples. A new domain of machine learning, in which the prediction must satisfy the additional constraints found in structured data, poses one of machine learning's greatest challenges: learning functional dependencies between arbitrary input and output domains. This volume presents and analyzes the state of the art in machine learning algorithms and theory in this novel field. The contributors discuss applications as diverse as machine translation, document markup, computational biology, and information extraction, among others, providing a timely overview of an exciting field.
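
    A minimal example of structured prediction is sequence labeling with a structured perceptron: decode the highest-scoring tag sequence with Viterbi, then update the weights toward the gold structure's features and away from the predicted one's. The sketch below uses a deliberately tiny vocabulary, tag set, and feature map; it illustrates the general recipe, not any particular chapter's method.

    ```python
    import numpy as np

    LABELS = ["N", "V"]                               # toy tag set
    VOCAB = ["dogs", "bark", "cats", "sleep"]         # toy vocabulary

    def feature_index(kind, a, b):
        """Map emission ('word', word, tag) and transition ('trans', tag, tag)
        features to fixed positions in a dense weight vector."""
        if kind == "word":
            return VOCAB.index(a) * len(LABELS) + LABELS.index(b)
        offset = len(VOCAB) * len(LABELS)
        return offset + LABELS.index(a) * len(LABELS) + LABELS.index(b)

    N_FEATS = len(VOCAB) * len(LABELS) + len(LABELS) ** 2

    def viterbi(words, w):
        """Highest-scoring tag sequence under emission + transition scores."""
        n, k = len(words), len(LABELS)
        score = np.full((n, k), -np.inf)
        back = np.zeros((n, k), dtype=int)
        for j in range(k):
            score[0, j] = w[feature_index("word", words[0], LABELS[j])]
        for i in range(1, n):
            for j in range(k):
                emit = w[feature_index("word", words[i], LABELS[j])]
                cands = [score[i - 1, p] + w[feature_index("trans", LABELS[p], LABELS[j])]
                         for p in range(k)]
                back[i, j] = int(np.argmax(cands))
                score[i, j] = emit + max(cands)
        tags = [int(np.argmax(score[-1]))]
        for i in range(n - 1, 0, -1):                 # follow back-pointers
            tags.append(back[i, tags[-1]])
        return [LABELS[j] for j in reversed(tags)]

    def features(words, tags):
        """Count vector of the emission and transition features that fire."""
        phi = np.zeros(N_FEATS)
        for i, (word, tag) in enumerate(zip(words, tags)):
            phi[feature_index("word", word, tag)] += 1
            if i > 0:
                phi[feature_index("trans", tags[i - 1], tag)] += 1
        return phi

    data = [(["dogs", "bark"], ["N", "V"]), (["cats", "sleep"], ["N", "V"])]
    w = np.zeros(N_FEATS)
    for _ in range(5):                                # structured perceptron updates
        for words, gold in data:
            pred = viterbi(words, w)
            if pred != gold:
                w += features(words, gold) - features(words, pred)
    print(viterbi(["dogs", "sleep"], w))
    ```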

    Contributors Yasemin Altun, Gökhan Bakir, Olivier Bousquet, Sumit Chopra, Corinna Cortes, Hal Daumé III, Ofer Dekel, Zoubin Ghahramani, Raia Hadsell, Thomas Hofmann, Fu Jie Huang, Yann LeCun, Tobias Mann, Daniel Marcu, David McAllester, Mehryar Mohri, William Stafford Noble, Fernando Pérez-Cruz, Massimiliano Pontil, Marc'Aurelio Ranzato, Juho Rousu, Craig Saunders, Bernhard Schölkopf, Matthias W. Seeger, Shai Shalev-Shwartz, John Shawe-Taylor, Yoram Singer, Alexander J. Smola, Sandor Szedmak, Ben Taskar, Ioannis Tsochantaridis, S.V.N Vishwanathan, Jason Weston

    • Hardcover $47.00
    • Paperback $45.00
  • Toward Brain-Computer Interfacing

    Toward Brain-Computer Interfacing

    Guido Dornhege, José del R. Millán, Thilo Hinterberger, Dennis J. McFarland, and Klaus-Robert Müller

    The latest research in the development of technologies that will allow humans to communicate, using brain signals only, with computers, wheelchairs, prostheses, and other devices.

    Interest in developing an effective communication interface connecting the human brain and a computer has grown rapidly over the past decade. The brain-computer interface (BCI) would allow humans to operate computers, wheelchairs, prostheses, and other devices, using brain signals only. BCI research may someday provide a communication channel for patients with severe physical disabilities but intact cognitive functions, a working tool in computational neuroscience that contributes to a better understanding of the brain, and a novel independent interface for human-machine communication that offers new options for monitoring and control. This volume presents a timely overview of the latest BCI research, with contributions from many of the important research groups in the field. The book covers a broad range of topics, describing work on both noninvasive (that is, without the implantation of electrodes) and invasive approaches. Other chapters discuss relevant techniques from machine learning and signal processing, existing software for BCI, and possible applications of BCI research in the real world.
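
    As a rough illustration of a noninvasive BCI pipeline (and only that; the chapters describe far more sophisticated methods), the sketch below band-pass filters synthetic one-channel "EEG" trials around the 10 Hz mu rhythm, uses log band power as a feature, and classifies trials with linear discriminant analysis. The sampling rate, frequency band, and signal model are invented for the example.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 250                      # sampling rate in Hz (illustrative)
    rng = np.random.default_rng(0)

    def synth_trial(mu_amp):
        """Synthetic one-channel EEG trial: noise plus a 10 Hz 'mu rhythm'
        whose amplitude differs between the two imagined-movement classes."""
        t = np.arange(2 * FS) / FS
        return rng.normal(size=t.size) + mu_amp * np.sin(2 * np.pi * 10 * t)

    trials = [synth_trial(0.5) for _ in range(40)] + [synth_trial(2.0) for _ in range(40)]
    labels = np.array([0] * 40 + [1] * 40)

    # Band-pass filter in the 8-12 Hz band and use log band power as the feature.
    b, a = butter(4, [8, 12], btype="bandpass", fs=FS)
    features = np.array([[np.log(np.var(filtfilt(b, a, x)))] for x in trials])

    clf = LinearDiscriminantAnalysis().fit(features, labels)
    print("training accuracy:", clf.score(features, labels))
    ```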

    • Hardcover $49.00
    • Paperback $60.00
  • New Directions in Statistical Signal Processing

    New Directions in Statistical Signal Processing

    From Systems to Brains

    Simon Haykin, Jose C. Principe, Terrence J. Sejnowski, and John McWhirter

    Leading researchers in signal processing and neural computation present work aimed at promoting the interaction and cross-fertilization between the two fields.

    Signal processing and neural computation have separately and significantly influenced many disciplines, but the cross-fertilization of the two fields has begun only recently. Research now shows that each has much to teach the other, as we see highly sophisticated kinds of signal processing and elaborate hierarchical levels of neural computation performed side by side in the brain. In New Directions in Statistical Signal Processing, leading researchers from both signal processing and neural computation present new work that aims to promote interaction between the two disciplines.

    The book's 14 chapters, almost evenly divided between signal processing and neural computation, begin with the brain and move on to communication, signal processing, and learning systems. They examine such topics as how computational models help us understand the brain's information processing, how an intelligent machine could solve the "cocktail party problem" with "active audition" in a noisy environment, graphical and network structure modeling approaches, uncertainty in network communications, the geometric approach to blind signal processing, game-theoretic learning algorithms, and observable operator models (OOMs) as an alternative to hidden Markov models (HMMs).
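
    The "cocktail party problem" is the classic motivating example for blind source separation. As a hedged illustration unrelated to the specific algorithms in the chapters, the sketch below mixes two synthetic sources and recovers them (up to scale and permutation) with independent component analysis via scikit-learn's FastICA.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)

    # Two independent "speakers": a sinusoid and a square-ish signal, plus noise.
    s1 = np.sin(2 * np.pi * 1.0 * t)
    s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))
    S = np.column_stack([s1, s2]) + 0.05 * rng.normal(size=(t.size, 2))

    # Two "microphones" record unknown linear mixtures of the sources.
    A = np.array([[1.0, 0.5], [0.4, 1.0]])
    X = S @ A.T

    # Blind source separation: recover the sources up to scale and permutation.
    ica = FastICA(n_components=2, random_state=0)
    S_hat = ica.fit_transform(X)
    print(S_hat.shape, np.round(ica.mixing_, 2))
    ```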

    • Hardcover $11.75
  • Nearest-Neighbor Methods in Learning and Vision

    Nearest-Neighbor Methods in Learning and Vision

    Theory and Practice

    Gregory Shakhnarovich, Trevor Darrell, and Piotr Indyk

    Advances in computational geometry and machine learning that offer new methods for search, regression, and classification with large amounts of high-dimensional data.

    Regression and classification methods based on similarity of the input to stored examples have not been widely used in applications involving very large sets of high-dimensional data. Recent advances in computational geometry and machine learning, however, may alleviate the problems in using these methods on large data sets. This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic. It brings together contributions from researchers in theory of computation, machine learning, and computer vision with the goals of bridging the gaps between disciplines and presenting state-of-the-art methods for emerging applications. The contributors focus on the importance of designing algorithms for NN search, and for the related classification, regression, and retrieval tasks, that remain efficient even as the number of points or the dimensionality of the data grows very large. The book begins with two theoretical chapters on computational geometry and then explores ways to make the NN approach practicable in machine learning applications where the dimensionality of the data and the size of the data sets make the naïve methods for NN search prohibitively expensive. The final chapters describe successful applications of an NN algorithm, locality-sensitive hashing (LSH), to vision tasks.
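
    The flavor of locality-sensitive hashing for cosine similarity can be conveyed with random hyperplanes: each table hashes a point by the sign pattern of a few random projections, colliding points become candidate neighbors, and only candidates are compared exactly. The sketch below is a simplified illustration with arbitrary table sizes and synthetic data, not the implementations used in the book's vision experiments.

    ```python
    import numpy as np
    from collections import defaultdict

    rng = np.random.default_rng(0)
    D, N, N_TABLES, N_BITS = 64, 10000, 8, 12          # illustrative sizes

    data = rng.normal(size=(N, D))
    # Each table hashes a point by the sign pattern of a few random projections.
    planes = rng.normal(size=(N_TABLES, N_BITS, D))
    tables = [defaultdict(list) for _ in range(N_TABLES)]

    def code(x, table_idx):
        bits = (planes[table_idx] @ x) > 0
        return bits.tobytes()                           # hashable bucket key

    for i, x in enumerate(data):
        for t in range(N_TABLES):
            tables[t][code(x, t)].append(i)

    def query(q, k=5):
        """Collect bucket collisions across tables, then rank candidates exactly."""
        candidates = set()
        for t in range(N_TABLES):
            candidates.update(tables[t][code(q, t)])
        if not candidates:
            return []
        cand = np.array(sorted(candidates))
        sims = data[cand] @ q / (np.linalg.norm(data[cand], axis=1) * np.linalg.norm(q))
        return cand[np.argsort(-sims)[:k]].tolist()

    q = data[42] + 0.1 * rng.normal(size=D)             # perturbed copy of a stored point
    print(query(q))
    ```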

    • Hardcover $45.00
  • Advances in Minimum Description Length

    Advances in Minimum Description Length

    Theory and Applications

    Peter D. Grünwald, Jay Injae Myung, and Mark A. Pitt

    A source book for state-of-the-art MDL, including an extensive tutorial and recent theoretical advances and practical applications in fields ranging from bioinformatics to psychology.

    The process of inductive inference—to infer general laws and principles from particular instances—is the basis of statistical modeling, pattern recognition, and machine learning. The Minimum Description Length (MDL) principle, a powerful method of inductive inference, holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data—that the more we are able to compress the data, the more we learn about the regularities underlying the data. Advances in Minimum Description Length is a sourcebook that will introduce the scientific community to the foundations of MDL, recent theoretical advances, and practical applications. The book begins with an extensive tutorial on MDL, covering its theoretical underpinnings and practical implications, as well as its various interpretations and underlying philosophy. The tutorial includes a brief history of MDL—from its roots in the notion of Kolmogorov complexity to the beginning of MDL proper. The book then presents recent theoretical advances, introducing modern MDL methods in a way that is accessible to readers from many different scientific fields. The book concludes with examples of how to apply MDL in research settings that range from bioinformatics and machine learning to psychology.
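
    The compression idea can be made concrete with a crude two-part code for polynomial regression: the total description length is approximated as the cost of encoding the model parameters plus the cost of encoding the residuals, and the degree minimizing the sum is preferred. The BIC-like formula below is a rough stand-in used only for illustration, not one of the refined MDL criteria developed in the book.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 60)
    y = 1.0 - 2.0 * x + 3.0 * x ** 2 + 0.2 * rng.normal(size=x.size)   # true degree is 2

    def two_part_code_length(degree):
        """Crude two-part code: (k/2) log n for the k parameters plus
        (n/2) log(RSS/n) for encoding the residuals under a Gaussian model."""
        n, k = x.size, degree + 1
        coeffs = np.polyfit(x, y, degree)
        rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
        return 0.5 * k * np.log(n) + 0.5 * n * np.log(rss / n)

    scores = {d: two_part_code_length(d) for d in range(6)}
    print("selected degree:", min(scores, key=scores.get))
    ```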

    • Hardcover $11.75
  • Exploratory Analysis and Data Modeling in Functional Neuroimaging

    Exploratory Analysis and Data Modeling in Functional Neuroimaging

    Friedrich T. Sommer and Andrzej Wichert

    An overview of theoretical and computational approaches to neuroimaging.

    Functional imaging tools such as fMRI (functional magnetic resonance imaging), PET (positron emission tomography), EEG (electro-encephalogram), and MEG (magneto-encephalogram) allow researchers to record activity in the working brain and draw inferences about how the brain functions. This book provides a survey of theoretical and computational approaches to neuroimaging, including inferential, exploratory, and causal methods of data analysis; theories of cerebral function; and biophysical and computational models of neural nets. It also emphasizes the close relationships between different approaches, for example, between causal data analysis and biophysical modeling, and between functional theories and computational models.

    • Hardcover $48.00
  • Probabilistic Models of the Brain

    Probabilistic Models of the Brain

    Perception and Neural Function

    Rajesh P.N. Rao, Bruno A. Olshausen, and Michael S. Lewicki

    A survey of probabilistic approaches to modeling and understanding brain function.

    Neurophysiological, neuroanatomical, and brain imaging studies have helped to shed light on how the brain transforms raw sensory information into a form that is useful for goal-directed behavior. A fundamental question that is seldom addressed by these studies, however, is why the brain uses the types of representations it does and what evolutionary advantage, if any, these representations confer. It is difficult to address such questions directly via animal experiments. A promising alternative is to use probabilistic principles such as maximum likelihood and Bayesian inference to derive models of brain function.

    This book surveys some of the current probabilistic approaches to modeling and understanding brain function. Although most of the examples focus on vision, many of the models and techniques are applicable to other modalities as well. The book presents top-down computational models as well as bottom-up neurally motivated models of brain function. The topics covered include Bayesian and information-theoretic models of perception, probabilistic theories of neural coding and spike timing, computational models of lateral and cortico-cortical feedback connections, and the development of receptive field properties from natural signals.
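
    A textbook illustration of the Bayesian approach to perception (not drawn from the chapters) is optimal cue combination: when two noisy Gaussian cues report on the same quantity, the posterior mean weights each cue by its precision, so the more reliable cue dominates. The numbers below are invented.

    ```python
    # Two noisy estimates of the same stimulus, e.g. a visual and a haptic cue.
    mu_v, var_v = 10.0, 1.0      # visual cue: mean and variance (illustrative values)
    mu_h, var_h = 12.0, 4.0      # haptic cue: less reliable

    # With Gaussian likelihoods and a flat prior, the posterior is Gaussian with
    # a precision-weighted mean and a combined precision.
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    mu_post = w_v * mu_v + (1 - w_v) * mu_h
    var_post = 1 / (1 / var_v + 1 / var_h)

    print(mu_post, var_post)     # 10.4, 0.8: pulled toward the more reliable visual cue
    ```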

    • Hardcover $58.00
    • Paperback $25.00
  • Advanced Mean Field Methods

    Advanced Mean Field Methods

    Theory and Practice

    Manfred Opper and David Saad

    This book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.

    A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models.

    Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models.

    Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
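
    In its simplest ("naive") form, the mean field method replaces an intractable joint distribution, such as that of an Ising model or Boltzmann machine, with a factorized one whose parameters satisfy the self-consistency equations m_i = tanh(h_i + sum_j J_ij m_j), solvable by damped fixed-point iteration. The sketch below shows only this baseline; the TAP corrections and other refinements discussed in the book add further terms, and the random couplings and fields are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10
    J = rng.normal(scale=0.3 / np.sqrt(n), size=(n, n))
    J = (J + J.T) / 2.0
    np.fill_diagonal(J, 0.0)                 # symmetric couplings, no self-coupling
    h = rng.normal(scale=0.5, size=n)        # external fields

    # Naive mean field: iterate m_i = tanh(h_i + sum_j J_ij m_j) with damping.
    m = np.zeros(n)
    for _ in range(200):
        m_new = np.tanh(h + J @ m)
        if np.max(np.abs(m_new - m)) < 1e-8:
            m = m_new
            break
        m = 0.5 * m + 0.5 * m_new            # damping helps the iteration converge

    print(np.round(m, 3))                    # approximate marginal magnetizations
    ```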

    • Hardcover $11.75
  • Advances in Large-Margin Classifiers

    Advances in Large-Margin Classifiers

    Alexander J. Smola, Peter Bartlett, Bernhard Schölkopf, and Dale Schuurmans

    The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research.

    The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that it is the margin, or confidence level, of a classification—that is, a scale parameter—rather than a raw training error that matters has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
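
    The margin the blurb refers to has a simple concrete form for a linear classifier: an example's geometric margin is its signed distance to the decision boundary, y(w·x + b)/||w||, and large positive margins correspond to confident correct classifications. The weights and data in the sketch below are made up purely to show the computation.

    ```python
    import numpy as np

    # A fixed linear classifier (weights and bias chosen only for illustration).
    w = np.array([2.0, -1.0])
    b = -0.5

    X = np.array([[1.0, 0.0], [0.2, 0.3], [0.0, 2.0]])
    y = np.array([1, 1, -1])

    # Geometric margin: signed distance of each example to the hyperplane w.x + b = 0,
    # positive when the example is on the correct side.
    margins = y * (X @ w + b) / np.linalg.norm(w)
    print(np.round(margins, 3))

    # The hinge loss penalizes examples whose functional margin falls below 1,
    # which is the quantity support vector machine training pushes against;
    # boosting discourages small margins through a different (exponential) loss.
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
    print(np.round(hinge, 3))
    ```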

    • Hardcover $55.00

Ronald W. Jones

Ronald Jones is Professor of Economics, Emeritus, in the Department of Economics at the University of Rochester.

  • Globalization and the Theory of Input Trade

    Globalization and the Theory of Input Trade

    Ronald W. Jones

    Ronald Jones suggests how the basic core of real trade theory can be modified to take into account the increased international mobility of inputs and productive factors.

    As trade liberalization and the fragmentation of production processes promote greater international exchange of inputs, economists must adjust their thinking on trade issues. Transport costs have plummeted, and the difficulties of communicating between locales half a world apart have practically vanished. In this book Ronald Jones suggests how the basic core of real trade theory can be modified to take into account the increased international mobility of inputs and productive factors. He emphasizes the role of country "hinterlands" and how it is related to agglomeration effects in determining the location of economic activity. After discussing the positive aspects of enhanced mobility for output patterns and market prices, Jones evaluates the significance of globalization for governmental trade policies and public attitudes about regional alliances.

    • Hardcover $32.00