This book's title may be a little confusing; one might suspect it offers another study of transhumanism or suggests forms of human enhancement, but this is certainly not the case. It belongs to a different genre, dealing with the influences and changes that technical advances bring to the human condition. Other recent titles have pointed in a similar direction: James Katz, Machines That Become Us (2006); Max Tegmark, Life 3.0: Being Human in the Age of Artificial Intelligence (2017); Hannah Fry, Hello World: How to Be Human in the Age of the Machine (2018); Shoshana Zuboff, The Age of Surveillance Capitalism (2019); and Steffen Mau, The Metric Society: On the Quantification of the Social (2019). All of them share a critical and warning tone in the face of a broad cultural and social trend that affects basic human traits, such as freedom and purpose, and they invite us to rethink humanity in its present and future forms, with strong implications for politics, the economy, and, we could add, religion. Indeed, religious faith will not be the same after the sweeping changes that this growing literature reveals; this is a point that the dialogue between science and religion should not ignore.

To return to the book under review: it is divided into four parts, with fourteen chapters in all, plus five appendices that illustrate or deepen several aspects. The style is roughly a mix of science journalism and social criticism, revealing hidden strategies and moves that end up manipulating and controlling our lives and minds. The book thus becomes a warning against dangerous tendencies, with advice about the resistance required once one has become aware of those trends.

Most of the book's content is devoted to describing the ways in which new technologies influence our lives or, rather, how human lives and activities become part of a huge information system. The concept of “function creep” plays a substantial role in this argument. It is defined as a process that enlarges the uses and applications of a device or program, extending its range and even developing new functions beyond the initial purpose. Through many examples, the book describes how human nature becomes channeled into useful patterns and data sets at the service of commercial and political interests. Personal autonomy and meaning appear threatened in a process in which humanity loses its central status as a subject in control and becomes more like a product, or a means serving more or less implicit aims and interests at other levels. Such a trend clearly implies a downgrading, almost an inversion, of the human condition when compared with the high aspirations that both the Christian and humanist traditions claimed for it.

The point is that the new technologies have made possible levels of surveillance and control, in many personal and social settings, that earlier ideologies or totalitarian regimes could only remotely conceive. To some extent the “panopticon” model, described in Michel Foucault's critical analysis as revealing a dark modern trend, finds its final and almost complete expression in this situation, in which humans become just parts of a totally deterministic system.

Many instances are presented in detail through these pages: an electronic watch that monitors health indicators or other “fitness tracking tools” becomes, successively, a means to track private living standards and to provide useful information for designing tailored advertising campaigns or for ranking potential customers in relation to different business activities, such as health services, sport, or leisure. “Outsourcing” is the label for strategies aimed at better assessing subjects' performance through “third parties,” often new technologies, but their possible utility appears undermined by less evident side effects, such as passivity, ignorance, and diminished responsibility. Even devices that assist in navigation, like GPS, entail these risks, as they become tools for surveillance and involve a loss of privacy. Such processes suggest the “slippery slope” simile: the expanding consequences of a seemingly small action or decision, as it becomes ever easier to keep moving in the same direction, following the path of “progress.”

The relationship of humans with tools lies at the basis of cultural evolution, and science and technology play a big role there. However, when directly applied to real people, the situation can look quite worrying. Take, for instance, Taylorism, a program to increase productivity and efficiency; such a legitimate aim can nonetheless pervert human identity and relationships, instrumentalizing every activity. That trend can be perceived in the way contracts are simplified online to save time and minimize costs, but at the expense of automating human behavior. A similar process emerges in procedures that ostensibly help to expand our minds: they are presented as helping us gain more cognitive power, but at the same time they entail loss of control and dependency. The world of new media likewise becomes suspect: the media shape an “environment” tailored to our expectations and our search for happiness, yet they operate in an interactive way that extracts useful information to be reprocessed, reconfiguring our own environments and reinforcing our tastes, which are thereby rendered algorithmically predictable and easy to nurture and manipulate. In a similar way, the so-called “techno-smart environments” simultaneously facilitate and design many aspects of life in a way that imposes models and parameters on everybody, and even mediate personal relationships under the pretext of optimizing them. Quantification and the use of algorithms, with all their tracking activities, promise to simplify our lives and render them more efficient, but again the side effect is an engineering of our lives and activities for other purposes, commercial or political.

The last chapters of the book offer interesting reflections on the specifics of human intelligence and freedom, and on the limits of automatization, or reduction to a machine-like status. The main dilemma that arises, following Robert Nozick's thought experiment of the “Experience Machine,” which would provide disembodied subjective happiness, is well expressed in a question: “Would maximizing human happiness at minimal social cost, through the Experience Machine n.o [the updated Internet-linked version], justify forcing everyone to live a fully determined life?” (p. 259). As expected, the book ends with an appeal that we rebel against these trends and regain freedom and control, once more affirming humanistic values on the basis of practical strategies and means to counter the dominant culture.

Religion is completely omitted from this otherwise insightful and thought-provoking analysis. It is worth wondering about such a symptomatic absence and how it might be connected with the issues at stake. At first sight, it seems a sign of deep secularization in the culture under examination: religion does not play any role in such a highly developed model. We could even conceive applications in the same vein to build user-friendly environments for religious customers, where religious services become one more human activity tailored, and hence controlled, for “right” and profitable consumption. Yet this is not the real point. Moving deeper, the issue that arises from the worrisome panorama depicted in these pages is whether human values and humanistic models become more endangered when less religion remains, or when it is scarcely available as a resource against dehumanizing processes. Perhaps this is not just a coincidence: the automatized society that provides all we need may be the only utopia left after religion and its future hopes have faded away or been dismissed as redundant. That perception should promote a more consciously critical role for religious faith in these new contexts, in which science and technology cannot fix everything and are instead suspected of being involved in dangerous practices. The situation described, and its challenges, bring to the forefront the function of religion in advanced societies and its capacity to address issues that other social systems cannot tackle on their own.