Sunday, 29 November 2020

Reflection - Helena Matos

Helena Matos (Observador)


The demagoguery surrounding police violence

Do we question the SNS every time a doctor fails? Do we call justice into question because of one corrupt judge? Why, then, the distrust of the police forces every time an officer errs?

29 nov 2020, 08:08


Of the thousand places opened this year in the recruitment competition for PSP officers, only 793 were filled. Nothing like this has ever happened in our country. Usually the number of approved candidates far exceeded the number of places, which led to the creation of recruitment reserves, with the approved candidates kept on as surplus. Now the candidates are not even enough to fill the places, let alone to maintain reserves.


Why did this happen? Some of the most frequent and most direct explanations have to do with material questions, above all at the start of the career: the low pay, or being posted far from home. But not only that. One need only follow the news and the furies let loose on social media to see that it will be ever harder for anyone to choose to be a police officer.

Anyone who reads, watches, or listens to our news has reason to believe there is a problem with police authorities in this democratic world of ours. (And only in this one, of course, because repression in dictatorships such as China, Turkey, or Cuba is accepted as a fatality.) As I write, France is once again in turmoil over the new security law put forward by the government, all the more so in the very days when a video came to light showing a man being beaten by three police officers. In Portugal, the conviction of eight officers from the Alfragide police station was upheld; according to the facts found proven by the Lisbon Court of Appeal, on 5 October 2015 they assaulted six young men. In Spain, with chaos reigning in the Canary Islands after the arrival this year of more than 18,000 migrants from Morocco, the failings of the police authorities in controlling immigration are exploited to paper over the resounding political error of the Sánchez government, which welcomed the "refugees" of the Aquarius with pomp and propaganda: the immigration mafias realized they had a free route, and they use it.

Here at home, those 207 places left unfilled in the PSP admission competition point to a malaise that no one talks about. Or wants to talk about. After all, in the bubble-world of journalists, politicians, and activists (that bubble-world where one glides along cycle paths rather than riding packed trains, where one hears only world music and never the unsettling sound of footsteps in a metro corridor, where one shops in traditional boutiques rather than in discount supermarkets inside which the security guards, almost always black, seem ever taller), in that bubble-world, I repeat, the police make the news only when they err, as happened at the Alfragide station. Otherwise, for instance when they are lured into ambushes, they are left with the Correio da Manhã and a few brief news items so that the fact-checkers can certify that the subject was covered.


Any news of an abuse committed by police, whether in Portugal, in France, or in the USA, leads, as the next step, to the questioning of police authority itself. Immediate dismissal or on-the-spot expulsion of the officers involved is the minimum demanded. The step after that has been to present the absence of policing as something positive: «In Seattle's "Autonomous Zone" the police are not in charge, food is free and ideas flow», one could read this year in the Expresso during the unrest in the USA. As was more than predictable, not many days passed before the law of the strongest took hold in those police-free zones: looting, shootings, sexual assaults... kept happening, but for want of a police officer, preferably white, to blame, that violence stirred no great commotion. In France, zones where "the police are not in charge" and do not even enter have proliferated: the so-called lost territories of the Republic. The alarm only sounded in the institutions when President Macron had to be extracted from a theater by the very police whose work he had earlier disparaged.

Let us be clear: police violence obviously exists. And so do abuses committed by police officers. But we must also understand that, riding on the back of cases of police violence, there is an attempt under way to delegitimize police authorities in democratic countries. If to this, which is already no small thing, we add the racialization of everything that happens, we have a hell ready to serve. Faced with such a picture, who applies to the PSP knowing that in their first years of service they may be posted to the outskirts of Lisbon on a base salary of under 800 euros? And, more important still, what is life like in those peripheries when policing fails? Or in that interior which, seen from the bubble-world, looks like a bucolic postcard, but has police stations and GNR posts closed at night for lack of vehicles and men?

The next time we hear it repeated like an unquestionable dogma – there is no freedom without security – we must not forget: this year, of the thousand places opened for PSP officers, only 793 were filled. What will it be like next year?

PS. Eleven million euros for a conference that will be no more than an online event? The Lisbon City Council and the Portuguese Government have decided to keep the payment of 11 million euros for the 2020 edition of the Web Summit. Except that the 2020 Web Summit will be nothing but a digital gathering. Seriously, is no one going to invoke the contract clause that allows the payment to be suspended in a case of force majeure, such as a pandemic? Someone is making fun of Portuguese taxpayers!


Comment (Rui Lourenço M)

The proverbial rain falling on already-wet ground.

How many times has the problem of undermined authority been "thrown" onto the table?
A creature who is serving as minister (on a fixed term) because he was appointed to it, and who says, at a press conference with a general standing by, that military matters are too serious a business to be managed by the military themselves. And the imbecile, in uniform with his trinketry on his chest, keeps up the oafish smile of a man grateful for the concession of being allowed to consume a little oxygen and occupy some space (the strictly necessary amount, naturally); of course, I presume that, back at his workplace - I don't think it can be called a command post - he will go on bucolically signing administrative paperwork drawn up in some secretariat of state or other.
How many times was the chaos that the PSP was turning into "thrown" onto the table, along with the way it was dragging the Guard into the same quagmire?
A Police force/Guard that pulls staff out of Criminal Investigation to act as process servers?
A Guard that puts out fires, literally?
A Guard that "thinks" it is keeping up with the times by running campaigns made by idiots for idiots?
Community policing while super-stations are being set up?
Ah, of course. It would be to cut administrative staff and concentrate resources, right...
Mobile patrolling, by car... I would like some enlightened soul to explain to me what an officer of the law sees while driving, windows up (I want to presume because of the cold), along a road he is supposedly patrolling.
In the interior, foot and horseback patrols came to an end and the jeep took over. That restricted patrolling to the dirt tracks the jeep could reach. Then even the jeep became uncomfortable, so they moved on to passenger cars, with air conditioning and a radio - the radio is very important (the commercial radio, that is) - and patrolling shrank to the main roads.
Cretinous ideas, rebutted by those who "walk the beat," but accepted and imposed by those same trinket-bearers who never set foot on the street.
Community policing was the beat officer from the neighborhood station. It was the night watchman who at the start of his shift went to "draw" the 6.35 pistol, for show. Because the real deterrent was the wooden truncheon, duly covered in leather so as not to hurt the citizen with a splinter, and not for aesthetic reasons as some have thought.
Those men knew the residents, the cars, the habits of the neighborhood. They addressed citizens by name, and when they came across something out of the ordinary they went to check, knocking on the citizen's door to make sure that the light on and the window ajar at 3 a.m. were just insomnia. There would be a bit of conversation, a coffee, and the beat went on, with everyone satisfied.
Or the relative who went to the Post to "collect" the kid who had "misbehaved." The lad would turn up somewhat softened, with the contrite air of someone not planning to reoffend, and the almost invariable comment from the Post Commander would be: "The boy isn't bad, you just can't give him so much free rein. Keep a hold on the lad so we don't have to be the ones to break him in."
And the "lad" would go home, probably to have the softening finished off by the relative. And everything went fine.
The Officer of the Law was exactly that, and was proud to be one.

Some years ago the Mayor of San Francisco, California, declared in a public speech that the local police controlled the city's main thoroughfares between sunrise and sunset.
They must have introduced this regime of "community policing" some time before we did.
Here in our little burg, the process grew acute when the current double juggler (double for the juggling and for the Malabar) held court as Minister of Internal Administration. For the less attentive: it was when the structure of the Guard was reformed, with the avowed aim of matching it to the PSP's organization.
These cretins' dream is a single police force, wholly dependent on local political structures. And that model has only ever come to pass in totalitarian regimes of the left, with consequences we all know.

But that is what the herd likes.
If nothing changes on this course, hunting season will open one of these days.

Regards

Saturday, 28 November 2020

Dance - Clowns

On RTP2, a show by Hofesh Shechter. In 2012, in London, Diogo took part in one of his shows at the Barbican.





Thursday, 26 November 2020

Reflection - First Things (Believe It or Not) 05.2010

(emphasis mine) - We have to hear and read other ideas...


Believe It or Not

by David Hart
I think I am very close to concluding that this whole “New Atheism” movement is only a passing fad—not the cultural watershed its purveyors imagine it to be, but simply one of those occasional and inexplicable marketing vogues that inevitably go the way of pet rocks, disco, prime-time soaps, and  The Bridges of Madison County. This is not because I necessarily think the current “marketplace of ideas” particularly good at sorting out wise arguments from foolish. But the latest trend in à la mode godlessness, it seems to me, has by now proved itself to be so intellectually and morally trivial that it has to be classified as just a form of light entertainment, and popular culture always tires of its diversions sooner or later and moves on to other, equally ephemeral toys.
Take, for instance, the recently published  50 Voices of Disbelief: Why We Are Atheists. Simple probability, surely, would seem to dictate that a collection of essays by fifty fairly intelligent and zealous atheists would contain at least one logically compelling, deeply informed, morally profound, or conceptually arresting argument for not believing in God. Certainly that was my hope in picking it up. Instead, I came away from the whole drab assemblage of preachments and preenings feeling rather as if I had just left a large banquet at which I had been made to dine entirely on crushed ice and water vapor.
To be fair, the shallowness is not evenly distributed. Some of the writers exhibit a measure of wholesome tentativeness in making their cases, and as a rule the quality of the essays is inversely proportionate to the air of authority their authors affect. For this reason, the philosophers—who are no better than their fellow contributors at reasoning, but who have better training in giving even specious arguments some appearance of systematic form—tend to come off as the most insufferable contributors. Nicholas Everitt and Stephen Law recycle the old (and incorrigibly impressionistic) argument that claims of God’s omnipotence seem incompatible with claims of his goodness. Michael Tooley does not like the picture of Jesus that emerges from the gospels, at least as he reads them. Christine Overall notes that her prayers as a child were never answered; ergo, there is no God. A. C. Grayling flings a few of his favorite papier-mâché caricatures around. Laura Purdy mistakes hysterical fear of the religious right for a rational argument. Graham Oppy simply provides a précis of his personal creed, which I assume is supposed to be compelling because its paragraphs are numbered. J. J. C. Smart finds miracles scientifically implausible (gosh, who could have seen that coming?). And so on. Adèle Mercier comes closest to making an interesting argument—that believers do not really believe what they think they believe—but it soon collapses under the weight of its own baseless presuppositions.
The scientists fare almost as poorly. Among these, Victor Stenger is the most recklessly self-confident, but his inability to differentiate the physical distinction between something and nothing (in the sense of “not anything as such”) from the logical distinction between existence and nonexistence renders his argument empty. The contributors drawn from other fields offer nothing better. The Amazing Randi, being a magician, knows that there is quite a lot of credulity out there. The historian of science Michael Shermer notes that there are many, many different and even contradictory systems of belief. The journalist Emma Tom had a psychotic scripture teacher when she was a girl.  Et, as they say,  cetera. The whole project probably reaches its  reductio ad absurdum when the science-fiction writer Sean Williams explains that he learned to reject supernaturalism in large part from having grown up watching  Doctor Who.
So it goes. In the end the book as a whole adds up to absolutely nothing—as, frankly, do all the books in this new genre—and I have to say I find this all somewhat depressing. For one thing, it seems obvious to me that the peculiar vapidity of New Atheist literature is simply a reflection of the more general vapidity of all public religious discourse these days, believing and unbelieving alike. In part, of course, this is because the modern media encourage only fragmentary, sloganeering, and emotive debates, but it is also because centuries of the incremental secularization of society have left us with a shared grammar that is perhaps no longer adequate to the kinds of claims that either reflective faith or reflective faithlessness makes.
The principal source of my melancholy, however, is my firm conviction that today’s most obstreperous infidels lack the courage, moral intelligence, and thoughtfulness of their forefathers in faithlessness. What I find chiefly offensive about them is not that they are skeptics or atheists; rather, it is that they are not skeptics at all and have purchased their atheism cheaply, with the sort of boorish arrogance that might make a man believe himself a great strategist because his tanks overwhelmed a town of unarmed peasants, or a great lover because he can afford the price of admission to a brothel. So long as one can choose one’s conquests in advance, taking always the paths of least resistance, one can always imagine oneself a Napoleon or a Casanova (and even better: the one without a Waterloo, the other without the clap).
But how long can any soul delight in victories of that sort? And how long should we waste our time with the sheer banality of the New Atheists—with, that is, their childishly Manichean view of history, their lack of any tragic sense, their indifference to the cultural contingency of moral “truths,” their wanton incuriosity, their vague babblings about “religion” in the abstract, and their absurd optimism regarding the future they long for?
I am not—honestly, I am not—simply being dismissive here. The utter inconsequentiality of contemporary atheism is a social and spiritual catastrophe. Something splendid and irreplaceable has taken leave of our culture—some great moral and intellectual capacity that once inspired the more heroic expressions of belief and unbelief alike. Skepticism and atheism are, at least in their highest manifestations, noble, precious, and even necessary traditions, and even the most fervent of believers should acknowledge that both are often inspired by a profound moral alarm at evil and suffering, at the corruption of religious institutions, at psychological terrorism, at injustices either prompted or abetted by religious doctrines, at arid dogmatisms and inane fideisms, and at worldly power wielded in the name of otherworldly goods. In the best kinds of unbelief, there is something of the moral grandeur of the prophets—a deep and admirable abhorrence of those vicious idolatries that enslave minds and justify our worst cruelties.
But a true skeptic is also someone who understands that an attitude of critical suspicion is quite different from the glib abandonment of one vision of absolute truth for another—say, fundamentalist Christianity for fundamentalist materialism or something vaguely and inaccurately called “humanism.” Hume, for instance, never traded one dogmatism for another, or one facile certitude for another. He understood how radical were the implications of the skepticism he recommended, and how they struck at the foundations not only of unthinking faith, but of proud rationality as well.
A truly profound atheist is someone who has taken the trouble to understand, in its most sophisticated forms, the belief he or she rejects, and to understand the consequences of that rejection. Among the New Atheists, there is no one of whom this can be said, and the movement as a whole has yet to produce a single book or essay that is anything more than an insipidly doctrinaire and appallingly ignorant diatribe.
If that seems a harsh judgment, I can only say that I have arrived at it honestly. In the course of writing a book published just this last year, I dutifully acquainted myself not only with all the recent New Atheist bestsellers, but also with a whole constellation of other texts in the same line, and I did so, I believe, without prejudice. No matter how patiently I read, though, and no matter how Herculean the efforts I made at sympathy, I simply could not find many intellectually serious arguments in their pages, and I came finally to believe that their authors were not much concerned to make any.
What I did take away from the experience was a fairly good sense of the real scope and ambition of the New Atheist project. I came to realize that the whole enterprise, when purged of its hugely preponderant alloy of sanctimonious bombast, is reducible to only a handful of arguments, most of which consist in simple category mistakes or the kind of historical oversimplifications that are either demonstrably false or irrelevantly true. And arguments of that sort are easily dismissed, if one is hardy enough to go on pointing out the obvious with sufficient indefatigability.

The only points at which the New Atheists seem to invite any serious intellectual engagement are those at which they try to demonstrate that all the traditional metaphysical arguments for the reality of God fail. At least, this  should be their most powerful line of critique, and no doubt would be if any of them could demonstrate a respectable understanding of those traditional metaphysical arguments, as well as an ability to refute them. Curiously enough, however, not even the trained philosophers among them seem able to do this. And this is, as far as I can tell, as much a result of indolence as of philosophical ineptitude. The insouciance with which, for instance, Daniel Dennett tends to approach such matters is so torpid as to verge on the reptilian. He scarcely bothers even to get the traditional “theistic” arguments right, and the few ripostes he ventures are often the ones most easily discredited.

As a rule, the New Atheists’ concept of God is simply that of some very immense and powerful being among other beings, who serves as the first cause of all other things only in the sense that he is prior to and larger than all other causes. That is, the New Atheists are concerned with the sort of God believed in by seventeenth- and eighteenth-century Deists. Dawkins, for instance, even cites with approval the old village atheist’s cavil that omniscience and omnipotence are incompatible because a God who infallibly foresaw the future would be impotent to change it—as though Christians, Jews, Muslims, Hindus, Sikhs, and so forth understood God simply as some temporal being of interminable duration who knows things as we do, as external objects of cognition, mediated to him under the conditions of space and time.
Thus, the New Atheists’ favorite argument turns out to be just a version of the old argument from infinite regress: If you try to explain the existence of the universe by asserting God created it, you have solved nothing because then you are obliged to say where God came from, and so on  ad infinitum, one turtle after another, all the way down. This is a line of attack with a long pedigree, admittedly. John Stuart Mill learned it at his father’s knee. Bertrand Russell thought it more than sufficient to put paid to the whole God issue once and for all. Dennett thinks it as unanswerable today as when Hume first advanced it—although, as a professed admirer of Hume, he might have noticed that Hume quite explicitly treats it as a formidable objection only to the God of Deism, not to the God of “traditional metaphysics.” In truth, though, there could hardly be a weaker argument. To use a feeble analogy, it is rather like asserting that it is inadequate to say that light is the cause of illumination because one is then obliged to say what it is that illuminates the light, and so on  ad infinitum.
The most venerable metaphysical claims about God do not simply shift priority from one kind of thing (say, a teacup or the universe) to another thing that just happens to be much bigger and come much earlier (some discrete, very large gentleman who preexists teacups and universes alike). These claims start, rather, from the fairly elementary observation that nothing contingent, composite, finite, temporal, complex, and mutable can account for its own existence, and that even an infinite series of such things can never be the source or ground of its own being, but must depend on some source of actuality beyond itself. Thus, abstracting from the universal conditions of contingency, one very well may (and perhaps must) conclude that all things are sustained in being by an  absolute plenitude of actuality, whose very essence is being as such: not a “supreme being,” not another  thing within or alongside the universe, but the infinite act of being itself, the one eternal and transcendent source of all existence and knowledge, in which all finite being participates.
It is immaterial whether one is wholly convinced by such reasoning. Even its most ardent proponents would have to acknowledge that it is an almost entirely negative deduction, obedient only to something like Sherlock Holmes’ maxim that “when you have eliminated the impossible, whatever remains,  however improbable, must be the truth.” It certainly says nearly nothing about who or what God  is.
But such reasoning is also certainly  not subject to the objection from infinite regress. It is not logically requisite for anyone, on observing that contingent reality must depend on absolute reality, to say then what the absolute depends on or, on asserting the participation of finite beings in infinite being, further to explain what it is that makes being to be. Other arguments are called for, as Hume knew. And only a complete failure to grasp the most basic philosophical terms of the conversation could prompt this strange inversion of logic, by which the argument from infinite regress—traditionally and correctly regarded as the most powerful objection to pure materialism—is now treated as an irrefutable argument against belief in God.
But something worse than mere misunderstanding lies at the base of Dawkins’ own special version of the argument from infinite regress—a version in which he takes a pride of almost maternal fierceness. Any “being,” he asserts, capable of exercising total control over the universe would have to be an extremely complex being, and because we know that complex beings must evolve from simpler beings and that the probability of a being as complex as that evolving is vanishingly minute, it is almost certain that no God exists.  Q.E.D. But, of course, this scarcely rises to the level of nonsense. We can all happily concede that no complex, ubiquitous, omniscient, and omnipotent superbeing, inhabiting the physical cosmos and subject to the rules of evolution, exists. But who has ever suggested the contrary?
Numerous attempts have been made, by the way, to apprise Dawkins of what the traditional definition of divine simplicity implies, and of how it logically follows from the very idea of transcendence, and to explain to him what it means to speak of God as the transcendent fullness of actuality, and how this differs in kind from talk of quantitative degrees of composite complexity. But all the evidence suggests that Dawkins has never understood the point being made, and it is his unfortunate habit contemptuously to dismiss as meaningless concepts whose meanings elude him. Frankly, going solely on the record of his published work, it would be rash to assume that Dawkins has ever learned how to reason his way to the end of a simple syllogism.
To appreciate the true spirit of the New Atheism, however, and to take proper measure of its intellectual depth, one really has to turn to Christopher Hitchens. Admittedly, he is the most egregiously slapdash of the New Atheists, as well as (not coincidentally) the most entertaining, but I take this as proof that he is also the least self-deluding. His God Is Not Great shows no sign whatsoever that he ever intended anything other than a rollicking burlesque, without so much as a pretense of logical order or scholarly rigor. His sporadic forays into philosophical argument suggest not only that he has sailed into unfamiliar waters, but also that he is simply not very interested in any of it. His occasional observations on Hume and Kant make it obvious that he has not really read either very closely. He apparently believes that Nietzsche, in announcing the death of God, literally meant to suggest that the supreme being named God had somehow met his demise. The title of one of the chapters in  God Is Not Great is “The Metaphysical Claims of Religion Are False,” but nowhere in that chapter does Hitchens actually say what those claims or their flaws are.
On matters of simple historical and textual fact, moreover, Hitchens’ book is so extraordinarily crowded with errors that one soon gives up counting them. Just to skim a few off the surface: He speaks of the ethos of Dietrich Bonhoeffer as “an admirable but nebulous humanism,” which is roughly on a par with saying that Gandhi was an apostle of the ruthless conquest and spoliation of weaker peoples. He conflates the histories of the first and fourth crusades. He repeats as fact the long discredited myth that Christians destroyed the works of Aristotle and Lucretius, or systematically burned the books of pagan antiquity, which is the very opposite of what did happen. He speaks of the traditional hostility of “religion” (whatever that may be) to medicine, despite the monastic origins of the modern hospital and the involvement of Christian missions in medical research and medical care from the fourth century to the present. He tells us that  countless lives were lost in the early centuries of the Church over disputes regarding which gospels were legitimate (the actual number of lives lost is zero). He asserts that Myles Coverdale and John Wycliffe were burned alive at the stake, although both men died of natural causes. He knows that the last twelve verses of Mark 16 are a late addition to the text, but he imagines this means that the entire account of the Resurrection is as well. He informs us that it is well known that Augustine was fond of the myth of the Wandering Jew, though Augustine died eight centuries before the legend was invented. And so on and so on (and so on).
In the end, though, all of this might be tolerated if Hitchens’ book exhibited some rough semblance of a rational argument. After all, there really is a great deal to despise in the history of religion, even if Hitchens gets almost all the particular details extravagantly wrong. To be perfectly honest, however, I cannot tell what Hitchens’ central argument is. It is not even clear what he understands religion to be. For instance, he denounces female circumcision, commendably enough, but what—pray tell—has that got to do with religion? Clitoridectomy is a widespread cultural tradition of sub-Saharan Africa, but it belongs to no particular creed. Even more oddly, he takes indignant note of the plight of young Indian brides brutalized and occasionally murdered on account of insufficient dowries. We all, no doubt, share his horror, but what the hell is his point?
As best I can tell, Hitchens’ case against faith consists mostly in a series of anecdotal enthymemes—that is to say, syllogisms of which one premise has been suppressed. Unfortunately, in each case it turns out to be the major premise that is missing, so it is hard to guess what links the minor premise to the conclusion. One need only attempt to write out some of his arguments in traditional syllogistic style to see the difficulty:
Major Premise: [omitted]
Minor Premise: Evelyn Waugh was always something of a bastard, and his Catholic chauvinism often made him even worse.
Conclusion: “Religion” is evil.
Or:
Major Premise: [omitted]
Minor Premise: There are many bad men who are Buddhists.
Conclusion: All religious claims are false.
Or:
Major Premise: [omitted]
Minor Premise: Timothy Dwight opposed smallpox vaccinations.
Conclusion: There is no God.
One could, I imagine, counter with a series of contrary enthymemes. Perhaps:
Major Premise: [omitted]
Minor Premise: Early Christians built hospitals.
Conclusion: “Religion” is a good thing.
Or:
Major Premise: [omitted]
Minor Premise: Medieval scriptoria saved much of the literature of classical antiquity from total eclipse.
Conclusion: All religious claims are true.
Or:
Major Premise: [omitted]
Minor Premise: George Bernard Shaw opposed smallpox vaccinations.
Conclusion: There is a God.
But this appears to get us nowhere. And, in the end, I doubt it matters.
The only really effective antidote to the dreariness of reading the New Atheists, it seems to me, is rereading Nietzsche. How much more immediate and troubling the force of his protest against Christianity seems when compared to theirs, even more than a century after his death. Perhaps his intellectual courage—his willingness to confront the implications of his renunciation of the Christian story of truth and the transcendent good without evasions or retreats—is rather a lot to ask of any other thinker, but it does rather make the atheist  chic of today look fairly craven by comparison.
Above all, Nietzsche understood how immense the consequences of the rise of Christianity had been, and how immense the consequences of its decline would be as well, and had the intelligence to know he could not fall back on polite moral certitudes to which he no longer had any right. Just as the Christian revolution created a new sensibility by inverting many of the highest values of the pagan past, so the decline of Christianity, Nietzsche knew, portends another, perhaps equally catastrophic shift in moral and cultural consciousness. His famous fable in  The Gay Science of the madman who announces God’s death is anything but a hymn of atheist triumphalism. In fact, the madman despairs of the mere atheists—those who merely do not believe—to whom he addresses his terrible proclamation. In their moral contentment, their ease of conscience, he sees an essential oafishness; they do not dread the death of God because they do not grasp that humanity’s heroic and insane act of repudiation has sponged away the horizon, torn down the heavens, left us with only the uncertain resources of our will with which to combat the infinity of meaninglessness that the universe now threatens to become.
Because he understood the nature of what had happened when Christianity entered history with the annunciation of the death of God on the cross, and the elevation of a Jewish peasant above all gods, Nietzsche understood also that the passing of Christian faith permits no return to pagan naivete, and he knew that this monstrous inversion of values created within us a conscience that the older order could never have incubated. He understood also that the death of God beyond us is the death of the human as such within us. If we are, after all, nothing but the fortuitous effects of physical causes, then the will is bound to no rational measure but itself, and who can imagine what sort of world will spring up from so unprecedented and so vertiginously uncertain a vision of reality?
For Nietzsche, therefore, the future that lies before us must be decided, and decided between only two possible paths: a final nihilism, which aspires to nothing beyond the momentary consolations of material contentment, or some great feat of creative will, inspired by a new and truly worldly mythos powerful enough to replace the old and discredited mythos of the Christian revolution (for him, of course, this meant the myth of the  Übermensch).
Perhaps; perhaps not. Where Nietzsche was almost certainly correct, however, was in recognizing that mere formal atheism was not yet the same thing as true unbelief. As he writes in  The Gay Science, “Once the Buddha was dead, people displayed his shadow for centuries afterwards in a cave, an immense and dreadful shadow. God is dead: —but as the human race is constituted, there will perhaps be caves for millennia yet where people will display his shadow. And we—we have yet to overcome his shadow!” It may appear that Nietzsche is here referring to “persons of faith”—those poor souls who continue to make their placid, bovine trek to church every week to worship a God who passed away long ago—but that is not his meaning.
He is referring principally to those who think they have eluded God simply by ceasing to believe in his existence. For Nietzsche, “scientism”—the belief that the modern scientific method is the only avenue of truth, one capable of providing moral truth or moral meaning—is the worst dogmatism yet, and the most pathetic of all metaphysical nostalgias. And it is, in his view, precisely men like the New Atheists, clinging as they do to those tenuous vestiges of Christian morality that they have absurdly denominated “humanism,” who shelter themselves in caves and venerate shadows. As they do not understand the past, or the nature of the spiritual revolution that has come and now gone for Western humanity, so they cannot begin to understand the peril of the future.
If I were to choose from among the New Atheists a single figure who to my mind epitomizes the spiritual chasm that separates Nietzsche’s unbelief from theirs, I think it would be the philosopher and essayist A. C. Grayling. For a short time I entertained the misguided hope that he might produce an atheist manifesto somewhat richer than the others currently on offer. Unfortunately, all his efforts in that direction suffer from the same defects as those of his fellows: the historical errors, the sententious moralism, the glib sophistry. Their great virtue, however, is that they are mercifully short. One essay of his in particular, called “Religion and Reason,” can be read in a matter of minutes and provides an almost perfect distillation of the whole New Atheist project.
The essay is even, at least momentarily, interesting. Couched at one juncture among its various arguments (all of which are pretty poor), there is something resembling a cogent point. Among the defenses of Christianity an apologist might adduce, says Grayling, would be a purely aesthetic cultural argument: But for Christianity, there would be no Renaissance art—no Annunciations or Madonnas—and would we not all be much the poorer if that were so? But, in fact, no, counters Grayling; we might rather profit from a far greater number of canvasses devoted to the lovely mythical themes of classical antiquity, and only a macabre sensibility could fail to see that “an Aphrodite emerging from the Paphian foam is an infinitely more life-enhancing image than a Deposition from the Cross.” Here Grayling almost achieves a Nietzschean moment of moral clarity.
Ignoring that leaden and almost perfectly ductile phrase “life-enhancing,” I, too—red of blood and rude of health—would have to say I generally prefer the sight of nubile beauty to that of a murdered man’s shattered corpse. The question of whether Grayling might be accused of a certain deficiency of tragic sense can be deferred here. But perhaps he would have done well, in choosing this comparison, to have reflected on the sheer strangeness, and the significance, of the historical and cultural changes that made it possible in the first place for the death of a common man at the hands of a duly appointed legal authority to become the captivating center of an entire civilization’s moral and aesthetic contemplations—and for the deaths of all common men and women perhaps to be invested thereby with a gravity that the ancient order would never have accorded them.
Here, displayed with an altogether elegant incomprehensibility in Grayling’s casual juxtaposition of the sea-born goddess and the crucified God (who is a crucified man), one catches a glimpse of the enigma of the Christian event, which Nietzsche understood and Grayling does not: the lightning bolt that broke from the cloudless sky of pagan antiquity, the long revolution that overturned the hierarchies of heaven and earth alike. One does not have to believe any of it, of course—the Christian story, its moral claims, its metaphysical systems, and so forth. But anyone who chooses to lament that event should also be willing, first, to see this image of the God-man, broken at the foot of the cross, for what it is, in the full mystery of its historical contingency, spiritual pathos, and moral novelty: that tender agony of the soul that finds the glory of God in the most abject and defeated of human forms. Only if one has succeeded in doing this can it be of any significance if one still, then, elects to turn away.
David Hart’s most recent book is Atheist Delusions: The Christian Revolution and Its Fashionable Enemies.

Maradona

Someone's wrong. Someone is seeing the wrong picture, I believe.

Why all this fuss around Maradona? He was a football player who did what he did in a match; that is to say, he won by cheating, in front of the whole world. More than once...

And now the world recognizes that he was "a great player"? No! As a matter of fact he was a junkie, a liar, a fraud!

But the press - always the press... - must sell, must survive.

Sorry, not with me, not with people who think for themselves, who have well-structured ethical concepts.

Is this the example some adults want for young people, for their children?


Reflection - Derek Thompson (The Atlantic)

Get Ready for the Great Urban Comeback

Visionary responses to catastrophes have changed city life for the better.

Illustration by Mark Harris*

On December 16, 1835, New York’s rivers turned to ice, and Lower Manhattan went up in flames. Smoke had first appeared curling through the windows of a five-story warehouse near the southern tip of Manhattan. Icy gales blew embers into nearby buildings, and within hours the central commercial district had become an urban bonfire visible more than 100 miles away.

Firefighters were helpless. Wells and cisterns held little free-flowing water, and the rivers were frozen solid on a night when temperatures plunged, by one account, to 17 degrees below zero. The fire was contained only after Mayor Cornelius Lawrence ordered city officials to blow up structures surrounding it, starving the flames of fuel.

A new Manhattan would grow from the rubble—made of stone rather than wood, with wider streets and taller buildings. But the most important innovation lay outside the city. Forty-one miles to the north, New York officials acquired a large tract of land on both sides of the Croton River, in Westchester County. They built a dam on the river to create a 400-acre lake, and a system of underground tunnels to carry fresh water to every corner of New York City.

The engineering triumph known as the Croton Aqueduct opened in 1842. It gave firefighters an ample supply of free-flowing water, even in winter. More important, it brought clean drinking water to residents, who had suffered from one waterborne epidemic after another in previous years, and kick-started a revolution in hygiene. Over the next four decades, New York’s population quadrupled, to 1.2 million—the city was on its way to becoming a fully modern metropolis.

The 21st-century city is the child of catastrophe. The comforts and infrastructure we take for granted were born of age-old afflictions: fire, flood, pestilence. Our tall buildings, our subways, our subterranean conduits, our systems for bringing water in and taking it away, our building codes and public-health regulations—all were forged in the aftermath of urban disasters by civic leaders and citizen visionaries.

Natural and man-made disasters have shaped our greatest cities, and our ideas about human progress, for millennia. Once Rome’s ancient aqueducts were no longer functional—damaged first by invaders and then ravaged by time—the city’s population dwindled to a few tens of thousands, reviving only during the Renaissance, when engineers restored the flow of water. The Lisbon earthquake of 1755 proved so devastating that it caused Enlightenment philosophers such as Jean-Jacques Rousseau to question the very merits of urban civilization and call for a return to the natural world. But it also led to the birth of earthquake engineering, which has evolved to make San Francisco, Tokyo, and countless other cities more resilient.

America’s fractious and tragic response to the COVID-19 pandemic has made the nation look more like a failed state than like the richest country in world history. Doom-scrolling through morbid headlines in 2020, one could easily believe that we have lost our capacity for effective crisis response. And maybe we have. But a major crisis has a way of exposing what is broken and giving a new generation of leaders a chance to build something better. Sometimes the ramifications of their choices are wider than one might think.

The Invention of Public Health

As Charles Dickens famously described, British cities in the early years of the Industrial Revolution were grim and pestilential. London, Birmingham, Manchester, Leeds—they didn’t suffer from individual epidemics so much as from overlapping, never-ending waves of disease: influenza, typhoid, typhus, tuberculosis. They were also filled with human waste. It piled up in basements, spilled from gutters, rotted in the streets, and fouled rivers and canals. In Nottingham—the birthplace of the Luddite movement, which arose to protest textile automation—a typical gallon of river water contained 45 grams of solid effluent. Imagine a third of a cup of raw sewage in a gallon jug.

No outbreak during the industrial age shocked British society as much as the cholera epidemic in 1832. In communities of 100,000 people or more, average life expectancy at birth fell to as low as 26 years. In response, a young government official named Edwin Chadwick, a member of the new Poor Law Commission, conducted an inquiry into urban sanitation. A homely, dyspeptic, and brilliant protégé of the utilitarian philosopher Jeremy Bentham, Chadwick had farsighted ideas for government. They included shortening the workday, shifting spending from prisons to “preventive policing,” and establishing government pensions. With a team of researchers, Chadwick undertook one of the earliest public-health investigations in history—a hodgepodge of mapmaking, census-taking, and dumpster diving. They looked at sewers, dumps, and waterways. They interviewed police officers, factory inspectors, and others as they explored the relationship between city design and disease proliferation.

Illustration by Mark Harris; images from Oxford Science Archive / Getty; British Library / Alamy

The final report, titled “The Sanitary Conditions of the Labouring Population of Great Britain,” published in 1842, caused a revolution. Conventional wisdom at the time held that disease was largely the result of individual moral shortcomings. Chadwick showed that disease arose from failures of the urban environment. Urban disease, he calculated, was creating more than 1 million new orphans in Britain each decade. The number of people who had died of poverty and disease in British cities in any given year in the 1830s, he found, was greater than the annual death toll of any military conflict in the empire’s history. The cholera outbreak was a major event that forced the British government to reckon with the costs of industrial capitalism. That reckoning would also change the way Western cities thought about the role of the state in ensuring public health.

The source of the cholera problem? All that filthy water. Chadwick recommended that the government improve drainage systems and create local councils to clear away refuse and “nuisance”—human and animal waste—from homes and streets. His investigation inspired two key pieces of national legislation, both passed in 1848: the Public Health Act and the Nuisances Removal and Diseases Prevention Act. A new national Board of Health kept the pressure on public authorities. The fruits of engineering (paved streets, clean water, sewage disposal) and of science (a better understanding of disease) led to healthier lives, and longer ones. Life expectancy reached 40 in England and Wales in 1880, and exceeded 60 in 1940.

Chadwick’s legacy went beyond longevity statistics. Although he is not often mentioned in the same breath as Karl Marx or Friedrich Engels, his work was instrumental in pushing forward the progressive revolution in Western government. Health care and income support, which account for the majority of spending by almost every developed economy in the 21st century, are descendants of Chadwick’s report. David Rosner, a history and public-health professor at Columbia University, puts it simply: “If I had to think of one person who truly changed the world in response to an urban crisis, I would name Edwin Chadwick. His population-based approach to the epidemics of the 1830s developed a whole new way of thinking about disease in the next half century. He invented an entire ethos of public health in the West.”

Why We Have Skyscrapers

Everyone knows the story: On the night of October 8, 1871, a fire broke out in a barn owned by Patrick and Catherine O’Leary in southwest Chicago. Legend blames a cow tipping over a lantern. Whatever the cause, gusty winds drove the fire northeast, toward Lake Michigan. In the go-go, ramshackle era of 19th-century expansion, two-thirds of Chicago’s structures were built of timber, making the city perfect kindling. In the course of three days, the fire devoured 20,000 buildings. Three hundred people died. A third of the city was left without shelter. The entire business district—three square miles—was a wasteland.

On October 11, as the city smoldered, the Chicago Tribune published an editorial with an all-caps headline: CHEER UP. The newspaper went on: “In the midst of a calamity without parallel in the world’s history, looking upon the ashes of thirty years’ accumulations, the people of this once beautiful city have resolved that CHICAGO SHALL RISE AGAIN.” And, with astonishing speed, it did. By 1875, tourists arriving in Chicago looking for evidence of the fire complained that there was little to see. Within 20 years, Chicago’s population tripled, to 1 million. And by the end of the century, the fire-flattened business district sprouted scores of buildings taller than you could find anywhere else in the world. Their unprecedented height earned these structures a new name: skyscraper.

Illustration by Mark Harris; images from The Reading Room / Alamy; Thomas Kelly / William Shaw / Library of Congress; Chicago Historical Society / Northwestern University

The Chicago fire enabled the rise of skyscrapers in three major ways. First, it made land available for new buildings. The fire may have destroyed the business district, but the railway system remained intact, creating ideal conditions for new construction. So much capital flowed into Chicago that downtown real-estate prices actually rose in the first 12 months after the fire. “The 1871 fire wiped out the rich business heart of the city, and so there was lots of money and motivation to rebuild immediately,” Julius L. Jones, an assistant curator at the Chicago History Museum, told me. “It might have been different if the fire had just wiped out poor areas and left the banks and business offices alone.” What’s more, he said, the city used the debris from the fire to extend the shoreline into Lake Michigan and create more land.

Second, a combination of regulatory and technological developments changed what Chicago was made of. Insurance companies and city governments mandated fire-resistant construction. At first, Chicago rebuilt with brick, stone, iron. But over time, the urge to create a fireproof city in an environment of escalating real-estate prices pushed architects and builders to experiment with steel, a material made newly affordable by recent innovations. Steel-skeleton frames not only offered more protection from fire; they also supported more weight, allowing buildings to grow taller.

Third, and most important, post-fire reconstruction brought together a cluster of young architects who ultimately competed with one another to build higher and higher. In the simplest rendition of this story, the visionary architect William Le Baron Jenney masterminded the construction of what is considered history’s first skyscraper, the 138-foot-tall Home Insurance Building, which opened in 1885. But the skyscraper’s invention was a team effort, with Jenney serving as a kind of player-coach. In 1882, Jenney’s apprentice, Daniel Burnham, had collaborated with another architect, John Root, to design the 130-foot-tall Montauk Building, which was the first high steel building to open in Chicago. Another Jenney protégé, Louis Sullivan, along with Dankmar Adler, designed the 135-foot-tall Wainwright Building, the first skyscraper in St. Louis. Years later, Ayn Rand would base The Fountainhead on a fictionalized version of Sullivan and his protégé, Frank Lloyd Wright. It is a false narrative: “Sullivan and Wright are depicted as lone eagles, paragons of rugged individualism,” Edward Glaeser wrote in Triumph of the City. “They weren’t. They were great architects deeply enmeshed in an urban chain of innovation.”

It is impossible to know just how much cities everywhere have benefited from Chicago’s successful experiments in steel-skeleton construction. By enabling developers to add great amounts of floor space without needing additional ground area, the skyscraper has encouraged density. Finding ways to safely fit more people into cities has led to a faster pace of innovation, greater retail experimentation, and more opportunities for middle- and low-income families to live near business hubs. People in dense areas also own fewer cars and burn hundreds of gallons less gasoline each year than people in nonurban areas. Ecologically and economically, and in terms of equity and opportunity, the skyscraper, forged in the architectural milieu of post-fire Chicago, is one of the most triumphant inventions in urban history.

Taming the Steampunk Jungle

March 10, 1888, was a gorgeous Saturday in New York City. Walt Whitman, the staff poet at The New York Herald, used the weekend to mark the end of winter: “Forth from its sunny nook of shelter’d grass—innocent, golden, calm as the dawn / The spring’s first dandelion shows its trustful face.” On Saturday evening, the city’s meteorologist, known lovingly as the “weather prophet” to local newspapers, predicted more fair weather followed by a spot of rain. Then the weather prophet went home and took Sunday off.

Meanwhile, two storms converged. From the Gulf of Mexico, a shelf of dark clouds soaked with moisture crept north. And from the Great Lakes, a cold front that had already smothered Minnesota with snow rolled east. The fronts collided over New York City.

Illustration by Mark Harris; images from F Atkinson / Getty; C. H. Jordan / Library of Congress; NOAA

Residents awoke on Monday, the day Whitman’s poem was published, to the worst blizzard in U.S. history. By Thursday morning, the storm had dumped more than 50 inches of snow in parts of the Northeast. Snowdrifts were blown into formations 50 feet high. Food deliveries were suspended, and mothers ran short on milk. Hundreds died of exposure and starvation. Like the Lisbon earthquake more than a century before, the blizzard of 1888 was not just a natural disaster; it was also a psychological blow. The great machine of New York seized up and went silent. Its nascent electrical system failed. Industries stopped operating. “The elevated railways service broke down completely,” the New York Weekly Tribune reported on March 14:

The street cars were valueless; the suburban railways were blocked; telegraph communications were cut; the Exchanges did nothing; the Mayor didn’t visit his office; the city was left to run itself; chaos reigned.

The New York now buried under snow had been a steampunk jungle. Elevated trains clang-clanged through neighborhoods; along the streets, electrical wires looped and drooped from thousands of poles. Yet 20 years after the storm, the trains and wires had mostly vanished—at least so far as anyone aboveground could see. To protect its most important elements of infrastructure from the weather, New York realized, it had to put them underground.

First, New York buried the wires. In early 1889, telegraph, telephone, and utility companies were given 90 days to get rid of all their visible infrastructure. New York’s industrial forest of utility poles was cleared, allowing some residents to see the street outside their windows for the first time. Underground conduits proved cheaper to maintain, and they could fit more bandwidth, which ultimately meant more telephones and more electricity.

Second, and even more important, New York buried its elevated trains, creating the country’s most famous subway system. “An underground rapid transit system would have done what the elevated trains could not do,” The New York Times had written in the days after the blizzard, blasting “the inadequacy of the elevated railroad system to such an emergency.” Even without a blizzard, as Doug Most details in The Race Underground, New York’s streets were becoming impassable scrums of pedestrians, trolleys, horses, and carriages. The year before the blizzard, the elevated rails saw an increase of 13 million passengers. The need for some alternative—and likely subterranean—form of transportation was obvious. London had opened the first part of its subway system several decades earlier. In New York, the blizzard was the trigger.

“New York is built on disasters,” Mitchell L. Moss, a professor of urban policy and planning at NYU, told me recently. “There’s the 1835 fire, and the construction of the Croton Aqueduct. There’s the 1888 blizzard, and the construction of the subway. There’s the Triangle Shirtwaist fire, which killed 146 workers in Manhattan. Frances Perkins would say, ‘The New Deal started with the factory fire,’ because it was the disaster that led to a New York State commission on labor conditions, which in turn led to the eight-hour workday. In all of these physical disasters, New York City has responded by changing for the better.”

In October 1904, after years of political fights, contractor negotiations, and engineering challenges, New York’s first subway line opened. In a lightning-bolt shape, it ran north from City Hall to Grand Central Station, hooked west along 42nd Street, and then turned north again at Times Square, running all the way to 145th Street and Broadway, in Harlem. Operated by the Interborough Rapid Transit Company, the 28-stop subway line was known as the IRT. Just months later, New York faced a crucial test: another massive winter storm. As the blizzard raged, the IRT superintendent reported “446,000 passengers transported,” a record daily high achieved “without a single mishap.”

[Illustration by Mark Harris; images from Hulton Archive / Bettmann / Smith Collection / Gado / Getty; Interborough Rapid Transit Company]

Finding Our Inner Chadwick

Not all calamities summon forth the better angels of our nature. A complete survey of urban disasters might show something closer to the opposite: “Status-quo bias” can prove more powerful than the need for urgent change. As U.S. manufacturing jobs declined in the latter half of the 20th century, cities like Detroit and Youngstown, Ohio, fell into disrepair, as leaders failed to anticipate what the transition to a postindustrial future would require. When business districts are destroyed—as in Chicago in 1871—an influx of capital may save the day. But when the urban victims are poor or minorities, post-crisis rebuilding can be slow, if it happens at all. Hurricane Katrina flooded New Orleans in 2005 and displaced countless low-income residents, many of whom never returned. Some cataclysms are not so much about bricks and mortar as they are about inequality and injustice. “Natural disasters on their own don’t do anything to stem injustice,” observes Keeanga-Yamahtta Taylor, a professor of African American studies at Princeton. “Without social movements or social upheaval, the recognition of inequities never progresses beyond an acknowledgment that ‘We have a long way to go.’ ”

Still, catastrophes can fix our minds on a common crisis, pull down political and regulatory barriers that stand in the way of progress, and spur technological leaps, bringing talent and money together to solve big problems. “Disasters reveal problems that already existed, and in doing so, create an opportunity to go back and do what you should have done the first time,” Mitchell Moss said. New York City didn’t have to suffer a devastating fire in 1835 to know that it needed a freshwater source. Nonetheless, when Lower Manhattan burned, city leaders were persuaded to act.

Normal times do not offer a convenient news peg for slow-rolling catastrophes. When we look at the world around us—at outdated or crumbling infrastructure, at inadequate health care, at racism and poverty—it is all too easy to cultivate an attitude of small-minded resignation: This is just the way it has always been. Calamity can stir us from the trance of complacency and force us to ask first-principle questions about the world: What is a community for? How is it put together? What are its basic needs? How should we provide them?

These are the questions we should be asking about our own world as we confront the coronavirus pandemic and think about what should come after. The most important changes following past catastrophes went beyond the catastrophe itself. They accounted fully for the problems that had been revealed, and conceived of solutions broadly. New York did not react to the blizzard of 1888 by stockpiling snow shovels. It created an entire infrastructure of subterranean power and transit that made the city cleaner, more equitable, and more efficient.

The response to COVID-19 could be similarly far-reaching. The greatest lesson of the outbreak may be that modern cities are inadequately designed to keep us safe, not only from coronaviruses, but from other forms of infectious disease and from environmental conditions, such as pollution (which contributes to illness) and overcrowding (which contributes to the spread of illness). What if we designed a city with a greater awareness of all threats to our health?

The responses could start with a guarantee of universal health care, whatever the specific mechanism. COVID-19 has shown that our survival is inextricably connected to the health of strangers. Because of unequal access to health care, among other reasons, many people—especially low-income and nonwhite Americans—have been disproportionately hard-hit by the pandemic. People with low incomes are more likely than others to live in multigenerational households, making pathways of transmission more varied. People with serious preexisting conditions have often lacked routine access to preventive care—and people with such conditions have experienced higher rates of mortality from COVID-19. When it comes to infectious diseases, a risk to anyone is a risk to everyone. Meanwhile, because of their size, density, and exposure to foreign travelers, cities initially bore the brunt of this pandemic. There is no reason to think the pattern will change. In an age of pandemics, universal health care is not just a safety net; it is a matter of national security.

City leaders could redesign cities to save lives in two ways. First, they could clamp down on automotive traffic. While that may seem far afield from the current pandemic, long-term exposure to pollution from cars and trucks causes more than 50,000 premature deaths a year in the United States, according to a 2013 study. Respiratory conditions aggravated by pollution can increase vulnerability to other illnesses, including infectious ones. The pandemic shutdowns have shown us what an alternative urban future might look like. Cities could remove most cars from downtown areas and give these streets back to the people. In the short term, this would serve our pandemic-fighting efforts by giving restaurants and bars more outdoor space. In the long term, it would transform cities for the better—adding significantly more room for walkers and bicycle lanes, and making the urban way of life more healthy and attractive.

Second, cities could fundamentally rethink the design and uses of modern buildings. Future pandemics caused by airborne viruses are inevitable—East Asia has already seen several this century—yet too many modern buildings achieve energy efficiency by sealing off outside air, thus creating the perfect petri dish for any disease that thrives in unventilated interiors. Local governments should update ventilation standards to make offices less dangerous. Further, as more Americans work remotely to avoid crowded trains and poorly ventilated offices, local governments should also encourage developers to turn vacant buildings into apartment complexes, through new zoning laws and tax credits. Converting empty offices into apartments would add more housing in rich cities with a shortage of affordable places to live, expand the tax base, and further reduce driving by letting more families make their homes downtown.

Altogether, this is a vision of a 21st-century city remade with public health in mind, achieving the neat trick of being both more populated and more capacious. An urban world with half as many cars would be a triumph. Indoor office and retail space would become less valuable, outdoor space would become more essential, and city streets would be reclaimed by the people.

“Right now, with COVID, we’re all putting our hopes in one thing—one cure, one vaccine—and it speaks to how narrow our vision of society has become,” says Rosner, the Columbia public-health historian. His hero, Chadwick, went further. He used an existential crisis to rewrite the rules of modern governance. He shaped our thinking about the state’s responsibility to the poor as much as he reshaped the modern city. We should hope that our response to the 2020 pandemic is Chadwickian in its capacity to help us see the preexisting injustices laid bare by this disease.

One day, when COVID-19 is a distant memory, a historian of urban catastrophe might observe, in reviewing the record, that human beings looked up, to the sky, after a fire; looked down, into the earth, after a blizzard; and at last looked around, at one another, after a plague.


This article appears in the October 2020 print edition with the headline “How Disaster Shaped the Modern City.”

* Illustration by Mark Harris; images from Interborough Rapid Transit Company; National Weather Service; Wiley & Putnam / Artokoloro / British Library / Alamy; Thomas Kelly / Library of Congress