Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"

or

H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Sunday, May 25, 2025

It Is Hard To Believe That Donald Trump Is Not The Worst Thing To Happen To The US This Century…

This appeared last week:

Trump’s war on Harvard is un-American

The trigger-happy firing range that is the present administration has put America’s universities squarely in the crosshairs.

The Trump administration told Harvard on Friday it can no longer enrol international students. Bloomberg

Simon Schama

May 23, 2025 – 3.51pm

So many enemies: spineless judges, moaners about due process; fake news merchants; the Fed; Canadians (nasty); Europeans, same (except for Italy and Hungary); environmental hoaxers; regulators of shower pressure; cancer-causing windmills; tariff-haters; Venezuelans; the Cheneys. But the worst of the lot? Not even close. Professors! Radical left lunatics, or those soft on them, which is the same thing. Let’s see how they like it when the money tap turns off.

The trigger-happy firing range that is the Trump administration has put America’s universities squarely in the crosshairs. The more liberal the faculty, the heavier the hit: billions in federal grants stripped from Harvard, hundreds of millions from other Ivy Leaguers. [On Friday, the administration told Harvard it can no longer enrol international students.]

The purported reason for going full Mr Potter on America’s great universities is antisemitism. Has the harassment and abuse of Jewish students been a serious problem, especially since October 7? Yes. Have anti-Zionist chants crossed a line into outright Jew-hatred? Absolutely. Are colleges doing something about it? Yes; grade of B+.

But does kneecapping science departments by choking off their research funding persuade River-to-Sea chanters to pipe down? Hardly. Coming to the aid of campus Jews was always a pretext. Forgive us if we doubt that subjecting higher education’s independence to an ideological purge, labelled “defence of the Jews”, will work as an antidote to antisemitism.

Connoisseurs of oxymorons might enjoy the imposition, on pain of financial strangulation, of “viewpoint diversity” on colleges deemed to have undergone “ideological capture”.

But anyone doubting that the “existential terror” described by Christopher Rufo, the zealot of the campaign against universities, as his goal has been the main point all along need only look at the closing speech of the National Conservatism Conference in November 2021, delivered by JD Vance. He was then campaigning, bankrolled to the tune of $US15 million ($23 million) by his former boss Peter Thiel, for an Ohio Senate seat.

Among the Republican audience, there might have been some inconveniently recalling Vance’s withering attack on the political and moral credentials of one Donald Trump. What could be better, then, by way of demonstrating his true conversion, than descanting on America’s “fundamentally corrupt” universities, institutions so irredeemably rotten that Vance had concluded it was necessary to abandon one of the cherished truisms of the American dream: a four-year college education?

“Ladies and gentlemen,” he warned, “we are giving our children over to our enemies, and it’s time we stop doing it.” All that happened in the grove of academe, Vance went on, was that students would “learn to hate their country and acquire a lot of debt in the process”.

His peroration, to which, he said, he had given much thought, would feature, for his mic drop, a pearl of wisdom from “the great prophet and statesman” Richard Milhous Nixon.

Speaking in December 1972 to Henry Kissinger (the most professorial member of his cabinet and sometime member of the Harvard faculty), Nixon had mused that “the professors are the enemy” – words Vance had clearly taken to heart. Their evil twin was, of course, the press. But Nixon returned to his mantra. “The professors are the enemy. Write that on a blackboard 100 times and never forget it.”

Which, evidently, Vance has not. But pinning the ills of America on a free press and a college education would have surprised the Founding Fathers, whose Declaration of Independence Trump will be commemorating next year, its 250th anniversary.

As the Founders saw it, the great driver of freedom was knowledge. Two decades before independence, the lawyer and essayist William Livingston insisted in a journal called The Independent Reflector that “knowledge among a people makes them free, enterprising and dauntless; but ignorance enslaves, emasculates and depresses them”.

Whatever differences arose between Washington, John Adams and Jefferson following independence, it was a shared truism of the governing class that the very existence of the US as a free republic was conditional on a well-informed citizenry.

Washington, whose first annual address to Congress in 1790 declared that “knowledge is in every country the surest basis of public happiness”, envisioned a national university in the capital that would rise above party factions in the ennobling pursuit of truth; though neither college nor partisan peace would be realised during his lifetime.

In 1779, Thomas Jefferson (who would make sure that his role as “Father of the University of Virginia” would be inscribed on his tombstone) championed a Bill for the More General Diffusion of Knowledge. Its purpose would be to “illuminate ... the minds of the people at large” – excluding, of course, women and the enslaved – “and more especially to give them knowledge of those facts ... [that] they may be enabled to know ambition under all its shapes”.

The 1780 Constitution of the Commonwealth of Massachusetts, primarily drafted by Adams, committed itself to “The Encouragement of Literature” so that “Harvard-College in Cambridge” would be the institution through which the diffusion of “wisdom and knowledge” ensured the health of the body politic.

As Richard D Brown’s important history The Strength of a People: The Idea of an Informed Citizenry in America, 1650-1870, points out, all of America’s first four presidents (including James Madison) assumed that the security of the republic depended on the “equation of virtue and knowledge”.

A century later, Calvin Coolidge might assert that “the chief business of the American people is business”, but a rich stream of ideas flowing from the learned optimism of the Founders, through the creation of land-grant colleges and the “brain trust” administrations of Franklin Roosevelt, assumed that professors were not the “enemy” but a resource that was indispensable for the good of the nation. The true enemy of American democracy was not professors, but ignorance.


This was by no means a universal view. For all his pride in the University of Virginia, Jefferson, who dedicated himself to a “crusade against ignorance”, lamented all the baseless slanders that came his way in the cacophony of politics. “So many falsehoods have been propagated,” he wrote, “that nothing is now believed and ... for want of intelligence they may rely on, [the people] are become lethargic and insensible.”

It would not be the last time that the defenders of empirically confirmed truth would find themselves on the back foot. One of the great books of American history, the Columbia history professor Richard Hofstadter’s Anti-Intellectualism in American Life, published in 1963, shortly after the Red Scare, chronicles the populist equation of highbrow with un-American. Hofstadter warns that, however tempting, the denigration of intellect ought not to be reduced to “eggheads and fatheads”.

For all the high-minded nostrums of the Founders, America’s sense of its calling in the world was at least as much shaped by Christian evangelism as Enlightenment reasoning. The sovereignty of the feeling heart would have its way over the reflecting mind.

Ralph Waldo Emerson could inspire the Phi Beta Kappa class at Harvard in 1837 by holding up “the true scholar” as “the only true master” who would “resist the vulgar prosperity that retrogrades ever to barbarism”.

But beyond Harvard Yard, multitudes would heed Billy Sunday, the early 20th-century revivalist preacher, when he warned that “thousands of college graduates are going as fast as they can straight to hell. If I had a million dollars I’d give $999,999 to the church and $1 to education ... When the word of God says one thing and scholarship says another, scholarship can go to hell.”

As the US flexed its military muscle and flowered economically, two more foes of excessive cerebration joined the fray. When, in 1828, Andrew Jackson soundly defeated the incumbent president John Quincy Adams, the son of the second president and himself a passionate believer in the federal government’s role in creating and funding scientific institutions, the Jacksonites attributed their victory to their hero being a man of action rather than a man of learning.

The choice, they said, was between “John Quincy Adams, who can write” and “Andrew Jackson, who can fight”.

Half a century later, in the gilded age of the robber barons, the enfeebling intellectual, all brain and no backbone, alienated from the instinctual life of regular folk, ignorant of practical business, and milquetoast in their patriotism, became a dependable attack line.

The great exception to being classified one way or the other was Theodore Roosevelt, overlooked by Donald Trump in favour of his peculiar fixation with William McKinley. But then Roosevelt saw his trust busting as a natural projection of the rough-riding man of action.

Brain trusts came into their own again when Teddy’s distant cousin Franklin recruited two Columbia academics – the economist Rexford Tugwell and Raymond Moley, a professor of law – to the White House. Their influence on presidential decision-making was pounced on by the Republican foes of the New Deal as another example of out-of-touch professors imposing alien socialism on the American people. While FDR was contemptuous of the caricature, it worked well enough to push Tugwell out of government in 1936.

But even before Pearl Harbor, the need for scientific know-how in fighting a likely war brought the professors back to the White House. In June 1941, in response to a proposal by the MIT engineer Vannevar Bush, Roosevelt established the Office of Scientific Research and Development. Bush was its head, reporting directly to the president.

The results of its work – mass production of penicillin for battlefield wounded, proximity fuses that transformed anti-aircraft fire, and, not least, the Manhattan Project – made an unarguable case for the partnership between government and university-based research science.

Though given only a minor role in Christopher Nolan’s Oppenheimer, Bush was famous enough to feature on the cover of Time magazine. Largely forgotten now beyond histories of science, he was one of the 20th century’s most remarkable visionaries, not least for his conviction that peacetime federal governments had an obligation to fund basic scientific research, liberated from the demands of commercial profit.

In the summer of 1945, Bush wrote two essays, both of which pointed to the future. The shorter piece, As We May Think, published in The Atlantic Monthly, was devoted to his invention, the “memex” (short for “memory expansion”): a machine that would transform the capture of information by storing an infinity of microfilmed documents while providing “associative trails” that foreshadowed, albeit in analogue, the hyperlinks of the world wide web.

The longer essay, Science, the Endless Frontier, was in effect a response to FDR, who in November 1944 had written to Bush that “new frontiers of the mind are before us” and asked him to think about how the momentum of wartime breakthroughs could be sustained in peacetime, in particular “the war of science against disease” and the “discovering and developing scientific talent in American youth”.

Bush argued that since colleges were “the wellsprings of knowledge and understanding”, they should be parties to research contracts with the government that would provide the necessary stability of funding for sustained experimental work. This would guarantee the “free play of free intellects working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown”.

The National Science Foundation, created by Congress and signed into law by President Harry S Truman in May 1950, owed much to Bush’s eloquence and vision – though its governance was not what he wanted. Instead of a director appointed by a board dominated by scientists, the head of the agency would be picked by the president. Nonetheless, Bush’s ambition to bring science from the wings “to the centre of the stage” had been achieved and, in the decades that followed, became spectacularly fruitful in world-changing breakthroughs and Nobel prizes.


It is this partnership of knowledge that is currently suffering brutal collateral damage from MAGA’s culture wars and the chainsaw massacre of expertise enacted by DOGE. The continuity of funding that Bush saw as a condition of experimental freedom has been smashed. The National Institutes of Health has already lost 1200 of its staff, with threats of many more layoffs. Good Friday was not so good for the more than 400 recipients of grants from the NSF who had their funding cancelled.

Tellingly, research projects dealing with disinformation, climate science or anything attempting to advance science in under-represented groups have been singled out for punishment. The cuts have been partly based on a Senate report last October in which, among other conclusions, the term “biodiversity” was misinterpreted to imply deference to the now taboo DEI.

The crudeness of these exercises in political conformity is exemplified by the freezing of invaluable peer-reviewed journals such as Emerging Infectious Diseases, CHEST, specialising in asthma and pulmonary disease research, and the Morbidity and Mortality Weekly Report.

A letter sent by Ed Martin jnr, the interim US attorney for the District of Columbia, to the New England Journal of Medicine demands answers to six questions to satisfy the authorities that “alternative views” are accommodated in its pages. But this is, in effect, DEI for Robert Kennedy jnr’s dubious version of science, and it ignores the strict peer-review standards to which all reputable journals adhere.

The demand for lockstep obedience to the party line is the purest Sovietism and it is exactly why autocracies of knowledge always end up damaged by their intellectual self-harm.

Science is not the only casualty of the war on knowledge. President Trump has let it be known that he wants no “negativity” in the Smithsonian Institution’s historical museums. History must now be mobilised in the service of national self-congratulation while the tanks roll down the Mall on the military parade the president is orchestrating for his 79th birthday treat.

But that is not what my trade’s founders had in mind at all. And one of them, a military man, Thucydides, wrote his History of the Peloponnesian War as an exercise in Athenian self-criticism, building as he does to the hubris-heavy catastrophe of the expedition to Syracuse.

In doing so, he laid down the rules of our professional code of practice. History is neither an exercise in vain self-glorification, nor is it penitential polemic; rather, and most simply, the retrieval of evidence in pursuit of the truth.

But though the Founders would all have read the Greeks, it’s a reasonable bet that the 47th president has passed them by. So instead of reflection on the significance of 1776, we will be getting a National Garden of American Heroes, some 250 statues that are by definition an entirely dumb personification of history.

Just this month the National Endowments for the Humanities and for the Arts have both been informed that 85 per cent of their grants have been cancelled and that funds supporting countless projects of research and artistic expression across America would be diverted to the garden to meet the bill, reportedly coming in at between $100,000 and $200,000 per statue.

Among Trump’s original pick list, there is one unlikely hero (at least for the president). Alphabetically sandwiched between Susan B Anthony and Louis Armstrong is Hannah Arendt, historian, philosopher and author of, among many other things, a powerful essay on Truth and Politics.

You must hope that her statue will feature the obligatory cigarette together with an ironic smile, knowing that she provides a plinth text that Donald Trump is bound to appreciate.

“Truth, though powerless and always defeated in a head-on clash with the powers that be, possesses a strength of its own: whatever those in power may contrive, they are unable to discover or invent a viable substitute for it. Persuasion and violence can destroy truth, but they cannot replace it.”

Here is the link:

https://www.afr.com/world/north-america/trump-s-war-on-harvard-is-un-american-20250523-p5m1on

All I can say is that I find it really terrifying to have the power of the US in the hands of such an ill-suited individual. He really is a global menace!

David.

AusHealthIT Poll Number 795 – Results – 25 May 2025.

Here are the results of the recent poll.

Do You See Any Value In Having A Blood Test To Diagnose Alzheimer's Disease Available?

Yes: 8 (47%)

No: 9 (53%)

I Have No Idea: 0 (0%)

Total No. Of Votes: 17
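For what it's worth, the rounding in those percentages checks out; a quick, purely illustrative Python sketch of the tally:

```python
# Tally from this week's poll: 17 votes split 8 / 9 / 0.
votes = {"Yes": 8, "No": 9, "I Have No Idea": 0}
total = sum(votes.values())  # 17 votes in all

# Round each option's share to the nearest whole percent.
percentages = {option: round(100 * n / total) for option, n in votes.items()}

for option, n in votes.items():
    print(f"{option}: {n} ({percentages[option]}%)")
```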

Clearly a pretty evenly split vote.

Any insights on the poll are welcome, as a comment, as usual!

Pathetic voter turnout – answer must have been too easy. 

0 of 17 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.

Friday, May 23, 2025

Technology Throws Up A Really Difficult Dilemma! To Track Or Not!

This appeared last week:

Opinion

When does tracking become stalking? Tell your kids. There’s one key warning sign

Julia Baird

Journalist, broadcaster, historian and author

May 16, 2025 — 7.30pm

“Don’t you ever want to, like, disappear?” I asked my 18-year-old daughter this week.

She looked at me, puzzled: “What do you mean?”

“Like drop off the map, not be tracked, have no one know where you are.” I was thinking of my own backpacking days, full of narrow escapes and all kinds of peril but so free. I’d call home collect, I wasn’t a blinking dot on anyone’s map, I just disappeared inside the belly of Europe. Dyed my hair red, did stupid things, danced in subterranean clubs, slept in boatsheds and on beaches, rode motorbikes in Greece, crashed motorbikes in Greece, all that stuff.

“No,” she shrugged.

We had been talking about the new research showing how young people are used to being surveilled. The theory is that it starts with their parents (hands up, all of you on Find My ... or Life360), then runs on to relationships, which can leave them vulnerable to controlling partners.

eSafety Commissioner Julie Inman Grant believes that tracking by well-intentioned parents wanting to keep kids safe has “anaesthetised young people to this whole idea of being monitored or surveilled”. Her research shows twice as many men (one in five) as women (one in 10) think constantly texting to find out what their partner is doing, and with whom, is a sign of care.

Griffith University PhD student Maria Atienzar-Prieto told the Herald this kind of coercive control has been normalised because “it starts in the family home”.

Every kid – and adult – needs to understand what coercive control looks like, how deceptive or dangerous it can be, and what they must do to avoid or leave those relationships.

But young people live in a world of surveillance now – they know where their mates are, often their parents, almost always their boyfriends or girlfriends. So, when we warn them, again, about the perils of the technology we use ourselves, we are in danger of inviting a lecture on parental hypocrisy.

Young people often understand the technology better than we do – how to deny access once you have broken up with someone; freeze or pause your location; hide away for a bit. (This, of course, can be a red flag in a relationship – is your partner cheating or just out of Wi-Fi range?)

The most important thing to teach kids is how to identify when affection – “Where are you? Did you get home OK? Are you hurt?” – turns to creeping control – “Where are you now? Why didn’t you return my calls? Why are you home so late? Who were you with?” on a loop.

Tracking kids can be reassuring if they are out late or travelling alone. But surely if we are to address how to protect kids from abusive lovers who follow them across digital maps, then we need to ask some hard questions of ourselves about when and how we need to track them too.

I only relatively recently linked up with my kids, now 16 and 18, on a tracking app. I’ve tried to be careful in using it for protection, not nosiness.

This is partly because of one cautionary Black Mirror episode – Arkangel – that has stuck with me for a long time. In it, a three-year-old girl, Sara, goes missing, and her mother Marie is terrified. After finding her, Marie, a single mother, decides to enrol Sara in a trial where a chip is planted in her brain. This technology allows Marie to see through her daughter’s eyes and hear through her ears, wherever she is, also providing location details. It was also designed to censor any stressful sights, preventing Sara from seeing anything that might elevate her pulse rate (a bit dangerous in case of emergency?).

The two clash over the use of the technology as Sara grows older, then it becomes intrusive – Marie eventually sees her daughter snorting cocaine and having sex with her drug-dealing boyfriend and freaks out. She begins to intervene, finding the boyfriend and telling him to back off, hiding contraceptives in smoothies. The episode ends with Sara hitchhiking out of town.

In some ways, the story is crudely told, but the idea that there are obvious dangers to too closely monitoring (thereby controlling) your teenage children has stayed with me. It was directed by Jodie Foster and written by Charlie Brooker, who said becoming a father made him sympathetic to helicopter parenting.

Executive producer Annabel Jones said it was based on chip implants pets have, telling Variety: “I have heard that there are some children that are getting them now, so this is just an exaggeration of that. We wanted to find a really good idea of how that could go terribly wrong.” Is the message then, “if you love it, let it go?” Does a child have a right to privacy, or does a parent’s need for reassurance on safety trump it in all circumstances? And do we all agree on what safe is? Who assesses vulnerability?

It’s difficult for parents to stem the tsunami of images, videos and other content that kids can see when they are still too little to understand; porn, violence, predatory invitations, in a world where sexual assault is obscenely common. It’s understandable to feel inadequate and fearful.

But a lucrative industry has grown up to cater to, to fuel, both parental anxiety and stalkers in the eight years since Arkangel aired – wearable tracking devices, spyware for remote reading and controlling of phones, smartwatches, even clothing with trackers. AirTags, smart tags, tiles.

It’s the secretive stuff that worries me most. If someone is being harassed by an ex or acquaintance, the phone is one mechanism they can actually control – unless spyware is involved.

You can be tracked with hidden patches or pockets ironed or stuck on to clothes or items. One girl found a tracker sewn into overalls she ordered from Shein. A variety of trackers can be attached to cars – under number plates, in seatbelt buckles, in fuel caps.

Yes, educate your kids about phones. And let them educate you. But remember, the technology is a means, a mechanism, which will shift in form over time.

The problem, ultimately, is the controlling behaviour.

Julia Baird is a journalist, author and regular columnist.

Here is the link:

https://www.smh.com.au/technology/when-does-tracking-become-stalking-tell-your-kids-there-s-one-key-warning-sign-20250516-p5lzpk.html

Here is a 21st-century problem if ever I saw one. Of course you want to know where the kids are and that they are all right! But do you track them just to ease your anxiety, and do you tell them that you are, or not?

This is way above my pay grade and I have no idea what the right answer is and for what circumstances!

Interested in views from readers….

David.

Thursday, May 22, 2025

Sometimes Simple And Cheap Is The Best! We Should Have Guessed This Outcome In Advance…

This appeared last week:

$10m was spent on these melanoma scanners. Doctors were better at detecting cancer

By Liam Mannix

May 17, 2025 — 4.31pm

A huge and much-hyped government investment into 3D skin cancer scanners has hit an unexpected snag after early data showed the scanners performed no better than a simple skin check from a GP – and may lead to overdiagnosis.

The new data has stunned researchers, who are debating whether this represents a blip that will be ironed out as the tech improves or a cautionary tale about the promise and perils of shiny new medical technology.

The Australian Cancer Research Foundation spent about $10 million in 2018 to set up 15 3D full-body cameras across Australia. The Queensland-based research centre established to run the network received another $25 million in federal government research funding, as well as funding from the camera’s manufacturer.

The scanners, each of which cost about $500,000, use dozens of cameras to generate a 3D image of a person, tracking the location of each mole and blemish.

When the first machines were installed in Australia in 2017 as part of a separate project, a glowing press release said the tech would “revolutionise melanoma detection”.

That revolution is not yet here.

In a study published earlier this year in JAMA Dermatology, researchers found adding the cameras to usual care led to a lot more lesions being cut out from volunteers’ skin – but no more melanoma being detected compared to standard skin checks. And the scanners added $945 per patient in healthcare costs.

“This study is like a cautionary tale,” said one leading melanoma researcher, working on a related project and granted anonymity to speak freely about the trial. “These are very costly devices. And they might not work if you don’t implement it properly. And you’re just wasting lots of money and potentially doing harm.”

“It does present a challenge for us going forward,” said Professor David Whiteman, a researcher at QIMR Berghofer and co-author of the study. “It does temper the enthusiasm a little for just how we go about dealing with skin cancer and its detection in Australia.”

Others disagree. Professor H. Peter Soyer heads the Australian Centre of Excellence in Melanoma Imaging & Diagnosis and led the study. “I still think our original vision, 3D total body imaging supported by AI … will basically lead to an improvement,” he said. “I have no doubt about it.”

Melanoma is Australia’s national cancer. We have the highest incidence rate in the world. More than 18,000 cases are diagnosed every year, and more than 1300 people die.

The cancer affects melanocytes, the pigment-producing cells responsible for skin’s colour, and occurs mainly in people with fair skin.

UV radiation from the sun is capable of directly altering the DNA code within these cells. Damage the code in genes crucial to controlling growth, and the cell can enter a frenzy of uncontrolled growth. If the new tumour grows deep enough, it can access the bloodstream and spread to our organs or brain.

That makes it a highly survivable cancer if it is detected early and cut out. Five-year survival rates for melanomas detected at stage 1 – when the cancer remains a single skin spot – are greater than 99 per cent.

If the cancer is detected at stage 2 – the cancer is at least a millimetre thick – that rate drops to 73 per cent. By stage 4, five-year survival rates plummet to 26 per cent.

Hence the interest in a screening program, which could theoretically pick up melanomas before they have a chance to spread. Late last year, the federal government committed $10.3 million to develop a road map.

But screening is harder than it would first appear. Diagnosis is somewhat subjective. Your skin is like a tapestry, painted with scars and freckles and moles – scientists call them naevi – that are not cancerous. And melanomas do not all look the same; there can even be disagreement among experts about which marks are benign and which cancerous. The accuracy of a visual test from a doctor ranges between 40 and 70 per cent.

“We are confronted with millions, really, of lesions on the skin that have to be assessed. And it’s very, very difficult for the practitioners to discriminate which are the nasty ones because there are no rewards for getting it wrong,” said Whiteman.

An effort to screen hundreds of thousands of Germans for melanoma led to no long-term reduction in cancer mortality; other screening studies have come to similar conclusions.

And then there’s the overdiagnosis problem. Melanoma was once a rare tumour but is now the third-most-commonly diagnosed in the US – an increase some refer to as an epidemic – yet there has been no actual increase in melanoma deaths.

This suggests, to some, we are cutting out way too many ‘could-be’ melanomas. “We are living in a fee-for-service society. If in doubt, cut it out,” said Soyer.

This is why the Australasian College of Dermatologists does not recommend melanoma screening.

“It is a waste of time. It’s not cost-effective. You stir up the worried well. They’ll have more procedures, so potentially there’s going to be more harm,” said incoming college president Dr Adrian Lim.

All of which brings us back to the 3D scanners.

The melanomas we really want to spot, and quickly, are the ones that are growing. What if you could quickly map out every spot on someone’s skin, and compare them, year-on-year, to spot malignancy? That’s the promise of Canfield Scientific’s VECTRA 360 system.

The patient steps inside the imager, where 92 cameras snap photos of every inch of exposed skin. A computer knits them together into a digital avatar, with each spot able to be analysed by a dermatologist – or an AI. And you can do it quickly, important if you’re going to screen millions of people.

In 2018, the Australian Cancer Research Foundation handed Soyer and his team $9.9 million to roll out the scanners – the only 3D cameras approved by Australia’s health regulator. “This is a significant game changer,” Soyer said in a 2021 video uploaded to Canfield’s YouTube channel. “This will allow us to detect your potential melanoma much, much earlier.”

Soyer has received consulting payments from Canfield Scientific since 2018, but the company was not involved in design or review of the study. The arrangement was “declared in all presentations and publications”, he said.

With all that promise, why are the early results so disappointing? Theories abound.

First, the study compared a 3D camera plus standard skin testing to just standard skin testing, so the intervention group was very heavily scrutinised. That could explain why so many extra moles were excised.

Why did the cameras not pick up more melanomas? Maybe because current skin checks work very well. “We have a very high bar,” said Lim.

The study also did not use the ability of the scanners to show a change in skin spots over time, a crucial melanoma symptom. “I do think that’s a key factor,” said Professor Anne Cust, who leads development of the skin cancer screening road map for Melanoma Institute Australia. And the machines may improve if AI is used to scan the data.

Soyer argues the results actually validated the technology because they “demonstrated that 3D imaging could identify skin lesions that should be reviewed by a dermatologist or clinician for appropriate treatment and diagnosis.”

But it may simply be the case that machines are no better than a trained doctor. “That’s our default position – that this is the challenge we’re going to really struggle to overcome,” Whiteman said. “The computer has got to do better than that. And, at the moment, it does not seem like it can.”

Here is the link:

https://www.smh.com.au/national/10m-was-spent-on-these-melanoma-scanners-doctors-were-better-at-detecting-cancer-20250516-p5lzue.html

What a fascinating outcome from a supposedly high-tech study! I am sure in time we will work out a way to automate screening, given how important it is to both screen and screen accurately!

Watch this space I guess!

David.

Wednesday, May 21, 2025

There Is Only A Small Chance Administrators and Clinicians Can Really See Each Other’s Perspective!

This appeared a few days ago!

Put patients first: let’s fix the conflict between hospital administrators and doctors

Patient safety should never come second to cultural problems in the health care sector.

Steve Robson

Hospital administrators and doctors must co-operate better in the interests of patient safety.

12:00 AM May 17, 2025

A series of stories in The Australian have revealed, in shocking detail, how widespread dysfunction in relationships between public hospital administrators and specialist doctors is putting patients’ lives at risk around the country.

Workplace conflict is rarely productive, but in our health system it has the potential to be catastrophic.

Our public hospitals can be dangerous places. There should be no excuses for accepting risk to patients as a consequence of cultural problems between healthcare workers and those who administer our hospitals.

I was given the responsibility of analysing the recent national doctors’ survey, undertaken by the Australian Salaried Medical Officers’ Federation – ASMOF. The findings of the survey were deeply concerning.

Only one quarter of responding doctors described their relationship with hospital administrators as respectful. More than two thirds of those doctors felt that health bureaucrats had little or no understanding of the clinical work of frontline doctors.

Perhaps most worrying of all, a staggering 75 per cent of hospital doctors reported that they felt uncomfortable reporting safety concerns due to fear of retribution.

More than half of the public hospital specialists in the survey reported being aware of colleagues who had suffered retribution after raising concerns with management.

Australia’s public hospitals have never been under greater pressure. Already challenged in meeting demand before Covid-19, the post-pandemic landscape has left record waiting lists for surgery and other medical procedures, overwhelmed public hospital specialist clinics, and swamped emergency departments.

There is no prospect of demand on our public hospitals reducing any time soon. Indeed, with a deluge of chronic conditions such as diabetes and mental health problems and challenges in securing GP appointments, our hospitals will face only greater and greater demand.

The only way we can ensure that Australians continue to have access to a world-class health system is with our public hospitals working at maximum efficiency and with a top-class healthcare workforce. There is no plan B for millions of Australians.

Workplace safety and high-performing healthcare staff are not luxuries. The Australian Commission on Safety and Quality in Health Care estimates that more than 10 per cent of all activity in public hospitals is the result of mistakes and adverse events. That represents billions of dollars wasted from an already cash-strapped hospital system.

If we are to minimise the risk of harm and medical mistakes, maximise the efficiency of our health system – and protect Australians – then improving the relationship between those who run our hospitals and those who provide the clinical care is not optional.

Doctors and fellow healthcare workers must feel safe in reporting safety and other concerns to hospital management.

Australians want to trust the care they receive in our public hospitals. They also want to have surgery in a timely manner, specialist clinic appointments before their conditions deteriorate, the best emergency department experience possible, and safe care when they do end up admitted to a hospital.

For these things to happen it is critical the health workforce is functioning at a peak, not burnt-out, frazzled and working in a hostile environment. Safety must be first and foremost and no doctor, nurse, or other hospital worker should be fearful of raising concerns for fear of reprisal.

Ongoing negotiations of the National Health Reform Agreement offer the perfect vehicle to address these issues. Incentives to smooth out relationships between managers and healthcare workers should be baked into the final agreement. The Federal Health Minister should expect – indeed, demand – proof from state and territory counterparts that dysfunctional relationships are repaired. Australians expect no less.

Spending on health is the single biggest item in every state and territory budget, and hospitals are the largest cost. With so many demands on the public purse, Australians have a right to expect that the health workforce is functioning at the highest level possible.

Righting the ship so that often-toxic relationships between hospital administrators and senior doctors are fixed must be a high priority. Every dollar spent on our public hospitals should yield the maximum benefit for Australian patients. Patient safety should be our prime goal.

Dysfunctional hospital workplaces put everyone at risk and are a drag on our economy at the worst possible time.

Steve Robson is professor of obstetrics and gynaecology at the Australian National University and former president of the Australian Medical Association. He is a board member of the National Health and Medical Research Council and a co-author of research into outcomes of public and private maternity care.

Here is the link:

https://www.theaustralian.com.au/health/medical/put-patients-first-lets-fix-the-conflict-between-hospital-administrators-and-doctors/news-story/b0047f3c769ef9d7e1f49d0906332216

What an amazingly naïve comment!

The clinicians and the administrators have fundamentally different drivers, interests and KPIs!

I am lucky enough to have been on both sides of these arguments and it really is a matter of perspective and motivations. Good clinicians know that they need good administrators to support them and administrators have no purpose without great clinicians to support.

The bottom line is that if both groups do their jobs well pain is minimized and success pretty much guaranteed! Basically it is a false and pretty silly dichotomy!

The class acts on both sides of the fence understand the game and just get on with their jobs!

Friday evening drinks can be a good way to sort most irritants out I have noticed! – but sadly it can be problematic with people needing to drive home! I am still not sure how to manage that issue! (Have partners come and pick people up at end of day?)

David.

Tuesday, May 20, 2025

It Looks Like Real Mega-Forces Are Moving To Really Change Our Lives!

This appeared last week:

I suspect we “ain’t seen nothing yet”!

AI is starting to work. The Trump drama could look like a sideshow

Lost among the Trump turmoil is the disruption caused by the AI revolution. It’s happening and Australian investors, politicians and business leaders are not ready.

James Thomson Columnist

May 16, 2025 – 9.53am

For the past few days, some of Australia’s top chief executives – including Commonwealth Bank’s Matt Comyn, NAB’s Andrew Irvine and Telstra’s Vicki Brady – have been bunkered down in the US city of Seattle, for one of Microsoft’s most exclusive and influential events.

The tech giant’s annual CEO summit has an exclusive guest list that includes many leaders from America’s Fortune 500 companies. Comyn, who has spent the last two weeks touring the US and nerds out on the detail of technology like few other Australian CEOs, says the Microsoft conference has become bigger each year, as the artificial intelligence revolution gathers pace.

“You see the sharp edge of the metaphorical spear in the US, and how driven and how focused and how intensely they’re working on some of the broader technology challenges,” Comyn tells AFR Weekend.

Comyn, who has done stints in Silicon Valley and at CBA’s Seattle tech hub over the past two weeks on his US tour, has already taken away some key AI lessons from his trip. He says the past six months have seen tremendous advances in what’s called reinforcement learning, where AI mimics the trial-and-error learning process that humans use to achieve their goals. He’s also been closely studying the cultural and leadership aspects of AI implementation.
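For readers unfamiliar with the term, reinforcement learning can be illustrated with a toy example. The sketch below is purely illustrative (a five-cell corridor, not anything CBA or Microsoft actually uses): an agent learns by trial and error, via tabular Q-learning, that moving right earns a reward.

```python
# Purely illustrative: tabular Q-learning on a five-cell corridor.
# The agent starts in cell 0 and is rewarded only for reaching cell 4;
# through trial and error it learns that "move right" is the best policy.
import random

random.seed(0)
N_STATES = 5
ACTIONS = [-1, +1]                        # move left / move right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2     # learning rate, discount, exploration

for _ in range(500):                      # 500 episodes of trial and error
    s = 0
    while s != N_STATES - 1:
        if random.random() < epsilon:     # explore occasionally...
            a = random.choice(ACTIONS)
        else:                             # ...otherwise act greedily
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else 0.0
        # Core update: nudge Q toward observed reward plus best future value.
        best_next = max(Q[(s_next, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s_next

# The greedy policy in every non-terminal cell should now be +1 (right).
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)}
print(policy)
```

The systems Comyn is describing apply the same principle at vastly greater scale, but the loop is the same: act, observe the outcome, and adjust.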

“There’s a big difference between big companies that are navigating that transition well and successfully, which is not necessarily easy,” Comyn says.

But his biggest message is that the AI revolution is moving faster than ever – and Australia may not be ready.

For many consumers, it may seem that the initial hype that accompanied the release of ChatGPT has faded, and generative AI models are simply better versions of existing tools – a smarter way to search the web, for example, or a souped-up virtual assistant.

But inside some of the world’s big businesses, things are changing, and fast.

“I think there are interesting questions about how and where that evolves, and how well equipped Australia is,” Comyn says. “The disruptive potential over a three- to five-year timeframe is significant, and there’s a lot of preparatory work and policy work and thought that needs to go into that, at an economy and policy and regulatory level.”

There’s been plenty of discussion about issues such as the infrastructure Australia needs to ride the AI wave, and what regulatory guidelines should steer the development of the sector. But if AI delivers on its promise – that is, if it can augment and replace human workers as tech giants like Microsoft expect – then much more complex and difficult questions will need to be addressed, including around vexed issues like welfare and taxation.

The disruption unleashed by AI could even make the febrile debate around Labor’s proposed tax on unrealised gains for savers with more than $3 million in superannuation look like a sideshow.

One of the most notable aspects of the powerful sharemarket rally that has greeted the cooling of US President Donald Trump’s trade war is the resurgence of America’s tech giants. Having led a two-year rally on Wall Street, the likes of Amazon, Microsoft, Meta Platforms and chipmaker Nvidia were hit first by the emergence of Chinese AI start-up DeepSeek, and then by Trump’s tariff war.

But it’s been a very different story this week. The pause in the tariff war between the US and China sent tech stocks surging, before Trump’s visit to Saudi Arabia, and the promise of huge spending by the Saudi Kingdom on AI infrastructure, added further momentum to the melt-up. Since the start of this week, Nvidia stock has surged 16 per cent, taking its gains since Wall Street bottomed on April 8 to more than 40 per cent.

But the evidence that AI is starting to change the global economy goes well beyond financial markets.

Earlier this week tech giant Microsoft announced it would cut 3 per cent of its workforce – about 6000 workers – in what has been described as a “delayering” exercise designed to remove management levels.

But according to data uncovered by Bloomberg, about 40 per cent of the 2000 staff laid off in Microsoft’s home state of Washington were software engineers. That’s not surprising; last month, Microsoft chief executive Satya Nadella said that about 30 per cent of the code written inside the company was being written by AI.

Presumably, Microsoft’s own popular AI coding tool, called GitHub Copilot, is doing a lot of the heavy lifting inside the tech giant – and doing the company’s own staff out of work.

The Microsoft sackings were a concrete example of a shift in mood that appears to be taking place across the US business community.

An article in The Wall Street Journal this week suggested that the war for talent has become a war on talent, with CEOs telling staff they need to work harder and stop complaining.

The paper recalled JPMorgan chief executive Jamie Dimon’s now-infamous leaked tirade against remote work from an internal meeting held earlier this year – “I’ve had it with this kind of stuff. I’ve been working seven days a week since COVID, and I come in, and where is everybody else?” – and a brutal comment from Emma Grede, co-founder of Kim Kardashian’s shapewear company Skims and chief executive of clothing label Good American, co-founded by Khloe Kardashian.

“Work-life balance is your problem. It isn’t the employers’ responsibility,” she said on a podcast this month.

What’s led to this change in tone? At least in part, the Journal argued, it is a reflection of the advances in AI.

A changing workforce

At Shopify, the $225 billion e-commerce giant, chief executive Tobi Lütke recently wrote a company-wide memo instructing managers not to hire new staff before making sure their roles could not be replicated by AI.

“Before asking for more headcount and resources, teams must demonstrate why they cannot get what they want done using AI,” he wrote. “What would this area look like if autonomous AI agents were already part of the team?”

At Salesforce, chief executive Marc Benioff has said the company will reduce its hiring of engineers this year due to the use of AI.

At US freelance marketplace Fiverr, chief executive Micha Kaufman shared on social media a memo sent to staff last month warning them that unless they become “an exceptional talent at what you do … you will face the need for a career change in a matter of months”.

“It does not matter if you are a programmer, designer, product manager, data scientist, lawyer, customer support rep, salesperson or finance person – AI is coming for you,” Kaufman wrote. “AI is coming for your jobs. Heck, it’s coming for my job too. This is a wake-up call.”

All of this is, of course, anecdotal evidence, and it’s easy to get gloomy about the potential for AI to wipe out large numbers of jobs.

Raphael Arndt, chief executive of the Future Fund, points out that unemployment is hovering near historical lows right around the world, and AI can play an important role in meeting worker shortages in areas like healthcare, where humanoid robots could eventually prove invaluable in meeting the needs of an ageing population.

The chief executive of $28 billion ASX-listed accounting software giant Xero, tech veteran Sukhinder Singh Cassidy, is a believer in the disruptive power of AI. “It’s not the future – it’s here now,” she said on Thursday, after delivering another impressive earnings result.

But adoption can be slower than expected; when Singh Cassidy was an executive at Google, she remembers being told by then-chief executive Eric Schmidt back in 2003 that cloud computing would take over the world. More than two decades later, she says Xero is still working hard to bring customers into the cloud.

But Craig Scroggie, chief executive of ASX-listed data centre operator NextDC, says the trend is clear – companies always find ways to cut costs and maximise profits.

“We’ve been thinking about lower-cost workforce options for decades. We’ve outsourced to lower-cost countries for decades, but now we have a tech-based knowledge base with the ability to put a workforce’s entire knowledge base in one system,” Scroggie said at the Macquarie Australia conference last week.

Like it or not, he’s right. The sheer weight of money being ploughed into generative AI around the world – estimated at just over $1 trillion by market research firm Gartner – will demand a return.

Bank of America strategist Michael Hartnett says there are two ways this can go: either companies adopt AI without laying workers off, which will lead to pressure on profit margins and share prices, or AI adoption unleashes a productivity-enhancing wave of unemployment.

In the latter scenario, Hartnett argues, “US politicians would move to protect US workers via wealth taxation.”

Protecting jobs

It wouldn’t just be America facing the question of how to protect citizens forced out of the labour force by artificial intelligence. If Comyn’s prediction is correct, and AI disruption arrives in three to five years, the potential erosion of the tax base from AI-related job losses could be significant; even if 10 per cent of Australian jobs were hit by the technology, that would mean about 1.4 million extra people out of work.
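The arithmetic behind that figure is easy to check. The sketch below assumes, as the 1.4 million figure implies, an Australian workforce of roughly 14 million employed people — an assumption for illustration, not a figure stated in the article:

```python
# Back-of-the-envelope check of the article's "1.4 million" figure.
workforce = 14_000_000   # assumed: roughly 14 million employed Australians
share_hit = 0.10         # the article's "10 per cent of jobs" scenario
jobs_affected = workforce * share_hit
print(f"{jobs_affected / 1_000_000:.1f} million jobs")  # 1.4 million jobs
```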

Sound crazy? Remember, Microsoft has already cut 3 per cent of its workforce, and it’s at the very start of this journey.

But it’s also important to note that these job losses would collide with a nasty shift in demographics; about 80,000 Baby Boomers will turn 80 in 2027, with the number of new octogenarians each year hovering at about 60,000 for the next 20 years, according to modelling by the Australian Bureau of Statistics.

This combination means the oft-repeated suggestions that Labor’s tax on unrealised gains in super funds with over $3 million will eventually morph into other wealth taxes – higher capital gains taxes or even inheritance taxes – may well prove prescient.

But the pressure to tax wealth would be bipartisan; as Bank of America’s Hartnett says, the combination of fewer individual taxpayers and the costs associated with an ageing population may mean there are few alternatives.

In a world filled as it currently is with uncertainty, it’s hard for workers, investors, politicians and business leaders to grapple with the risks and opportunities AI will bring; the hype has been overdone, the impacts on our daily lives are currently pretty minor, and it’s hard to shake the feeling of helplessness – AI will happen to Australia, and there’s not much we can do about it.

But if nothing else, it’s important to realise the potential scale of the disruption could make today’s big issues – trade dramas, and tax changes – look very irrelevant, very quickly.

Here is the link:

https://www.afr.com/policy/economy/labor-s-3m-super-tax-may-be-start-of-a-wealth-grab-just-ask-chatgpt-20250515-p5lzny

I have to say it is becoming ever harder to know where all this is heading, but the curse “may you live in interesting times” seems to become more of a worry by the week!

It certainly seems that the pace of change is accelerating and keeping up is getting harder and harder.

It is also clear that the impact of this change is going to be wider and deeper than most suspected.

My only advice is to “buckle in, it is going to be a wild ride”!

David.

Sunday, May 18, 2025

I Am Not Sure I Would Be Keen To Know The Result Of This Blood Test!

This seems like a mixed blessing to me!

‘Game changer’ Alzheimer’s blood test cleared in the US

Gerry Smith and Robert Langreth

May 17, 2025 – 10.09pm

US regulators have approved the first blood test to help diagnose Alzheimer’s disease, potentially making it easier to find and treat patients with the mind-robbing disease that affects nearly 7 million Americans.

The test, made by Fujirebio Diagnostics, a unit of Japan’s H.U. Group Holdings, was cleared for people 55 years and older who exhibit signs and symptoms of the disease, the US Food and Drug Administration said in a statement.

It is designed for the early detection of amyloid, a protein that can build up in the brain and is a hallmark of Alzheimer’s, the most common form of dementia in the elderly.

The development and approval of blood tests that can spot which patients are likely to have toxic amyloid in their brains has been viewed as a critical step toward making drugs to treat the condition more widely accessible.

While the test is approved for people who are already exhibiting signs of cognitive impairment, studies show amyloid begins accumulating in the brains of some patients years before symptoms begin.

Howard Fillit, cofounder and chief science officer at the Alzheimer’s Drug Discovery Foundation, called the approval “a major milestone for patients and clinicians.”

“The ability to diagnose Alzheimer’s earlier with a simple blood test, like we do for cholesterol, is a game changer, allowing more patients to receive treatment options that have the potential to significantly slow or even prevent the disease,” Fillit said in a statement.

To qualify for drug treatment, Alzheimer’s patients now generally get a specialised PET scan to detect amyloid in their brains or undergo a cerebrospinal fluid test. The PET scans are expensive and require specialised equipment, while the spinal fluid tests involve an invasive procedure.

The need for these tests has slowed the rollout of new Alzheimer’s drugs like Leqembi, from Eisai and Biogen, and Eli Lilly & Co’s Kisunla.

The FDA approval was a “much-needed win” for the companies whose treatments have struggled to gain traction due to logistical hurdles, said Evan Seigerman, an analyst at BMO Capital Markets.

“Not a sea change, but today’s announcement could start to help these franchises gain some more momentum,” Seigerman wrote in a note to clients.

Fujirebio’s newly approved test, called Lumipulse, only requires a blood draw, making it less invasive and potentially easier for patients to access. It’s unclear how much it will cost or when it will be available. It’s intended for patients at a specialised care setting who are experiencing cognitive decline, according to the FDA.

The blood test shouldn’t be used alone to diagnose the disease, in part because of the risk of false positive or negative results, the agency said. Other clinical evaluations and additional tests should be used to determine treatment options, it said.

Here is the link:

https://www.afr.com/companies/healthcare-and-fitness/alzheimer-s-blood-test-cleared-in-the-us-in-game-changer-20250517-p5m033

This is a test I would hope I never need and, if positive, that I would be sufficiently far gone not to be able to understand or care what the result was!

I guess the point of doing the test is to exclude other (treatable) causes of dementia so sensible conservative care can be initiated and the individual made as comfortable as possible while awaiting the inevitable.

Where do others see a blood test of this sort fitting in, if at all?

David.

AusHealthIT Poll Number 794 – Results – 18 May 2025.

Here are the results of the recent poll.

Do Labor Or The Liberals Do Better Running The National Health System?

Labor Is Better: 14 (64%)

Liberal Is Better: 6 (27%)

I Have No Idea: 2 (9%)

Total No. Of Votes: 22

It seems clear that most think Labor is better at running the Health System.

Any insights on the poll are welcome, as a comment, as usual!

Pathetic voter turnout – answer must have been too easy. 

2 of 22 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.