Quote Of The Year

Timeless Quotes - Sadly The Late Paul Shetler - "It's not Your Health Record, it's a Government Record Of Your Health Information"

or

H. L. Mencken - "For every complex problem there is an answer that is clear, simple, and wrong."

Wednesday, June 05, 2024

It Looks Like There Is A Battle Raging For The Soul Of AI Going Forward.


This appeared last week:

Australian at the centre of the high-stakes battle over AI has a warning for the world

June 1, 2024 — 5.30am

There’s a war raging about the future of artificial intelligence – the technology that is already disrupting our economy, our jobs, and our social media feeds – and Melbourne-born Helen Toner is at its centre.

Toner, who grew up in Melbourne's south-east, was ousted as a board member of ChatGPT maker OpenAI last year after an unsuccessful attempt to force out its chief executive Sam Altman.

At the crux of the bloodletting is a war over AI safety and ethics – as well as a clash of personalities – that will shape the future of the generative AI sector as well as society more broadly.

Helen Toner in May cautioned against over-relying on AI chatbots, saying there was “still a lot we don’t know” about them.

AI’s promise is magical, offering massive benefits in productivity, efficiency and creativity. But its potential pitfalls are just as severe. Some Australians are already losing their jobs to AI, with workers in data input and sales among the most affected. Others are using the technology to be more efficient. And the nation’s parliamentarians and regulators are being left in the dust.

Toner believes that AI’s roll-out should not be left to private technology companies like OpenAI, the company that released the wildly popular chatbot ChatGPT less than two years ago.

“It would be a mistake to rely on corporate self-governance structures to handle all the challenges of AI,” Toner said in an interview with this masthead.

“The challenge with AI and making policy for it is that it’s a general-purpose technology. It can be used in literally every sector, for literally everything ... But AI systems are already causing harm. Robodebt is an example of an extremely basic system that caused a lot of harm to a lot of people.

“Personally, I am pretty worried about some of the worst cases from AI. But I also worry they’re so sexy and attention-grabbing that they end up sucking all the oxygen and not leaving room for all the other issues.”

And there are plenty of issues, for a technology that has raced into our workplaces, newspapers, songs and movies seemingly in the blink of an eye.

Toner, who is in her early 30s and is an AI policy researcher, graduated from Melbourne Girls Grammar with a perfect VCE score. She joined OpenAI’s board in late 2021 after stints in China, where she studied its AI industry, and Washington DC, where she helped form Georgetown’s Centre for Security and Emerging Technology, a think tank focused on AI and national security, where she still works today.

Her subsequent departure from OpenAI’s board was widely characterised at the time as a showdown between ethics and profits. Between slowing down and speeding up.

Instead, Toner says there was board mistrust and that Altman had created a toxic atmosphere; claims that Altman and board chair Bret Taylor have denied.

For Toner, it is critical that governments – including Australia’s – play an active role and tech companies not be left to their own devices or trusted to self-regulate what’s quickly becoming a massively important sector.

As of right now, however, it’s a losing argument.

This month, Google’s AI-based search variously told users to eat at least one small rock a day, to thicken pizza sauce using 1/8 of a cup of non-toxic glue, and to stare at the sun between five and 15 minutes a day.

It’s unpredictable technology that clearly isn’t ready for prime time, but it doesn’t matter.

We’re quickly entering an era in which technology companies – predominantly US-based heavyweights like Google, Meta, Nvidia and OpenAI – are racing to build generative AI into every product and service we use, even if the results are wrong or nonsensical.

Companies like Google and Meta are hoping generative AI will supercharge their platforms, making them far more engaging – and useful – than they were before. And there’s a lot of money at stake: it is estimated generative AI will be a $2 trillion market by 2032.

Most of Google’s billions of global users may not have used a chatbot before, but will soon be exposed to AI-generated text in its answers. Similarly, many of the images you scroll through on Facebook, or see in the pages of The Daily Telegraph, are now generated by AI.

This week, an image spelling out “All Eyes on Rafah” was shared by more than 40 million Instagram users, many of whom would have had no idea it was likely generated by artificial intelligence.

AI’s rapid ascent into the zeitgeist is reminiscent of bitcoin’s rise five years ago. As with bitcoin, everyone is talking about it, but no one really understands how it works. Unlike bitcoin, however, generative AI’s potential, as well as its impact, is very real.

According to Toner, no one truly understands AI, not even experts. But she says that doesn’t mean we can’t govern it.

“Researchers sometimes describe deep neural networks, the main kind of AI being built today, as a black box,” she said in a recent TED talk. “But what they mean by that is not that it’s inherently mysterious, and we have no way of looking inside the box. The problem is that when we do look inside, what we find are millions, billions or even trillions of numbers that get added and multiplied together in a particular way.

“What makes it hard for experts to know what’s going on is basically just, there are too many numbers, and we don’t yet have good ways of teasing apart what they’re all doing.”
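To make that "too many numbers" point concrete, here is a toy Python sketch (my own illustration, not anything from OpenAI or Toner): the entire "model" below is just two small arrays of made-up numbers, and running it is nothing more than multiplying and adding them together. Scale the same idea up to billions of learned weights and you get the black box she is describing.

```python
# Toy illustration only: a "network" is just arrays of numbers that get
# multiplied and added. Real models have billions of such numbers.
import numpy as np

rng = np.random.default_rng(0)

# Two made-up weight matrices standing in for a tiny two-layer network.
W1 = rng.standard_normal((8, 16))   # layer 1: 8 inputs -> 16 hidden units
W2 = rng.standard_normal((16, 4))   # layer 2: 16 hidden units -> 4 outputs

def forward(x):
    """One forward pass: multiply by the weights, add up, apply a simple non-linearity."""
    h = np.maximum(0, x @ W1)       # multiply-and-add, then ReLU
    return h @ W2                   # multiply-and-add again

x = rng.standard_normal(8)          # a single input of 8 numbers
print(forward(x))                   # 4 output numbers
print("numbers in this toy model:", W1.size + W2.size)  # 192 here, billions in GPT-class models
```

The point is not the specific values: even this tiny example is already hard to interpret by staring at the weights, which is exactly the difficulty Toner describes.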

How AI works

Deep neural networks are the complex systems that power large language model chatbots like ChatGPT, Gemini, Llama and LaMDA.

They’re effectively computer programs that have been trained on huge amounts of text from the internet, as well as millions of books, movies and other sources, learning their patterns and meanings.

As ChatGPT itself puts it, first you type a question or prompt into the chat interface. ChatGPT then tokenises this input, breaking it down into smaller parts that it can process. The model analyses the tokens and predicts the most likely next tokens to form a coherent response.

It then considers the context of the conversation, previous interactions, and the vast amount of information it learned during training to generate a reply. The generated tokens are converted back into readable text, and this text is then presented to you as the chatbot’s response.
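As a rough illustration of that tokenise, predict and detokenise loop, here is a deliberately simplified Python sketch. The word-level tokeniser and the frequency-table "model" are hypothetical stand-ins of my own; real chatbots use subword tokenisers and huge neural networks, and they sample from a probability distribution rather than always taking the single most likely next token.

```python
from collections import defaultdict

# Hypothetical word-level "tokeniser": real systems split text into subword tokens.
def tokenise(text):
    return text.lower().split()

def detokenise(tokens):
    return " ".join(tokens)

# Hypothetical "model": a table of next-word counts learned from a tiny corpus,
# standing in for the billions of learned weights in a real network.
corpus = "the cat sat on the mat and the cat slept"
counts = defaultdict(lambda: defaultdict(int))
words = tokenise(corpus)
for current, nxt in zip(words, words[1:]):
    counts[current][nxt] += 1

def predict_next(token):
    candidates = counts.get(token)
    if not candidates:
        return "."                              # nothing learned for this token: stop
    return max(candidates, key=candidates.get)  # most frequent next token seen in training

def generate(prompt, max_new_tokens=5):
    tokens = tokenise(prompt)                   # 1. tokenise the input
    for _ in range(max_new_tokens):
        tokens.append(predict_next(tokens[-1])) # 2. predict the next token, append, repeat
    return detokenise(tokens)                   # 3. convert the tokens back into readable text

print(generate("the cat"))  # prints the prompt plus five greedily predicted words
```

Crude as it is, it follows the same shape the article describes: break the prompt into tokens, repeatedly predict a next token from what has come before, then stitch the tokens back into text.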

Apart from the war over ethics and safety, there is another stoush brewing over the material used to train the likes of ChatGPT. Publishers like News Corp have signed deals allowing OpenAI to learn from their content, while The New York Times is suing OpenAI over alleged copyright infringement.

For now, the chatbots are working with limited datasets and in some cases faulty information, despite rapidly popping up in every classroom and workplace.

A recent RMIT study found 55 per cent of Australia’s workforce are using generative AI tools like ChatGPT at work in some capacity. Primary school teachers are creating chatbot versions of themselves to work with students, and ad agency workers are using ChatGPT to create pitches in minutes, work that would have taken hours.

Parliamentarians are wondering how to react. Some 20 years after Mark Zuckerberg invented Facebook, the Australian parliament is grappling with the prospect of enforcing age verification for social media. Decades into the advent of social media we are still coming to terms with its effects and how we might want to rein it in.

People close to the technology, including Toner, are warning governments to not make the same mistake with AI. They say there’s too much at stake.

Some argue the nation’s parliament is likewise years behind in grappling with artificial intelligence. Science and industry minister Ed Husic says he is keenly aware of the issue: he’s flagged new laws for AI use in “high-risk” settings and has appointed a temporary AI expert group to advise the government.

Researchers and industry members say those efforts have lacked urgency, however. A senate committee on the adoption of the technology in May heard that Australia has no laws to prevent a deepfake Anthony Albanese or Peter Dutton spouting misinformation ahead of the next federal election.

“I’m deeply concerned at the lack of urgency with which the government is addressing some of the risks associated with AI, particularly as it relates to Australian democracy,” independent senator David Pocock told this masthead.

“Artificial intelligence offers both opportunities and huge risks.”

Pocock wants specific laws to ban election-related deepfakes while others, including Australian Electoral Commission chief Tom Rogers, think codes of conduct for tech companies and mandatory watermarking would be more effective.

Either way, there’s a broad consensus that Australia is far behind other jurisdictions when it comes to grappling with both the risks and opportunities presented by AI. Simon Bush, chief executive of peak technology lobby group AIIA, fronted the Senate hearings and pointed out that, according to several surveys, Australia ranks second-last globally in adopting AI across the economy.

“The rest of the world is moving at pace,” he said. “This is a technology that is moving at pace. We are not.”

The most recent federal budget allocated $39 million for AI advancement over five years, which Bush says is a negligible amount compared to the likes of Canada and Singapore, whose governments have committed $2.7 billion and $5 billion respectively.

For Bush, the narrative around fear and Terminator-esque imagery has been too pronounced, at the expense of AI adoption. He wants Australia to help build the technology its citizens will inevitably end up using.

“Australians are nervous and fearful of AI adoption, and this is not being helped by the Australian government running a long, public process proposing AI regulations to stop harms and, by default, running a fear and risk narrative,” he told the senate committee hearing.

Toner says, however, that Australia, as with other countries, should be thinking about what kind of guardrails to put around these systems that are already causing harm and spreading misinformation. “These systems could change pretty significantly over the next five, 10 or 20 years, and how do you get ready for that? That’s definitely something we need to grapple with.”

While Australia dithers, the tech is moving forward whether we like it or not.

Toner wants us to not be intimidated by AI or its developers, and says our collective involvement is crucial in shaping how AI technologies are used. “Like the factory workers in the 20th century who fought for factory safety, or the disability advocates who made sure the World Wide Web was accessible, you don’t have to be a scientist or engineer to have a voice.”

The very first step, for Toner, is to start asking better questions. “I come back to this question of, ‘is it just hit the accelerator or the brakes’. Or you know, are we thinking about who is steering? How well does the steering work, and how well can we see out of the windscreen? Do we know where we are, do we have a good map?

“You know, thinking about all these kinds of things, as opposed to just floor it and hope for the best.”

Here is the link:

https://www.smh.com.au/technology/australian-at-the-centre-of-the-high-stakes-battle-over-ai-has-a-warning-for-the-world-20240528-p5jh5v.html

I have to say that what I see here is what I would expect as powerful and complex technologies are rolled out: contention between those who want to rush forward and those who favour a slower, safer and steadier path. We have to hope a balance will be found where the pace of progress is measured but still rapid enough!

We really do live in very interesting times!

David.

Tuesday, June 04, 2024

I Had Not Realized This Has Become As Acute A Problem As It Clearly Is…


This appeared last week:

Calls for a national register to stem the tide of doctor suicides

EXCLUSIVE
By Natasha Robinson

Health Editor

3:21PM June 1, 2024

Leading doctors have called for the introduction of a national register of healthcare worker suicides amid indications that the true rate of doctors and nurses taking their own lives is much higher than statistics indicate.

Anti-suicide campaigner and cardiologist Geoffrey Toogood says the rate of clinicians dying amid unsustainable and growing pressures in health systems is shockingly high.

There are no official national statistics recorded on healthcare worker suicide. Peer-reviewed literature has assessed the rate of doctors’ suicide as being 34 per cent higher than the general population, based on coronial records. However, Dr Toogood believes the figure is higher.

“I have heard of 12 suicides just in the last six months, in different parts of the country, of different ages, and they are just the ones I have heard about,” Dr Toogood said.

“You can’t put words into the feelings you have when you hear about it.”

Dr Toogood – who founded the charity CrazySocks4Docs to break down stigma and discrimination around mental health issues in medicine – said it was crucial to identify risk factors.

A national register could be linked to a service that could investigate the causes behind clinician suicide when families consent.

“How else are we going to work out the contributing factors?” Dr Toogood said.

“Suicidality is a very difficult subject, it can be very hard to know what has actually tipped that person over the edge at that time, but we really need to work out what is going on.”

The push for health system reform to protect doctors and nurses was given fresh impetus a week ago when Australian Medical Association president Steve Robson detailed his own suicide attempt 35 years ago and highlighted medicos’ tendency to be enormously self-critical and reluctant to seek help, coupled with sometimes dangerous working conditions amid medicine’s often “toxic culture”.

The article highlighted grave risks to doctors under investigation by the health regulator. The Australian Health Practitioner Regulation Agency’s chief executive Martin Fletcher responded this week, saying the issue was of the highest importance and staff were working to lessen the intense stigma currently associated with being the subject of a notification, as well as working with Professor Robson to further combat the issue.

“I think it’s incredibly important that we do everything we can to widen the gap between somebody thinking about harming themselves, and acting on that,” Dr Fletcher said.

“The stress that people feel they’re under in the health system is significant. Certainly we hear a lot of concerning stories about people feeling completely desperate. We really welcome a light being shone on this.

“We absolutely acknowledge our part in this.”

Ophthalmologists will join the call to better protect doctors’ mental health at a leaders’ forum on Saturday ahead of CrazySocks4Docs day next week. The specialty has lost members of its ranks this year to suicide.

“Like all medical communities across the country, ours has not gone untouched, and we have experienced the loss of esteemed colleagues and friends to this silent pandemic,” said Australian Society of Ophthalmologists president Peter Sumich.

Townsville doctor Sarah Kleinman said the town had lost five doctors in the past two years. “Since I graduated I’ve just come to expect that one or two of my colleagues will die every year and I’m getting tired of us accepting that that’s just the norm,” Dr Kleinman said.

“It’s time for us to stop having the old standard of ‘physician heal thyself’.”

Here is the link:

https://www.theaustralian.com.au/nation/calls-for-a-national-register-to-stem-the-tide-of-doctor-suicides/news-story/50643a8db92d02264f691c008c43a72f

Doctor suicide is a very complex issue, aggravated in part by public expectations of doctors as well as by the ready access doctors have to the means of acting on a suicidal impulse. Sadly, medical training equips you with the knowledge and the tools to end it all…

The other aspect of this, to me, is the level of expectation many doctors work under, with the risk that on occasion they will feel they have fallen short and act accordingly.

Doctors are also, as a group, rather less inclined to seek help than others – noting that suicide is a cause for concern in most professions.

A complex and very sad issue!

David.

Sunday, June 02, 2024

I Somehow Think The Chicken Has Flown The Coop On This!

This appeared last week:

E-cigarette use rising in NSW despite vapers saying they want to quit

The number of people using vapes in NSW increased last year, but survey data suggests more than half are considering quitting in the next six months.

According to figures from the latest NSW Population Health Survey, published by the Ministry of Health on Friday, almost 19 per cent of people aged 16 to 24 identify as current users of vapes, also known as e-cigarettes.

Self-reported vaping rates increased in every age group in the 2022-23 survey. However, among under-35s – who have the highest vaping rates – the increase in uptake was smaller than the previous year.

Federal and state governments are cracking down on vaping, amid concerns from health bodies that the devices have addicted a new generation to nicotine.

On January 1, the federal government banned importation of disposable vapes, commonly sold at convenience stores, regardless of whether they contained nicotine. Further import restrictions were introduced in March, seeking to limit vaping to solely a smoking cessation aid, prescribed by a doctor.

In the first quarter of 2024, NSW Health seized more than 124,000 illegal nicotine e-cigarettes from retailers.

The efficacy of these measures remains to be seen, but data from a second survey conducted by state health authorities, Cancer Institute NSW’s Smoking and Health Survey, suggests vapers are considering quitting their habit at the same rate as smokers.

Fifty-one per cent of vapers and 55 per cent of smokers were considering quitting in the next six months. One in five people who vaped said they had thought about quitting daily over the past two weeks.

The survey, conducted in mid-2023 and also published this week, interviewed 1200 adults from across the state, including smokers, vapers and those who did not smoke or vape.

It was the first time the survey, also conducted annually, had asked about vaping habits.

Cancer Institute NSW CEO Professor Tracey O’Brien said it was great to see that vapers were considering quitting at the same rate as smokers, as both could cause considerable health harms.

She said she was also particularly pleased to see more than 80 per cent of people aware of vapes agreed they were unsafe to use, up from 73 per cent in 2021.

“[It is] a sentiment we hope will continue to grow,” she said.

“There are ongoing efforts in NSW to educate the community about the harms of smoking and vaping and I applaud everyone working to prioritise their health and wellbeing.”

However, O’Brien said health authorities could not be complacent as vaping rates continued to increase, particularly among young people.

“Like cigarettes, vapes are also full of harmful chemicals that have been known to cause cancer and there is growing evidence that young people who vape are more likely to take up smoking, which can significantly increase their cancer risk,” she said.

“We are very concerned that a new generation of people will become addicted to smoking if vaping use continues to increase in young people, which is why it’s important that people avoid taking up vaping or seek help to quit.”

The Cancer Council NSW’s Generation Vape survey of teenagers aged 14 to 17 suggests a third have tried a vape, half of whom had never previously smoked a cigarette.

Last year, six young people presented to NSW emergency departments with symptoms including seizures, loss of consciousness and vomiting, after using vapes purchased on Snapchat. Department incident logs show state schools have dealt with several cases of students selling vapes.

A report on vaping published by the NSW Advocate for Young People in November found high school students wished nicotine vapes were harder to access, and wanted to receive help to quit without fear of punishment.

NSW Health Minister Ryan Park said he strongly supported the federal government’s vaping reforms.

“The overwhelming advice from medical experts is for a prescription model in which e-cigarettes can only be obtained for medical purposes,” he said.

“It’s encouraging to see that among young people who vape there is a strong desire to quit.”

Here is the link:

https://www.smh.com.au/national/nsw/e-cigarette-use-rising-in-nsw-despite-vapers-saying-they-want-to-quit-20240531-p5ji84.html

Given the potency, competitive cost and wide distribution of vapes, it is hard to believe the cat has not already slipped out of the bag on this one. There is just too much money to be made by the illegal sellers and importers of vapes to think they will simply give up this lucrative little side-line!

I am prepared to bet that if we come back in five years vapes will be far more common than they are today.

Does anyone think I will lose money?

David.

AusHealthIT Poll Number 749 – Results – 02 June 2024.

 Here are the results of the poll.

Should The Federal Government Be Doing More To Prevent Prescription Drug Shortages Like The One We Are Experiencing At Present?

Yes                                                                             29 (97%)

No                                                                                  1 (3%)

I Have No Idea                                                              0 (0%)

Total No. Of Votes: 30

A very clear-cut vote suggesting more needs to be done!

Any insights on the poll are welcome, as a comment, as usual!

A very good voting turnout. 

0 of 30 who answered the poll admitted to not being sure about the answer to the question!

Again, many, many thanks to all those who voted! 

David.