Again, in the last week, I have come across a few reports and news items which are worth passing on.
These include first:
World Wide Web Consortium's SPARQL query technology published; Semantic Web could impact Google, Internet ad models, analyst says
Paul Krill (InfoWorld) 16/01/2008 08:12:19
The Semantic Web, a concept tossed around for years as a Web extension to make it easier to find and group information, is getting a critical boost Tuesday from the World Wide Web Consortium (W3C).
W3C will announce publication of SPARQL (pronounced "sparkle") query technology, a Semantic Web component enabling people to focus on what they want to know rather than on the database technology or data format used to store data, W3C said.
The potential of the Semantic Web should not be underestimated. By scanning the Web on behalf of users, even Google's ad-based business model could be impacted, an analyst said.
SPARQL queries express high-level goals and are easier to extend to unanticipated data sources. The technology overcomes limitations of local searches and single formats, according to W3C.
"[SPARQL is] the query language and protocol for the Semantic Web," said Lee Feigenbaum, chair of the RDF (Resource Description Framework) Data Access Working Group at W3C, which is responsible for SPARQL.
Already available in 14 known implementations, SPARQL is designed to be used at the scale of the Web to allow queries over distributed data sources independent of format. It also can be used for mashing up Web 2.0 data.
The Semantic Web, the W3C said, is intended to enable sharing, merging, and reusing of data globally. "The basic idea of the Semantic Web is take the idea of the Web, which is effectively a linked set of documents around the world, and apply it to data," Feigenbaum said.
Continue reading the quite long article below
This is an important announcement from the World Wide Web Consortium as it flags progress towards making all sorts of disparate information sources more easily searchable and accessible. The range of possible applications to the e-Health sector's information silos is obvious!
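For readers wondering what all this actually means in practice, here is a toy sketch of the triple model that SPARQL queries operate over. The data, the "ex:" names and the matcher are all invented for illustration; real SPARQL engines (such as the 14 implementations mentioned above) query distributed RDF stores, not an in-memory Python list.

```python
# An RDF dataset is simply a set of (subject, predicate, object) triples.
# All data here is made up for illustration.
triples = [
    ("ex:patient1", "ex:hasName", "Jane Citizen"),
    ("ex:patient1", "ex:treatedAt", "ex:hospitalA"),
    ("ex:patient2", "ex:hasName", "John Smith"),
    ("ex:patient2", "ex:treatedAt", "ex:hospitalB"),
    ("ex:hospitalA", "ex:locatedIn", "Melbourne"),
]

def match(pattern):
    """Return variable bindings for one triple pattern - the basic
    building block of a SPARQL query. Variables start with '?'."""
    results = []
    for triple in triples:
        binding = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                binding[p] = t      # bind the variable to this term
            elif p != t:
                break               # constant term does not match
        else:
            results.append(binding)
    return results

# Roughly the SPARQL query:
#   SELECT ?who WHERE { ?who ex:treatedAt ex:hospitalA }
print(match(("?who", "ex:treatedAt", "ex:hospitalA")))
# [{'?who': 'ex:patient1'}]
```

The point the W3C is making is that the query names what you want to know, not which database or file format holds it.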
Second we have:
Paul Ramadge, San Diego
January 16, 2008 - 11:09AM
World-best collaborative research between Australian and United States universities has taken a giant leap forward with the successful launch today of a 1 gigabit per second data connection between the two countries.
The ultrabroadband optical-fibre link - roughly 250 times faster than the standard broadband connection offered in metropolitan Melbourne - was demonstrated at the University of California San Diego and at the University of Melbourne today.
Using large visual-display walls of high-definition screens in both cities, still images, audio, animations and video from Australian research conducted by neuroscientist Professor Graeme Jackson and water researcher Professor John Langford were presented in both cities at the same time.
Participants in San Diego were able to question Professor Langford and Professor Jackson in real time - as if they were in the same room.
The potential applications that will flow from the new technology are immense - from research into the brain using scans that can be shown at the cellular level through to drug discoveries and collaboration on high-end climate change research.
Excited researchers are already talking about sharing data from MRIs, synchrotrons, supercomputers and telescopes to interpret a range of complex data - previously beyond the reach of those in Australia.
The high-speed connection - the power of which will not be lost on those in the Australian community begging for next-generation broadband services - is a joint initiative of the Australian American Leadership Dialogue, the University of Melbourne, the California Institute for Telecommunications and Information Technology at UCSD and the University of California Irvine, the Victorian Government and Australia's Research and Education Network (AARNet).
Continue reading here:
This is an interesting report showing just part of the potential of really fast Internet connectivity. Clearly in the future this sort of connectivity will mean the need to travel around the world for expert clinical advice will slowly become a thing of the past – among a zillion other possible applications.
Third we have:
15 January 2008 04:12 PM
Australian citizens will be assigned a unique identifying number to help healthcare providers protect their patients from accidentally being given the wrong treatment.
Australians' Medicare records will be accessed to create the "Unique Health Identifiers" (UHI), under an initiative announced by minister for Health, Joe Ludwig.
While Medicare will be responsible for the design, building and testing of the UHI system, Australia's National E-Health Transition Authority (NEHTA) will coordinate the project to collect information needed to develop the identifiers, as well as develop requirements for an identity management system.
The system is meant to resolve the limitations of current identifiers -- name, sex, address and date of birth -- which have led in some instances to the wrong test results being applied to a patient, according to an earlier NEHTA report.
At present, medical service providers such as community GP clinics, pharmacies, private and public hospitals have diverse methods and systems to identify individuals, which can potentially lead to the mis-allocation of tests and treatment. Likewise, medical provider information is often stored on disparate systems.
Continue reading here:
and we also have this
The Queensland Branch of the Australian Medical Association (AMA) says a new electronic healthcare identification service could save doctors hundreds of hours which are normally wasted writing prescriptions.
The Federal Government has signed a contract to develop and test the national scheme, which would electronically identify a person's name, date of birth, address and the names of their healthcare providers.
AMA Queensland spokesman Dr Wayne Herdy says the system would improve efficiency and guarantee correct information is transferred between private practice and hospitals.
"We spend a lot of time writing prescriptions and sending prescriptions to pharmacies and writing them out by hand, or having to sign pieces of paper," he said.
Continue reading here:
I have no idea just what those who are briefing these journalists are smoking, but to attribute reduced prescription error rates and saved time in prescribing to having a patient identifier is really stretching it. It is the applications – yet to be developed and deployed – that will use the identifier that may help, not the fact of the identifier. More lives would be saved by having quality GP systems with good, up-to-date decision support than are likely to be saved by the identifier alone. It is simply a piece of IT infrastructure which should have been in place a decade ago.
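To make concrete the problem the identifier does solve, here is a toy sketch of why demographic matching is fragile. Every record, name and identifier format below is invented; the real UHI design will of course differ.

```python
# Invented patient records - two genuinely different people who happen
# to share the same name, date of birth and sex.
records = [
    {"uhi": "UHI-0001", "name": "John Smith", "dob": "1970-05-01", "sex": "M"},
    {"uhi": "UHI-0002", "name": "John Smith", "dob": "1970-05-01", "sex": "M"},
    {"uhi": "UHI-0003", "name": "Jane Doe",   "dob": "1982-11-23", "sex": "F"},
]

def find_by_demographics(name, dob, sex):
    """Demographic lookup can return several candidates - the ambiguity
    that leads to results being filed against the wrong patient."""
    return [r for r in records
            if (r["name"], r["dob"], r["sex"]) == (name, dob, sex)]

def find_by_uhi(uhi):
    """A unique identifier resolves to at most one record."""
    matches = [r for r in records if r["uhi"] == uhi]
    return matches[0] if matches else None

print(len(find_by_demographics("John Smith", "1970-05-01", "M")))
# prints 2 - two candidates, and no way to tell which is your patient
```

The identifier removes that ambiguity, but it takes applications built on top of it to turn that into fewer prescribing errors.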
Fourthly we have:
Standardisation of systems needed to cut costs, says Commission
Matt Chapman, vnunet.com 10 Jan 2008
The European Commission has criticised the European health sector for lagging behind when it comes to technology.
The Commission's Lead Market Initiative report said that a gap had been created because investment had been ploughed into other areas and not into e-health.
"Healthcare has fallen progressively behind other service sectors over the past 25 years in terms of relative levels of ICT investment," the report said.
"European citizens would greatly benefit from cost reductions, coupled with better efficiency of the healthcare systems through the wider development of e-health," the report said.
The study also claimed that improvements to health technology would see systems used as "tools" for health authorities and "personalised health systems" for patients.
Health costs in the EU currently run at nine per cent of gross domestic product, but the report expects this to rise to 16 per cent by 2020 thanks to ageing populations.
Read the full article here.
The expected rise in healthcare costs in the EU by 2020 is a little alarming!
Fifthly we have:
Report lauds VA's focus on quality care, health IT
The Veterans Affairs Department has improved its quality of health care through management initiatives and use of health information technology, the Congressional Budget Office said in an interim report. VA's accomplishments come during a period of increased demand for its services from soldiers returning from Afghanistan and Iraq.
VA has restructured efforts to permit more shared decision-making among its central office, regional managers and facility directors; measure performance, process and outcomes; and use health IT systemwide.
The department's integrated structure and appropriated funding may have helped it focus on providing the best quality care for a given amount of money compared with fee-for-service incentives toward billable services and procedures, CBO said in the Jan. 9 report.
The improvement in VA's health care quality has been documented in a number of independent studies, including by the Institute of Medicine. VA will provide care to more than 5.8 million veterans this year in its 153 hospitals and nearly 900 clinics.
VA tracks the quality of its care using indicators such as adherence to clinical guidelines and standards that have been shown to improve outcomes, waiting times for access to services and customer satisfaction. This year, VA plans to adopt more industrywide quality measures, such as those in the Healthcare Effectiveness Data and Information Set, to boost comparability with other providers, CBO said.
Continue reading here:
The report can be found by clicking the following link
Confirming this finding is research undertaken for the Welsh Health Department when reviewing the progress of the Welsh Health IT Strategy. To quote:
“The proven experience from the Veterans Administration and Kaiser Permanente, as well as others such as Andalucia in southern Spain, clearly demonstrates that the Electronic Health Record is not only very useful, it is a necessity if improved clinical outcomes and patient safety are to be achieved. We should sit up and take notice when an organization as large as the VA is able to show that: a) their cost of care per patient day has stayed the same for over 10 years while it has risen by 40% for everyone else and, b) that they are at the top of the table for all the quality health indicators currently being used.
In the past 10 years, the VA has increased the number of patients treated by 34%, decreased staffing by 15%, and opened over 300 community-based, patient-centred primary health care clinics -- with no increase in budget! But it came at a price; the benefits that information technology generated for the VA only came when clinical workflows and processes were changed and optimised. This often meant bringing down boundary barriers and changing rules and regulations. The Dutch approach to this phenomenon is intriguing: stimulate – facilitate – obligate.”
Source: “An assessment of Informing Healthcare in Wales – International Advisory Group -September 2007”
This is important! If ever there was proof of Health IT and decent management making a difference in the real world for the better, this is it. Pity our politicians are yet to get it.
Lastly we have:
Making the Rounds With Robo Doc
Tuesday, January 15, 2008; HE02
A white-coated mobile robot may seem like something out of a sci-fi movie, but one of these gizmos may one day ask how you're feeling and listen to your reply. Some physicians -- like Joseph Patelin, of Shawnee Mission, Kan., whose face is shown above -- are using monitor-mounted robots to check on patients.
A study in last month's Archives of Surgery found that robo docs "matched the performance" of the flesh-and-blood variety with 270 urology patients. Compared to traditional bedside checkups, robot rounds didn't increase complications after surgery, lengthen hospital stays or prompt more patient complaints.
Continue reading here:
Interesting study – similar results have been found with remote supervision of ICU patients – and I love the robot dressed up in a white coat as shown in the picture.
More next week.