Despite the onset of the Trump Administration, all sorts of advanced and exciting material was published.
From Nature we got:
How bioinformatics tools are bringing genetic analysis to the masses
Computational biologists are starting to develop user-friendly platforms for analysing and interpreting genetic-sequence data.
28 February 2017
For doctors trying to treat people who have symptoms that have no clear cause, gene-sequencing technologies might help in pointing them to a diagnosis. But the vast amount of data generated can make it hard to get to the answer quickly.
Until a couple of years ago, doctors at US Naval Medical Research Unit-6 (NAMRU-6) in Lima had to send their sequence data to the United States for analysis, a process that could take weeks — much too long to make pressing decisions about treatment. “If all you could do was get the data that you then have to ship to the US, it's almost useless,” says Mariana Leguia, who heads the centre's genomics and pathogen-discovery unit.
But Leguia no longer has to wait for the analyses; she can get results in days or even hours — and she can do them in her own lab. Her unit makes use of EDGE (Empowering the Development of Genomics Expertise), a bioinformatics tool that hides common microbial-genomics tasks, such as sequence assembly and species identification, behind a slick interface that allows users to generate polished analyses. “We can have actionable information on site that allows us to make decisions very quickly about how to go forward,” Leguia says.
More here:
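For those wondering what these platforms actually automate, here is a toy sketch of the simplest possible species-identification step: matching k-mers from sequencing reads against a couple of made-up reference signatures. This is purely illustrative, hypothetical Python, and in no way EDGE's actual code.

# Purely illustrative, hypothetical code -- not EDGE's implementation.
# Naive species identification: score tiny reference "signatures" by how
# many k-mers they share with the sequencing reads.
from collections import Counter

def kmers(seq, k=21):
    # All overlapping k-mers of a DNA sequence.
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def identify_species(reads, references, k=21):
    # Score each reference by the k-mers it shares with the reads.
    scores = Counter()
    for read in reads:
        read_kmers = kmers(read, k)
        for species, ref_seq in references.items():
            scores[species] += len(read_kmers & kmers(ref_seq, k))
    return scores.most_common()

# Toy inputs; a real pipeline would stream millions of reads from FASTQ files.
references = {
    "Toy virus A": "ATGAAAAACCCAAAAAAGAAATCCGGAGGATTCCGGATTGTCAATATGCTAAAACGCGGA",
    "Toy virus B": "ATGAATAACCAACGGAAAAAGGCGAAAAACACGCCTTTCAATATGCTGAAACGCGAGAGA",
}
reads = ["CCGGAGGATTCCGGATTGTCAATATGCTAAAACGCGGA"]
print(identify_species(reads, references))

The point of tools like EDGE is that users never have to write or see code like this; the assembly and identification steps sit behind the interface.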
From Digital Health in the UK we got:
How will artificial intelligence change radiology?
Artificial intelligence and cognitive computing are being heralded as the brave new frontier of clinical IT. Kim Thomas reports on how they are already beginning to reshape radiology imaging and diagnostics.
Jon Hoeksma
Editor, Digital Health
IBM chose December’s annual meeting of the Radiological Society of North America to showcase the ability of its Watson supercomputer to rapidly analyse medical images and suggest a diagnosis. Mark Griffiths, a clinical radiologist at University Hospital Southampton NHS Foundation Trust, who attended RSNA, says he saw some “stunning demonstrations” of the technology, including chest X-rays being “reported in milliseconds.”
Watson is an example of a technology that IBM refers to as “cognitive computing”. Using a form of artificial intelligence known as natural language processing, Watson, a cloud-based system, is able to analyse vast stores of scholarly articles, patient records and medical images. (When IBM acquired Merge Healthcare in 2015, it gained access to the company’s database of 30 billion images.) This ability to interpret written language is what marks Watson out as different from other computer-based tools used to aid diagnosis.
Not enough radiologists to meet demand
In England, the volume of radiology images taken has increased at the rate of 3.6% a year for 20 years, and there are not enough radiologists to meet demand. Could Watson – and other AI tools – provide a solution to the problem of overstretched radiology departments? And – as some fear – could it replace radiologists altogether?
More here:
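That 3.6% a year figure is worth pausing on: compounded over 20 years it roughly doubles the imaging workload, as a quick back-of-the-envelope calculation shows (illustrative arithmetic only).

# Illustrative arithmetic only: compounding the reported 3.6% annual growth
# in radiology imaging volume over 20 years roughly doubles the workload.
annual_growth = 0.036
years = 20
multiplier = (1 + annual_growth) ** years
print(f"Volume multiplier after {years} years: {multiplier:.2f}x")  # about 2.03x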
From Healthcare IT News we got:
Health Catalyst, Regenstrief partner to commercialize natural language processing technology
The companies said they intend to put the artificial intelligence-powered text analytics technology to work accelerating advances in patient care.
February 27, 2017 01:10 PM
Regenstrief Institute CEO Peter Embi, MD, said the deal will help patients benefit from unstructured data.
Health Catalyst and the Regenstrief Institute are working together to commercialize nDepth, Regenstrief’s natural language processing technology.
nDepth is an acronym for NLP Data Extraction Providing Targeted Healthcare. Indianapolis-based Regenstrief developed the technology to harness unstructured data.
Salt Lake City-based Health Catalyst, a data warehousing and analytics company, has been in the business of extracting data to boost care quality since it launched in 2008.
Regenstrief’s nDepth is artificial intelligence-powered text analytics technology. It was developed within the Indiana Health Information Exchange, the largest and oldest HIE in the country.
Regenstrief fine-tuned nDepth through extensive and repeated use, searching more than 230 million text records from more than 17 million patients.
Lots more here:
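To give a feel for what "harnessing unstructured data" means in practice, here is a toy, rule-based sketch in Python that pulls a simple smoking-status concept out of free-text notes. It is purely illustrative and hypothetical; it is not nDepth's implementation.

# Purely illustrative, hypothetical code -- not nDepth's implementation.
# A rule-based pass over free-text clinical notes that extracts a simple
# smoking-status concept, the kind of target a clinical NLP tool handles.
import re

PATTERNS = {
    "never/denies":   re.compile(r"never\s+smoked|denies\s+smoking|non-?smoker", re.I),
    "former smoker":  re.compile(r"former\s+smoker|quit\s+smoking", re.I),
    "current smoker": re.compile(r"current(ly)?\s+smok(er|es|ing)", re.I),
}

def smoking_status(note):
    # Return the first smoking-status concept matched in the note;
    # negation patterns are checked first to avoid false positives.
    for label, pattern in PATTERNS.items():
        if pattern.search(note):
            return label
    return "unknown"

notes = [
    "Patient denies smoking and reports no alcohol use.",
    "55 y/o male, currently smoking one pack per day for 30 years.",
    "Former smoker, quit smoking in 2010.",
]
for note in notes:
    print(smoking_status(note), "-", note)

Real clinical NLP has to cope with negation, abbreviations and local context across hundreds of millions of records, which is exactly the scale the article describes.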
And from Health Affairs we got:
The Future Of Precision Medicine: Great Promise, Significant Challenges
February 28, 2017
Editor’s note: This post is part of a series stemming from the Fifth Annual Health Law Year in P/Review event held at Harvard Law School on Monday, January 23rd, 2017. The conference brought together leading experts to review major developments in health law over the previous year, and preview what is to come.
In his 2015 State of the Union address, President Obama launched the Precision Medicine Initiative (PMI), which is intended to help move medicine from the traditional “one-size-fits-all” approach where treatments are designed for the “average” patient, to one that “takes into account individual differences in people’s genes, environments, and lifestyles,” thereby personalizing treatment. According to the White House, a major goal is to “bring us closer to curing diseases like cancer and diabetes.” In December 2016, the 21st Century Cures Act was signed into law, authorizing up to $1.455 billion in funding for the initiative, spread over 10 years (although, importantly, the statute does not guarantee any of the funds, which will be subject to budget negotiations each year).
Central to the PMI is the All of Us Research Program (renamed in October 2016 from the “Precision Medicine Initiative Cohort Program”), which aims to enroll 1 million or more volunteers throughout the United States. If successful, it would be one of the largest longitudinal cohorts ever developed in this country. The All of Us program will, among other things, seek to ascertain the relationships between various environmental exposures, genetic factors, and other biologic determinants of disease. Volunteers will contribute data in various ways, including donating blood samples, completing baseline physical exams and online health surveys, and sharing both existing electronic health records and mobile health data. Enrollment will be possible either directly via smart phone applications developed by a Participant Technologies Center, or through participating medical centers, including community health clinics and medical centers operated by the US Department of Veterans Affairs. A Data and Research Center will acquire, organize, and provide secure access to volunteers’ health data, and the Mayo Clinic will manage a biobank of volunteers’ biological specimens to support research efforts.
Lots more here:
All I can say is that we are all going to have to work a whole lot harder to keep up!
David.
1 comment:
This really is the beginning of an interesting new chapter in medicine, and one that information technologies have helped shape; the two are made for each other. The MyHR was a good evolutionary step in eHealth, but like many other evolutionary steps it should be brought to an end, and new purposes defined around these emerging challenges, so that important information is captured, stored and shared with machine processing as the core purpose underpinning new design work. Else, as you say David, we will be left behind.
Turning your back on national and international standards efforts is not a step in the right direction, nor is simply attending or speaking in forums and rubbing shoulders with actual thought leaders. The hard craft starts in the standards world, where common sets of agreements can form the basis of a standardised underpinning for solutions... etc.