Wednesday, October 22, 2014

Eliminating unnecessary imaging from cardiac stress testing could cut costs

A new report finds that radiological imaging in cardiac stress testing is overused; cutting back could reduce health costs and future cancer cases.

by John Tyburski
Copyright © Daily Digest News, KPR Media, LLC. All rights reserved.

AUTHOR’S NOTE: The authors of the published research report contacted me directly to request news coverage.


Technology has advanced modern medicine, but at a cost, both to health care expenditures and to patients’ long-term health. For example, imaging techniques that use radiation to provide glimpses of internal structures offer minimally invasive ways of finding health problems. However, they also expose patients to small doses of ionizing radiation, including the kind used in X-rays and CT scans, which, in rare cases, can cause cancer down the road.

A team of researchers led by assistant professor Joseph Ladapo of New York University’s Departments of Medicine and Population Health examined whether the trade-offs specifically associated with cardiac stress testing were being adequately considered in decisions to include imaging. They found, in fact, that imaging may be overused, creating unnecessary health care spending and increased risk of cancer in those tested. The results of their study were published in the October 8 issue of the Annals of Internal Medicine.

“The key finding of our study is that cardiac stress testing with imaging has grown briskly over the past two decades in the US, and that about 1 million stress tests with imaging are probably inappropriate,” states Ladapo. “These inappropriate tests cost us about half a billion dollars in healthcare costs annually and lead to about 500 people developing cancer in their lifetime because of radiation they received during that cardiac stress test.”

Cardiac stress tests assess how much stress a patient’s heart can tolerate before it shows signs of strain, such as an abnormal rhythm or inadequate blood flow to the heart muscle itself. Tests with imaging include the injection of a radioactive tracer into the blood. The tracer emits small amounts of radiation that are then imaged in order to identify areas of compromised blood flow.

Ladapo and colleagues used data on patients with no coronary heart disease who were referred for cardiac stress testing, gathered from the National Ambulatory Medical Care Survey and the National Hospital Ambulatory Medical Care Survey. They compared the number of stress tests ordered with and without imaging in the period 1993 to 1995 with the number with and without imaging in the period 2008 to 2010.

The researchers found that the annual number of cardiac stress tests ordered in the U.S. increased by more than 50 percent between 1993–1995 and 2008–2010, an increase they attribute to changes in population and provider characteristics. Cardiac stress tests with imaging made up 59 percent of the total in 1993–1995 compared with 87 percent in 2008–2010. By applying the most recent appropriate use criteria, the researchers determined that 34.6 percent of the tests done with imaging in 2008–2010 did not warrant imaging.

Using published methods, the researchers estimate that this overuse of imaging in the period 2008–2010 cost $501 million. They predict, again based on published methods, that 491 future cancer cases will be attributable to inappropriate imaging with cardiac stress testing done in this period.
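
For a sense of scale, the reported figures can be put in per-test terms with some rough arithmetic. The short sketch below is purely illustrative; it uses only the numbers quoted in this article and is not the authors’ published estimation method.

    # Illustrative back-of-envelope check of the figures reported above.
    # The inputs come from the article; the per-test calculations are rough
    # arithmetic, not the published methods of Ladapo and colleagues.

    inappropriate_tests = 1_000_000     # ~1 million inappropriate imaging tests per year
    annual_cost = 501_000_000           # $501 million attributed to that overuse
    future_cancers = 491                # projected lifetime cancer cases from that imaging
    inappropriate_share = 0.346         # 34.6% of imaging tests deemed inappropriate

    cost_per_test = annual_cost / inappropriate_tests                # roughly $500 per unnecessary test
    cancer_odds = inappropriate_tests / future_cancers               # roughly 1 case per 2,000 tests
    total_imaging_tests = inappropriate_tests / inappropriate_share  # implies ~2.9 million imaging tests per year

    print(f"Cost per inappropriate test: about ${cost_per_test:,.0f}")
    print(f"Lifetime cancer risk: about 1 in {cancer_odds:,.0f} inappropriate tests")
    print(f"Implied total imaging tests: about {total_imaging_tests:,.0f} per year")

In other words, by the article’s own figures, each unnecessary test costs on the order of $500 and carries roughly a 1-in-2,000 chance of an eventual cancer case.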

“Cardiac stress testing is an important clinical tool,” says Dr. Ladapo, “but we are overusing imaging for reasons unrelated to clinical need.”

The report comes on the heels of a recent scientific statement by the American Heart Association on improving informed decision-making involving both doctors and patients when it comes to radiation-based imaging. Ladapo and his colleagues also looked for, but found no evidence of, patient ethnicity playing any significant role in whether a cardiac stress test with imaging is ordered.

Spacecraft ‘Rosetta’ to make historic landing on comet



The European Space Agency’s Rosetta spacecraft has been cleared to send a landing craft to a predetermined spot on comet 67P next month.

by John Tyburski
Copyright © Daily Digest News, KPR Media, LLC. All rights reserved.


It is comet-landing time. Officials at the European Space Agency have approved a plan for the Rosetta comet chaser to deploy a special lander on comet 67P/Churyumov-Gerasimenko, an object that Rosetta has been stalking for the past two months.

Comets are thought to be ancient remnants of the formation of our solar system, and Tuesday’s decision sets the stage for a major milestone in space exploration. The lander, called Philae, will descend to a specific location on the comet called Site J, on the smaller of the two obvious lobes of the 2.5-mile-wide comet. The comet’s two-lobed shape has earned it the colloquial nickname “rubber duck,” and the landing site is on the “head.”

“Now that we know where we are definitely aiming for, we are an important step closer to carrying out this exciting — but high-risk — operation,” Fred Jansen, Rosetta mission manager, said in a statement.

The mission team brought Rosetta to within 62 miles of the comet on August 6 and has since steadily moved it in to within six miles. Under the landing plan, Rosetta will be brought back out to about 14 miles from the comet before Philae is released. On its way in, Philae will capture moment-by-moment images and sample the dust, gas, and plasma it passes through. The lander will take about seven hours to descend from Rosetta to the comet.

Once on the surface, Philae will anchor itself and perform tests with its 10 solar-powered instruments until about March 2015, when it is expected to become too hot from the sun to keep operating. Rosetta will continue to shadow the comet as it approaches and passes the sun in August 2015 on its way back out to the outer solar system.

Throw out your compass: Earth’s magnetic poles to flip sometime this century



[The title was written by my editor.]

Scientists believe that Earth’s magnetic field has inverted many times and is possibly due for another flip, perhaps even within a human lifetime.

by John Tyburski
Copyright © Daily Digest News, KPR Media, LLC. All rights reserved.


Although perhaps difficult to comprehend, the Earth’s magnetic field may soon reverse its polarity. It has, after all, done so many times in the planet’s history. So says an international team of geologists after examining evidence of a past reversal and indications that the planet’s magnetic field today is growing weaker.

The researchers, who are based in Italy, France, and here in the U.S. at Columbia University and the University of California, Berkeley, found evidence that the most recent planetary magnetic pole reversal occurred approximately 786,000 years ago and happened over a period of less than 100 years.

“It’s amazing how rapidly we see that reversal,” said UC Berkeley graduate student Courtney Sprain in a statement. “The paleomagnetic data are very well done. This is one of the best records we have so far of what happens during a reversal and how quickly these reversals can happen.”

Recent observations suggest that the intensity of Earth’s magnetic field is decreasing an order of magnitude faster than it has in the past. As a result, some geophysicists predict that a field pole reversal could occur within the next few thousand years.

The planet’s magnetic field may reverse its polarity as a result of convection in the iron core. To date, no evidence has been found of any major catastrophe resulting from the last field flip. However, an inversion today could knock out electrical grids through unusual current surges. The Earth’s magnetic field also shields life on its surface from solar and cosmic radiation, so a disruption of the field before and during an inversion could have serious adverse effects on the biosphere.

“We should be thinking more about what the biologic effects would be,” remarked Paul Renne, co-author on the report and director of the Berkeley Geochronology Center.

The researchers describe their findings in a report published in the latest issue of the Geophysical Journal International.

Icebergs in Florida? A discovery in the Atlantic may unravel our explanation of climate change



Scarring of the ocean floor off the Atlantic coast indicates that enormous icebergs once reached as far south as Florida.

by John Tyburski
Copyright © Daily Digest News, KPR Media, LLC. All rights reserved.


How about something really cool? A new report documents evidence that gigantic ancient icebergs once traveled down the Atlantic coast as far south as Florida. The ocean floor off the coasts of South Carolina, Georgia, and Florida is scarred with V-shaped trenches cut by huge icebergs carried south thousands of years ago by staggering glacial flooding from the north.

Approximately 20,000 years ago, giant freshwater lakes covered what is now Canada, their frigid waters held in by ice dams. Lakes the size of the Caspian Sea sometimes broke through their dams and flooded southward into the Atlantic Ocean, carrying with them fleets of jagged icebergs. The trenches cut along the Atlantic coast suggest the icebergs came as far south as Florida.

This new discovery, published this week in the journal Nature Geoscience, will likely change the way scientists understand global warming, ice cap melting, and ocean currents.

“The mechanisms of climate change and ocean currents are more complex than we previously thought,” study author Jenna Hill, a marine scientist at Coastal Carolina University, said in a statement. “It helps us understand how future ice sheet melt from Greenland may affect global climate.”

Co-author Alan Condron said that the ocean floor scours indicate the icebergs may have been up to 1,000 feet thick, a size found today only off the coast of Greenland.

“Previous research would have suggested the meltwater would have gone much further north, so people weren’t expecting the subtropics to become fresher,” said Condron, an oceanographer at the University of Massachusetts, Amherst. “This actually has enormous implications for that model and for what triggers climate change.”

“This new research shows that much of the meltwater from the Greenland ice sheet may be redistributed by narrow coastal currents and circulate through subtropical regions prior to reaching the subpolar ocean,” Condron added.

Quantum computing: the biggest technological breakthrough you didn’t know existed



[The title was written by my editor.]

Australian scientists recently reported the first successful attempt to hold data on a silicon quantum chip with over 99 percent accuracy.

by John Tyburski
Copyright © Daily Digest News, KPR Media, LLC. All rights reserved.


The world of computing has long held guarded optimism about one day achieving quantum computing capability. Recently, two teams of Australian scientists propelled that dream a great leap forward by developing the first silicon-based qubits, or quantum bits, with the accuracy needed for a working quantum computer. Qubits are the basic units of information in quantum computing. The accomplishments are described in two recent publications in the journal Nature Nanotechnology.

“We have demonstrated that with a silicon qubit we can have the accuracy needed to build a real quantum computer,” said Andrew Dzurak of the University of New South Wales, a co-author on both papers. “That’s the first time this has been done in silicon.”

Previous work using phosphorus atoms as qubits embedded in silicon achieved only 50 percent accuracy. One of the new approaches built on those findings by embedding the phosphorus qubits in a purified form of silicon.

“In natural silicon each atom also has its own spin which affects the phosphorus atom, which is why the accuracy was only 50 per cent,” said Dzurak. “We solved the problem by removing all the silicon-29 isotopes that have magnetic spin, leaving only silicon-28, which has no magnetic spin to influence the phosphorus, giving us an accuracy of 99.99 per cent.”
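
To see why the jump in accuracy matters, consider a simple illustration of my own (it is not taken from the published papers): if each qubit operation succeeds with probability p and errors are assumed to be independent, then a run of n operations succeeds with probability roughly p raised to the nth power, so small per-operation error rates compound quickly.

    # Purely illustrative: how per-operation accuracy compounds over many
    # operations under a simple independent-error assumption. This model is
    # an illustration only, not a calculation from the Nature Nanotechnology papers.

    def chain_success(p, n):
        """Probability that n consecutive operations all succeed at accuracy p."""
        return p ** n

    for accuracy in (0.50, 0.996, 0.9999):
        print(f"Per-operation accuracy {accuracy:.2%}:")
        for n in (10, 100, 1000):
            print(f"  chance that {n} operations all succeed: {chain_success(accuracy, n):.4f}")

Under this over-simplified model, a 50 percent qubit is unreliable after only a handful of operations, while accuracies above 99 percent leave room for the hundreds or thousands of operations a useful calculation requires.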

The second method involved turning a silicon transistor into an “artificial atom” qubit, an approach whose data-storage accuracy reached 99.6 percent. Transistors produce binary code, zeros and ones, by switching the flow of electrons through an electronic gate on or off.

“What we’ve done is make a silicon transistor with just one electron trapped in that transistor,” explained Dzurak. “This lets us use exactly the same sort of transistor that we use in computer chips and operate it as a qubit, opening the potential to mass-produce this technology using the same sort of equipment used for chip manufacturing.”

With the silicon chips, the two teams of scientists also improved coherence time, the length of time over which a quantum system retains data.

“In solid-state systems these times are typically measured in nano or micro seconds before the information gets lost,” said Dzurak. “We are getting a 30-second coherence time, which on the time-scale of doing calculations is an eternity.”

Dzurak added that silicon is preferred in the industry because it offers long coherence times while retaining accuracy.

“We can go better, these are only our first experiments,” says Dzurak.