iKnife is Capable of Sniffing Cancer During Surgery (Gizmag.com)

Gizmag.com has posted an article about a truly groundbreaking new surgical knife capable of sniffing tissue and determining whether it is cancerous or benign. This breakthrough is amazing and offers enormous potential in other fields as well.

******************************

The iKnife has been used in tests in 91 operations, where it showed 100 percent accuracy when compared to conventional tests.

Dr. Zoltan Takats of Imperial College London has developed one very sharp knife – and we’re not referring to its keen edge. The Intelligent Knife (iKnife) is equipped with a nose and a brain that can sniff out cancer as it cuts. Using a mass spectrometer to detect chemical profiles associated with tumors, it enables instant identification of cancerous tissue and helps surgeons to make sure that all of a tumor has been removed.

Cancer is obviously something you want to catch early and get rid of completely at the first opportunity. Removing tumors is the simplest and often the least harmful treatment, but surgeons need to ensure they’ve removed all of the cancerous tissue to prevent the disease from reestablishing itself.

Unfortunately, cancerous tissue isn’t always identifiable by sight, so laboratory tests are needed. During an operation, this means leaving the patient waiting under anesthetic while the tests are run. Even then, the results aren’t always reliable. According to Imperial College, one in five breast cancer patients must undergo surgery a second time.

The iKnife uses electrosurgery, a common technique developed in the 1920s to reduce bleeding in particularly bloody operations, such as liver resection. An electric current is applied to the knife, heating tissue so quickly and at such a temperature that the knife cuts through and cauterizes it to prevent bleeding. Not surprisingly, this produces a cloud of unpleasant smoke, which is sucked away.

However, this cloud also contains all sorts of useful information about the tissue being burned through, so Takats hit on the idea of hooking an electrosurgical knife to a mass spectrometer, which would analyze the smoke and produce a profile of the chemicals that make it up. Some of these chemicals or their combinations are indicative of cancerous tissue.

Once the prototype iKnife was constructed, the next step was to teach it what to look for. This involved using the device to burn tissue samples collected from 302 surgery patients and building up a library of profiles of thousands of cancerous and noncancerous tissues from various organs of the body. As the iKnife cuts through tissue, it matches what it “smells” against this library and alerts the surgeon to what it finds in about three seconds. This is a considerable improvement over the half hour needed for conventional laboratory tests.
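
The matching step described above can be pictured as a nearest-profile lookup: compare the live smoke spectrum against each stored reference and report the closest match. The sketch below is purely illustrative and is not the iKnife’s actual algorithm – the real system uses thousands of profiles and far richer statistics, and all names and numbers here are invented.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two spectral intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def classify_smoke(spectrum, library):
    """Return the label of the library profile most similar to `spectrum`.

    `library` maps labels (e.g. "tumor") to reference intensity vectors
    built up from training samples.
    """
    best_label, best_score = None, -1.0
    for label, reference in library.items():
        score = cosine_similarity(spectrum, reference)
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score

# Toy library: intensity vectors over four hypothetical mass-to-charge bins.
library = {
    "healthy": [0.9, 0.1, 0.2, 0.05],
    "tumor":   [0.2, 0.8, 0.6, 0.3],
}
label, score = classify_smoke([0.25, 0.7, 0.5, 0.2], library)
```

A lookup like this is cheap enough to run in well under the three seconds quoted for the real device; the hard part in practice is building a reliable reference library, which is why 302 patients’ worth of samples were needed.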

According to Imperial College, the next step will be clinical trials where the surgeons will be allowed to see the results in real time instead of after the operation, as was the case in the tests.

“These results provide compelling evidence that the iKnife can be applied in a wide range of cancer surgery procedures,” Dr. Takats says. “It provides a result almost instantly, allowing surgeons to carry out procedures with a level of accuracy that hasn’t been possible before. We believe it has the potential to reduce tumor recurrence rates and enable more patients to survive.”

Takats sees the iKnife as having broader applications beyond cancer surgery. Mass spectrometry is a rather general tool, and Takats says that it could be used to identify tissues with inadequate blood supply or the presence of certain bacteria, and might even be of use to the local butcher in telling beef from horsemeat.

The results of the iKnife project were published in Science Translational Medicine.


911 App Uses Smartphones to Virtually Place Dispatchers at Scene of Emergencies (MDDIonline.com)

Having worked in the telecom industry since 2000, I find it great to see practical, helpful, possibly life-saving applications available for use on smartphones. As NextGen E911 is being deployed nationally, albeit slowly in many cases, expect to see texting, picture messaging, and more diagnostic uses for smartphones in emergencies.

Would you feel calm and collected enough during an emergency to start up an app and try to use it to save someone’s life?

MDDI Online Article Here

******************************


The Android app enables 911 dispatchers to gather data such as blood pressure, heart rate, and breathing rate via a caller’s smartphone.

A team of researchers has developed a mobile medical application that harnesses smartphones to virtually place 911 dispatchers at the scene of emergency situations.

The app, developed by a team led by University of North Texas engineering professor Ram Dantu with support from the National Science Foundation’s Directorate for Computer and Information Science and Engineering, enables 911 dispatchers to remotely control the smartphone of a 911 caller at the scene, letting the dispatcher see video of the scene and collect vital information about the victim.

During emergency calls, 911 dispatchers ask callers basic questions to help them assess the situation, but callers don’t always know the answers.

“When a 911 operator asks the question, ‘Is the patient breathing?’ callers often have no idea,” Dantu said during a virtual press conference today.

A smartphone placed on a victim’s torso allows the emergency operator to view the victim’s breaths per minute. This allows the operator to gauge whether the caller should start CPR. Photo credit: Logan Widick, University of North Texas

The app his team created is intended to solve that problem. Using the software, a caller at the scene can place a smartphone on the victim’s chest to monitor their breathing rate and place the victim’s finger on the smartphone’s camera to check their heart rate. The app can also monitor the victim’s blood pressure without a cuff. All information captured is transmitted wirelessly to 911 dispatchers.

At the press conference, the research team also demonstrated the app’s CPR assistance feature. A 911 caller at the scene can strap a smartphone to their hands using a piece of clothing or a plastic bag, for example, to get instruction on how to perform CPR. The app can also provide real-time feedback—urging the caller to increase the speed or depth of compressions, for example.
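
The real-time compression feedback described above can be reduced to a simple rule: measure the interval between detected compressions and compare the implied rate against the commonly cited guideline of roughly 100–120 compressions per minute. A hedged sketch, assuming some upstream accelerometer code has already produced compression timestamps (the detection step, and the app’s actual logic, are not shown in the article):

```python
def compression_feedback(compression_times):
    """Given timestamps (in seconds) of detected chest compressions,
    return simple rate feedback against the ~100-120/min guideline."""
    if len(compression_times) < 2:
        return "keep going"
    intervals = [t2 - t1
                 for t1, t2 in zip(compression_times, compression_times[1:])]
    rate_per_min = 60.0 / (sum(intervals) / len(intervals))
    if rate_per_min < 100:
        return "push faster"
    if rate_per_min > 120:
        return "push slower"
    return "good rate"

# Compressions every 0.55 s is about 109 per minute: within guidelines.
feedback = compression_feedback([0.55 * i for i in range(10)])
```

Depth feedback would work similarly, by double-integrating the accelerometer signal over each compression, though that is considerably noisier in practice.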

The app also features text-to-speech technology, which can help in situations where a 911 caller doesn’t speak English or is hearing or speech impaired.

Henning Schulzrinne, of the Federal Communications Commission, said the app is one example of technology that can interface with the new Next Generation 911 systems being rolled out across the country. These IP-based systems replace the voice-only 911 systems used in the past and can incorporate new sources of information, such as text messages, images, video, and data.

The app has been tested by 40–50 individuals in a lab setting, and the researchers hope to launch a pilot in a hospital or nursing home environment soon, Dantu said. He said the app will require FDA approval, and the team’s next steps include talking with vendors of emergency dispatch protocols to learn how to integrate the app with their systems. It was initially developed for the Android platform, but the researchers also plan to launch a version that can run on Apple’s iOS. They hope to have a version of the app available for download in 2–3 months.

Contact Lens Computer: Like Google Glass, Without Glasses (TechnologyReview.com)

Bottom Line: Soft contact lenses could display information to the wearer and provide continuous medical monitoring.

******************************

WHY IT MATTERS

A computer embedded into a contact lens could make for the ultimate heads-up display.

(We’ve made contact: Researchers embedded a light-emitting diode into this contact lens.)

For those who find Google Glass indiscreet, electronic contact lenses that outfit the user’s cornea with a display may one day provide an alternative. Built by researchers at several institutions, including two research arms of Samsung, the lenses use new nanomaterials to solve some of the problems that have made contact-lens displays less than practical.

A group led by Jang-Ung Park, a chemical engineer at the Ulsan National Institute of Science and Technology, mounted a light-emitting diode on an off-the-shelf soft contact lens, using a material the researchers developed: a transparent, highly conductive, and stretchy mix of graphene and silver nanowires. The researchers tested these lenses in rabbits—whose eyes are similar in size to humans’—and found no ill effects after five hours. The animals didn’t rub their eyes or grow bloodshot, and the electronics kept working. This work is described online in the journal Nano Letters.

A handful of companies and researchers have developed electronic contact lenses over the past five years. Sensimed, of Switzerland, makes a lens for 24-hour monitoring of eye pressure in glaucoma patients, and other researchers, including University of Washington professor and Google Glass project founder Babak Parviz, have built contact-lens displays. But these devices have used rigid or nontransparent materials.

Park wants to make contact lenses that have all the functions of a wearable computer but remain transparent and soft. “Our goal is to make a wearable contact-lens display that can do all the things Google Glass can do,” he says. To make that work, his team needed a transparent, highly conductive material that was also flexible. The transparent conductor of choice in conventional rigid electronics, indium tin oxide, is brittle, and it must be deposited at high temperatures that would melt a contact lens. Organic conductors, graphene, and nanowires are flexible and transparent, but they’re not conductive enough.

Park, working with Sung-Woo Nam of the University of Illinois at Urbana-Champaign, found that sandwiching silver nanowires between sheets of graphene yielded a composite with much lower electrical resistance than either material alone. The industry standard for a transparent conductor is a resistance of 50 ohms per square or less, says Nam; their material has a resistance of about 33 ohms per square. The material also transmits 94 percent of visible light, and it stretches. The researchers make these conductive sheets by depositing liquid solutions of the nanomaterials on a spinning surface, such as a contact lens, at low temperatures.
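
The sheet-resistance and transmittance numbers above can be combined into the standard DC-to-optical conductivity figure of merit used in the transparent-conductor literature (a common way to rank such materials – the article itself quotes only the raw numbers, so this calculation is ours):

```python
Z0 = 376.73  # impedance of free space, in ohms

def figure_of_merit(sheet_resistance, transmittance):
    """sigma_dc / sigma_opt for a thin-film transparent conductor.

    Derived from T = (1 + (Z0 / (2 * Rs)) * sigma_opt / sigma_dc) ** -2,
    solved for the conductivity ratio. Higher is better.
    """
    return Z0 / (2 * sheet_resistance * (transmittance ** -0.5 - 1))

# Graphene/silver-nanowire composite: 33 ohms/sq at 94% transmittance.
composite_fom = figure_of_merit(33, 0.94)

# The 50 ohms/sq "industry standard" at the same transmittance, for comparison.
baseline_fom = figure_of_merit(50, 0.94)
```

By this metric the composite comes out roughly 50 percent ahead of a 50-ohms-per-square conductor at the same transparency, consistent with Nam’s claim that it clears the industry benchmark.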

Working with researchers at Samsung, they coated a contact lens with the stretchy conductor, then placed a light-emitting diode on it. Although it would be an exaggeration to call this a display, since there is just one pixel, it’s possible this kind of material will be a necessary component in future contact-lens displays, says Herbert De Smet, who works on electronic contact lenses at Ghent University in Belgium but was not involved with the work.

Nam believes medical applications of electronic contact lenses may be even more promising than eyeball-mounted displays. He is currently using the graphene-nanowire conductors to make biosensors that could monitor health conditions by sampling the chemistry of the eye’s tear film. And De Smet’s group is developing lenses that can actively filter light to compensate for vision problems.

Original TechnologyReview.com Article
