Innovations Report · 2015-07-14 · The theoretical model shows that it works. The atom is brought...




Innovations Report August 2007

_________________________

• ICT • Microelectronics & Nanotech • Energy • Environment • Life Sciences

Compiled by G. Sgalari – Pirelli Tyre S.p.A.


TABLE OF CONTENTS

ICT

'Evanescent Laser' To Speed Data Transmission By Combining Laser Light With Silicon
Photon-transistors for the supercomputers of the future
New technology has dramatic chip-cooling potential for future computers

MICROELECTRONICS & NANOTECH

Scientists Make Flexible, Polymer-Based Data Storage
Automated technique paves way for nanotechnology's industrial revolution
Potato Chip Flavoring Boosts Longevity Of Concrete

ENERGY

Artificial Muscle Generates Electricity on The Sea
'Thin-layer' solar cells may bring cheaper 'green' power
Arizona researchers and UOP to make algae fuel for military jets a reality
MIT researchers work toward spark-free, fuel-efficient engines
Berkeley Lab's Ultraclean Combustion Technology For Electricity Generation Fires Up in Hydrogen Tests
Beyond batteries: Storing power in a sheet of paper

ENVIRONMENT

Experiment suggests limitations to carbon dioxide 'tree banking'
Ceramic tubes could cut greenhouse gas emissions from power stations
Nano-boric acid makes motor oil more slippery

LIFE SCIENCES

Flip of genetic switch causes cancers in mice to self-destruct
Electric fields have potential as a cancer treatment
Weizmann Institute scientists discover a control mechanism for metastasis
FSU chemists using light-activated molecules to kill cancer cells
New cancer weapon: nuclear nanocapsules
MIT creates 3-D images of living cell


ICT


'Evanescent Laser' To Speed Data Transmission By Combining Laser Light With Silicon

Science Daily — Researchers at UC Santa Barbara have announced they have built the world's first mode-locked silicon evanescent laser, a significant step toward combining lasers and other key optical components with the existing electronic capabilities in silicon. The research provides a way to integrate optical and electronic functions on a single chip and enables new types of integrated circuits. It introduces a more practical technology with lower cost, lower power consumption and more compact devices. The research will be reported in the September issue of Optics Express.

By causing silicon to emit light and exhibit other potentially useful optical properties, integration of photonic devices on silicon becomes possible. (Credit: Peter Allen, UC Santa Barbara College of Engineering)

Mode-locked evanescent lasers can deliver stable short pulses of laser light that are useful for many potential optical applications, including high-speed data transmission, multiple wavelength generation, remote sensing (LIDAR) and highly accurate optical clocks. Computer technology now depends mainly on silicon electronics for data transmission. By causing silicon to emit light and exhibit other potentially useful optical properties, integration of photonic devices on silicon becomes possible. The problem in the past? It is extremely difficult, nearly impossible, to create a laser in silicon.

Less than one year ago, a research team at UCSB and Intel, led by John Bowers, a professor of electrical and computer engineering, created laser light from electrical current on silicon by placing a layer of InP above the silicon. In this new study, Bowers, Brian Koch, a doctoral student, and others have used this platform to demonstrate electrically pumped lasers emitting 40 billion pulses of light per second. This is the first time such a rate has been achieved in silicon, and it matches the rates produced by the other media in standard use today. These short pulses are composed of many evenly spaced colors of laser light, which could be separated and each used to transmit different high-speed information, replacing the need for hundreds of lasers with just one.
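The quoted repetition rate translates directly into pulse spacing and channel spacing. A quick sketch of the arithmetic (the variable names are mine; only the 40 GHz figure comes from the article):

```python
# Pulse timing for a laser emitting 40 billion pulses per second.
rep_rate_hz = 40e9                  # repetition rate from the article
pulse_period_s = 1.0 / rep_rate_hz  # spacing between successive pulses
print(pulse_period_s)               # 2.5e-11 s, i.e. 25 picoseconds

# The evenly spaced "colors" form a frequency comb whose line spacing equals
# the repetition rate; each comb line could carry its own data channel.
comb_spacing_ghz = rep_rate_hz / 1e9   # 40 GHz between adjacent channels
```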

Creating optical components in silicon will lead to optoelectronic devices that can increase the amount and speed of data transmission in computer chips while using existing silicon technology. Employing existing silicon technology would represent a potentially less expensive and more feasible way to mass-produce future-generation devices that would use both electrons and photons to process information, rather than just electrons as has been the case in the past.

This research builds upon the development of the first hybrid silicon laser, announced by UCSB and Intel a year ago, enabling new applications for silicon-based optics. The research was supported by funds from the Microsystems Technology Office of DARPA.


University of Copenhagen

Photon-transistors for the supercomputers of the future

Scientists from the Niels Bohr Institute at the University of Copenhagen and from Harvard University have worked out a new theory describing how the transistors needed for the quantum computers of the future may be created. The research has just been published in the scientific journal Nature Physics.

Researchers dream of quantum computers: incredibly fast supercomputers able to solve tasks so complex that they would revolutionise what computing can be used for. But there are serious difficulties. One of them is the transistors, the systems that process the signals. Today the signal is an electrical current. For a quantum computer the signal can be an optical one, carried by a single photon, the smallest component of light.

“To work, the photons have to meet and ‘talk’, but photons very rarely interact with each other,” says Anders Søndberg Sørensen, a quantum physicist at the Niels Bohr Institute at the University of Copenhagen. He explains that light does not behave as it does in Star Wars, where people fight with light sabres and can cross swords with the light. That is pure fiction and cannot happen: when two rays of light meet and cross, they pass straight through each other. That is called linear optics.

What he wants to do with light is non-linear optics, in which the photons collide with and affect each other. This is very difficult to achieve in practice: photons are so small that one could never hit one with the other, unless one can control them – and it is precisely such control that Anders Sørensen's theory describes.

Instead of shooting two photons at each other from different directions and trying to make them collide, he wants to use an atom as an intermediary. The atom can only absorb one photon – such are the laws of physics. If you direct two photons towards the atom, they will collide on the atom, which is exactly what he wants. The atom, however, is very small and difficult to hit, so the photons have to be focused very precisely. In a previous experiment, researchers had discovered that microwaves could be focused onto an atom via a superconducting nanowire, which gave them the idea that the same could be done with visible light.

The theoretical model shows that it works. The atom is brought close to the nanowire. Two photons are sent towards the atom and when they hit it an interaction occurs between them, where one imparts information to the other. The information is sent in bits which are either a one or zero digit, and the order of digits produces the message. (Today we can send information via an optic cable and each bit is made up of millions of photons.) In quantum optics each bit is just one photon. The photon has now received its message and the signal continues on its way. It is a step on the way to building a photon-transistor for a quantum computer.
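The contrast the article draws (millions of photons per bit today versus a single photon per bit in quantum optics) can be checked with Planck's relation E = hf. A back-of-envelope sketch with assumed link parameters (1 mW received power, 10 Gbit/s, 1550 nm telecom wavelength; none of these figures come from the article):

```python
# Photons per bit in a conventional optical link (illustrative figures).
h = 6.626e-34           # Planck constant, J*s
c = 3.0e8               # speed of light, m/s
wavelength_m = 1550e-9  # assumed telecom wavelength
power_w = 1e-3          # assumed received optical power
bitrate = 10e9          # assumed bits per second

photon_energy_j = h * c / wavelength_m   # energy carried by one photon
energy_per_bit_j = power_w / bitrate     # optical energy in one bit slot
photons_per_bit = energy_per_bit_j / photon_energy_j
print(round(photons_per_bit))            # hundreds of thousands of photons
```

The exact count depends on the assumed power and bitrate, but it lands near the "millions of photons per bit" scale the article mentions; the quantum scheme replaces all of them with one.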

Provided via EurekAlert!

Two photons are sent through a nanowire towards an atom, where they collide, such that one photon (red) transfers its information to the other photon.


The researchers also have developed computational models to track the flow of electrons and ions generated by the device, information needed for designing future systems using the technology. Computer chips are constantly being upgraded by creating designs with more densely packed circuits, transistors and other electronic components. The number of transistors per chip has been doubling every 18 months or so, in line with a general principle called Moore's law. As performance increases, however, so does heat generation, particularly in small hot spots. These hot spots not only hinder performance, but also could damage or destroy delicate circuitry. This means new cooling methods will be required for more powerful computers in the future. The next step in the research will be to reduce the size of components within the device from the scale of millimeters to microns, or millionths of a meter. Miniaturizing the technology will be critical to applying the method to computers and consumer electronics, allowing the device to operate at lower voltage and to cool small hot spots, Garimella said. Another challenge will be making the technology rugged enough for commercial applications. "As things get smaller, they get more delicate, so we need to strengthen all the elements. And we believe we can achieve this goal in a year or so," Garimella said.
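The doubling rule cited above is easy to state in code. A minimal sketch (the function and the starting count are illustrative, not tied to any particular chip):

```python
# Moore's law as cited: transistor count doubles roughly every 18 months.
def transistors_after(start_count, months, doubling_period_months=18):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (months / doubling_period_months)

# From a hypothetical 100-million-transistor chip, three years out:
print(transistors_after(100e6, 36))   # two doublings -> 400000000.0
```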


Purdue University

New technology has dramatic chip-cooling potential for future computers

Researchers have demonstrated a new technology using tiny "ionic wind engines" that might dramatically improve computer chip cooling, possibly addressing a looming threat to future advances in computers and electronics. The Purdue University researchers, in work funded by Intel Corp., have shown that the technology increased the "heat-transfer coefficient," which describes the cooling rate, by as much as 250 percent. "Other experimental cooling-enhancement approaches might give you a 40 percent or a 50 percent improvement," said Suresh Garimella, a professor of mechanical engineering at Purdue. "A 250 percent improvement is quite unusual."

When used in combination with a conventional fan, the experimental device enhanced the fan's effectiveness by increasing airflow to the surface of a mock computer chip. The new technology could help engineers design thinner laptop computers that run cooler than today's machines.

Findings are detailed in a research paper that has been accepted for publication in the Journal of Applied Physics and is tentatively scheduled to appear in the Sept. 1 issue. The paper was authored by mechanical engineering doctoral student David Go, Garimella, associate professor of mechanical engineering Timothy Fisher and Intel research engineer Rajiv Mongia. "This technology is very exciting and innovative," Mongia said. "It has the potential of enabling imaginative notebook and handheld PC designs in the future."

The new cooling technology could be introduced in computers within three years if researchers are able to miniaturize it and make the system rugged enough, Garimella said. As the technology is further developed, such cooling devices might be integrated into portable consumer electronics products, including cell phones. Advanced cooling technologies are needed to help industry meet the conflicting goals of developing more compact and lightweight computers that are still powerful enough to run high-intensity programs for video games and other graphics-laden applications. "In computers and electronics, power equals heat, so we need to find ways to manage the heat generated in more powerful laptops and handheld computers," Fisher said.

The experimental cooling device, which was fabricated on top of a mock computer chip, works by generating ions - or electrically charged atoms - using electrodes placed near one another. The device contained a positively charged wire, or anode, and negatively charged electrodes, called cathodes. The anode was positioned about 10 millimeters above the cathodes.
When voltage was passed through the device, the negatively charged electrodes discharged electrons toward the positively charged anode. Along the way, the electrons collided with air molecules, producing positively charged ions, which were then attracted back toward the negatively charged electrodes, creating an "ionic wind." This breeze increased the airflow on the surface of the experimental chip.

Conventional cooling technologies are limited by a principle called the "no-slip" effect - as air flows over an object, the air molecules nearest the surface remain stationary, while the molecules farther from the surface move progressively faster. This phenomenon hinders computer cooling because it restricts airflow where it is most needed, directly on the chip's hot surface. The new approach potentially solves this problem by using the ionic wind effect in combination with a conventional fan to create airflow immediately adjacent to the chip's surface, Fisher said.

The device was created at Purdue's Birck Nanotechnology Center in the university's Discovery Park. The researchers quantified the cooling effect with infrared imaging, which showed the technology reduced heating from about 60 degrees Celsius - or 140 degrees Fahrenheit - to about 35 degrees C, or 95 F. "We've been trying to make this work for about a year, and now we have shown that it works quite well," Garimella said. Patents are pending for the new design.
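The "heat-transfer coefficient" the researchers improved relates dissipated power to the chip's temperature rise over ambient via Newton's law of cooling, Q = hA(T_chip - T_ambient). A sketch of why a 250 percent improvement matters; the power, area, and baseline coefficient below are illustrative values chosen to reproduce the reported 60 °C and 35 °C readings, not Purdue's measurements:

```python
# Steady-state chip temperature under Newton's law of cooling.
def chip_temp_c(power_w, h_w_per_m2k, area_m2, ambient_c):
    """Ambient plus the temperature rise needed to reject power_w."""
    return ambient_c + power_w / (h_w_per_m2k * area_m2)

area = 1e-4      # assumed 1 cm^2 mock chip
power = 0.7      # assumed dissipated power, W
ambient = 25.0   # assumed ambient temperature, deg C

baseline = chip_temp_c(power, 200.0, area, ambient)        # illustrative h
improved = chip_temp_c(power, 200.0 * 3.5, area, ambient)  # h up 250 percent
print(round(baseline), round(improved))   # 60 35, matching the reported trend
```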

Infrared images


Microelectronics & Nanotech


Scientists Make Flexible, Polymer-Based Data Storage

The future of the electronics industry is believed by many to lie in organic materials – polymers that conduct electricity. Because they are ultra lightweight, flexible, and low-cost, they may lead to a whole new class of electronic technologies. As part of this movement, scientists recently developed a polymer-based, flexible type of data storage that displays promising information-storing characteristics.

The tiny memory device, created by scientists from the National University of Singapore and described in detail in the August issue of Organic Electronics, has a three-layer structure. The middle layer, which stores the information, is a 50-nanometer-thick “copolymer,” a type of polymer made of two chemically different repeating molecule chains, rather than a single repeating chain. It is sandwiched between a 40-micrometer-thick conducting polymer substrate (the bottom electrode) and a 0.2-micrometer-thick layer of gold (the top electrode).

The device's flexibility comes from its polymer substrate. Conducting polymers and other organic materials are often deposited onto rigid silicon wafers during the fabrication of memory devices and other electronics, but – although the printing methods used in these cases can be a convenient way of depositing a polymer onto a substrate – this practice physically limits the resulting devices.

Unlike conventional silicon-based memory, which stores bits of information in tiny cells – recording a “0” for a fully charged cell and “1” for an uncharged cell, or vice versa – this device uses the polymer's conductive response to an applied voltage. From 0 to 4 volts, the resulting current through the middle polymer layer is very low. Above 4 volts, the current abruptly increases 100-fold. The low conductivity state is considered the “off” or “0” state and the high conductivity state is equivalent to “on” or “1.” This is how the device stores a single bit of information.
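The threshold behaviour described above maps naturally onto a toy read/write model. A sketch (the class name, currents, and the placement of the 4 V threshold are illustrative; only the roughly 100-fold current jump and the on-off ratio of about 200 come from the article):

```python
# Toy model of the bistable polymer cell: weak conduction below the switching
# threshold ("0"), a large current jump once switched on ("1").
OFF_CURRENT_A = 1e-9   # illustrative low-state current
ON_OFF_RATIO = 200     # reported on-off current ratio

class PolymerMemoryCell:
    def __init__(self):
        self.state = 0              # device starts in the "off"/"0" state

    def write(self, voltage_v):
        if voltage_v > 4.0:         # exceeding ~4 V switches the cell on
            self.state = 1          # the state persists after power-off

    def read(self):
        """Non-destructive read: returns the cell current."""
        return OFF_CURRENT_A * (ON_OFF_RATIO if self.state else 1)

cell = PolymerMemoryCell()
low = cell.read()          # "0": low-conductivity current
cell.write(5.0)            # write above the threshold
high = cell.read()         # "1": current up by the on-off ratio
print(round(high / low))   # 200
```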
The device remains in the “on” position when a negative voltage is applied across it and after power is turned off. Therefore, once information is written, it can be read many times. The device also demonstrates good thermal stability, showing signs of degradation only above 310 degrees Celsius (590 degrees Fahrenheit). The device's on-off current ratio, one parameter that indicates the quality of a memory device, is about 200, which is comparable to certain contemporary memory schemes. The researchers expect this flexible polymer memory structure to meet the demands of a new generation of memory devices with unique shapes and architectures.

Citation: Liang Li, Qi-Dan Ling, Siew-Lay Lim, Yoke-Ping Tan, Chunxiang Zhu, Daniel Siu-Hung Chan, En-Tang Kang, Koon-Gee Neoh, “A flexible polymer memory device,” Organic Electronics 8 (2007) 401–406. Copyright 2007 PhysOrg.com.


Duke University

Automated technique paves way for nanotechnology's industrial revolution

In an assist in the quest for ever smaller electronic devices, Duke University engineers have adapted a decades-old computer aided design and manufacturing process to reproduce nanosize structures with features on the order of single molecules. The new automated technique for nanomanufacturing suggests that the emerging nanotechnology industry might capitalize on skills already mastered by today's engineering workforce, according to the researchers. "These tools allow you to go from basic, one-off scientific demonstrations of what can be done at the nanoscale to repetitively engineering surface features at the nanoscale," said Rob Clark, Thomas Lord Professor and chair of the mechanical engineering and materials science department at Duke University's Pratt School of Engineering.

The feat was accomplished by using the traditional computing language of macroscale milling machines to guide an atomic force microscope (AFM). The system reliably produced 3-D, nanometer-scale silicon oxide nanostructures through a process called anodization nanolithography, in which oxides are built on semiconducting and metallic surfaces by applying an electric field in the presence of tiny amounts of water. "That's the key to moving from basic science to industrial automation," Clark said. "When you manufacture, it doesn't matter if you can do it once. The question is: Can you do it 100 million times, and what's the variability over those 100 million times? Is it consistent enough that you can actually put it into a process?"

Clark and Matthew Johannes report their findings in the August 29 issue of the journal Nanotechnology and expect to make their software and designs freely available online. The work was supported by the National Science Foundation.

Atomic force microscopes (AFMs), which can both produce images and manipulate individual atoms and molecules, have been the instrument of choice for researchers creating localized, two-dimensional patterns on metals and semiconductors at the nanoscale. Yet those nanopatterning systems have relied on the discrete points of a two-dimensional image for laying out the design. "Now we've added another dimension," Johannes said. The researchers showed they could visualize 3-D structures (including a series of squares that differed in size, and a star) in a computerized design environment and then automatically build them at the nanoscale. The structures they produced were measured in nanometers (one billionth of a meter), about 80,000 times smaller than the diameter of a human hair. Johannes had to learn to carefully control the process by adjusting the humidity, voltage, and scanning speed, relying on sensors to guide the otherwise invisible process.
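The "traditional computing language of macroscale milling machines" is commonly G-code, in which G00/G01 words command rapid and feed moves. A minimal sketch of laying out one of the square patterns as such a toolpath; the coordinates-in-nanometers convention and the function below are my assumptions, not Duke's actual command set:

```python
# Emit a square AFM writing path as G-code-style motion commands.
def square_path(side_nm, origin=(0.0, 0.0)):
    x0, y0 = origin
    corners = [(x0, y0), (x0 + side_nm, y0),
               (x0 + side_nm, y0 + side_nm), (x0, y0 + side_nm), (x0, y0)]
    lines = ["G00 X{:.1f} Y{:.1f}".format(*corners[0])]  # rapid move to start
    lines += ["G01 X{:.1f} Y{:.1f}".format(x, y)         # writing (feed) moves
              for x, y in corners[1:]]
    return lines

for cmd in square_path(250.0):   # a hypothetical 250 nm square feature
    print(cmd)
```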

The new technique suggests that the nanotechnology factories of the future might not operate so differently from existing manufacturing plants. "If you can take prototyping and nanomanufacturing to a level that leverages what engineers know how to do, then you are ahead of the game," Clark said. "Most engineers with conventional training don't think about nanoscale manipulation. But if you want to leverage a workforce that's already in place, how do you set up the future of manufacturing in a language that engineers already use to communicate? That's what we're focused on doing here."

Provided via EurekAlert!


American Chemical Society

Potato Chip Flavoring Boosts Longevity Of Concrete The ingredient that helps give "salt & vinegar" potato chips that tangy snap is the key to a new waterproof coating for protecting concrete from water damage, according to a study scheduled for the August 1 issue of ACS' Industrial & Engineering Chemistry Research.

Awni Al-Otoom and colleagues in Jordan point out that concrete's unique properties have made it the world's most widely used structural material.

Concrete, however, is so porous that water soaks in, corroding steel reinforcing bars and meshes that strengthen concrete roads and buildings and causing cracks as water expands and contracts during freeze-thaw cycles. Sealants are commercially available, but they have serious shortcomings, the study notes.

In the new report, researchers describe the use of sodium acetate as an inexpensive and environmentally friendly concrete sealant. One of sodium acetate's many uses is in flavored potato chips.

In laboratory studies using freshly made concrete, the researchers showed that sodium acetate seeps into pores in concrete and then hardens and crystallizes upon exposure to water. The resultant swelling blocks entry of additional moisture, they said. Under dry conditions, the crystals shrink back to their original size and allow moisture to evaporate.

The net result is "a significant reduction in water permeability," that "can be expected to increase the service life of the concrete," the report said.

Article: "Crystallization Technology for Reducing Water Permeability into Concrete"

Provided by Science Daily


Energy


Artificial Muscle Generates Electricity on The Sea

The buoy for wave-powered generation floating off the coast of Florida. EPAM units are mounted in the center (photo: courtesy of Hyper Drive and SRI International).

The black portions are cylindrical EPAMs (photo: courtesy of Hyper Drive and SRI International)

Hyper Drive Co. Ltd. has launched a field test of a wave-powered generator off the coast of Florida. Hyper Drive is a Japanese venture company focused on the commercialization of a wave-powered generator using electroactive polymer artificial muscle (EPAM). Development of the generator was commissioned to SRI International, a U.S.-based nonprofit research organization, and is currently under way. SRI International conducted the test for approximately two weeks, from Aug. 1 to 15, in Tampa, Fla. The organization deployed a prototype navigation buoy in the ocean 2 km off the coast and recorded data on the output energy per second, wave height, etc. The organization is currently analyzing the data obtained; results are expected in mid-September 2007.

The buoy was mounted with four cylindrical EPAM units, each measuring about 30 cm in diameter and about 20 cm in height. A weight attached to each unit is moved up and down by the ocean waves, causing the EPAM to expand and contract and thereby generating electric power. According to SRI International, the total output is presently about 5 Wh.

Hyper Drive intends to improve the output and durability based on the analysis data. The company plans to prototype a new buoy with increased output and durability within one year and conduct a three-month field test using it. If the field test produces favorable data, the company will take specific actions toward commercialization. At present, several Japanese companies, including material manufacturers, are reportedly interested in this generator. Hyper Drive is also considering the commercialization of another type of generator that combines wave-powered generation with solar cells, so that the disadvantages of one method can be compensated for by the other: the output of wave-powered generation decreases in good weather because the sea is calm, while the output of the solar cells increases, and vice versa in bad weather.


Durham University

'Thin-layer' solar cells may bring cheaper 'green' power

Scientists are researching new ways of harnessing the sun’s rays which could eventually make it cheaper for people to use solar energy to power their homes. The experts at Durham University are developing light-absorbing materials for use in the production of thin-layer solar photovoltaic (PV) cells which are used to convert light energy into electricity. The four-year project involves experiments on a range of different materials that would be less expensive and more sustainable to use in the manufacturing of solar panels. Thicker silicon-based cells and compounds containing indium, a rare and expensive metal, are more commonly used to make solar panels today. The research, funded by the Engineering and Physical Sciences Research Council (EPSRC) SUPERGEN Initiative, focuses on developing thin-layer PV cells using materials such as copper indium diselenide and cadmium telluride.

Right now the project is entering a new phase for the development of cheaper and more sustainable variants of these materials. The Durham team is also working on manipulating the growth of the materials so they form a continuous structure which is essential for conducting the energy trapped by solar panels before it is turned into usable electricity. This will help improve the efficiency of the thin-layer PV cells. It’s hoped that the development of more affordable thin-film PV cells could lead to a reduction in the cost of solar panels for the domestic market and an increase in the use of solar power.

The thin-layer PV cells would be used to make solar panels that could be fitted to roofs to help power homes with any surplus electricity being fed back to The National Grid. This could lead to cheaper fuel bills and less reliance on burning fossil fuels as a way of helping to generate electricity. Professor Ken Durose, Director of the Durham Centre for Renewable Energy, who is leading the research, said: “One of the main issues in solar energy is the cost of materials and we recognise that the cost of solar cells is slowing down their uptake. “If solar panels were cheap enough so you could buy a system off the shelf that provided even a fraction of your power needs you would do it, but that product isn’t there at the moment.

“The key indicator of cost effectiveness is how many pounds you have to spend to get a watt of power out. If you can make solar panels more cheaply then you will have a winning product.”
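Professor Durose's cost metric is simple enough to write down. A sketch with hypothetical prices (neither figure is from the article):

```python
# Pounds per watt: the cost-effectiveness yardstick quoted above.
def pounds_per_watt(panel_cost_gbp, rated_power_w):
    return panel_cost_gbp / rated_power_w

# Hypothetical conventional vs thin-film panel at the same rated power:
print(pounds_per_watt(400.0, 100.0))   # 4.0 GBP/W
print(pounds_per_watt(250.0, 100.0))   # 2.5 GBP/W - the cheaper technology wins
```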

Provided via EurekAlert!


Arizona researchers and UOP to make algae fuel for military jets a reality

Arizona State University researchers are part of a team led by UOP LLC, a Honeywell company, that is looking at alternative sources of oil that could be used to produce Jet Propellant 8 (JP-8), or military jet fuel. The goal of the project, which is backed by a $6.7 million award from the Defense Advanced Research Projects Agency (DARPA), is to develop and commercialize a process to produce JP-8, which is used by the U.S. and NATO militaries.

The ASU team in the School of Applied Arts and Sciences will lead an effort to demonstrate the technical and economic feasibility of using algae as an alternative feedstock resource. ASU researchers Qiang Hu and Milton Sommerfeld will screen for oil-rich algal strains, evaluate their potential as oil producers and develop an algal feedstock production system that will yield competitively priced oil that can be converted into jet fuel. Hu and Sommerfeld, who direct the Laboratory for Algae Research and Biotechnology, have focused on algae as a source of renewable oil for more than 20 years.

The benefits of oil produced from algae are considerable, according to the ASU researchers. “Algae are non-food/feed sources, so there is no inherent conflict of using food crop plants for fuel rather than for food,” said Hu and Sommerfeld. “Also, algae can be grown on land that is unsuitable for agriculture and can use saline or brackish water, making the algae feedstock production system complementary rather than competitive to existing agriculture. Moreover, since algae can use carbon dioxide from waste or flue gases as a nutrient for growth, an added value of algae feedstock production is environmental carbon sequestration.” While algal oil is very similar to other vegetable oils in terms of fatty acid composition, the oil yield of algae is projected to be at least 100 times that of soybean per acre of land on an annual basis.

ASU, UOP LLC, Honeywell Aerospace, Southwest Research Institute and Sandia National Laboratories researchers will be working to help develop and commercialize a process to produce jet fuel that is vegetable- and/or algal-oil based rather than petroleum based. “We are confident that we have assembled a strong team of experts that will be successful in proving the viability of biofeedstock technologies for JP-8 and other jet fuels, while offering the U.S. military another option for sustainable liquid fuels critical to their programs,” said Jennifer Holmgren, director of UOP's Renewable Energy and Chemicals business unit. Fuel produced by the new process will have to meet stringent military specifications and is expected to achieve 90 percent energy efficiency for maximum conversion of feed to fuel, to reduce waste and to reduce production costs. UOP expects the technology will be viable for future use in the production of fuel for commercial jets. The project is expected to be completed by the end of 2008.


Massachusetts Institute of Technology

MIT researchers work toward spark-free, fuel-efficient engines

In an advance that could help curb global demand for oil, MIT researchers have demonstrated how ordinary spark-ignition automobile engines can, under certain driving conditions, move into a spark-free operating mode that is more fuel-efficient and just as clean. The mode-switching capability could appear in production models within a few years, improving fuel economy by several miles per gallon in millions of new cars each year. Over time, that change could cut oil demand in the United States alone by a million barrels a day. Currently, the U.S. consumes more than 20 million barrels of oil a day. The MIT team presented their latest results at the Japan Society of Automotive Engineers (JSAE)/Society of Automotive Engineers (SAE) 2007 International Fuel and Lubricants Meeting.

Many researchers are studying a new way of operating an internal combustion engine known as “homogeneous charge compression ignition” (HCCI). Switching a spark-ignition (SI) engine to HCCI mode pushes up its fuel efficiency. In an HCCI engine, fuel and air are mixed together and injected into the cylinder. The piston compresses the mixture until spontaneous combustion occurs. The engine thus combines fuel-and-air premixing (as in an SI engine) with spontaneous ignition (as in a diesel engine). The result is the HCCI engine's distinctive feature: combustion occurs simultaneously at many locations throughout the combustion chamber.

That behavior has advantages. In both SI and diesel engines, the fuel must burn hot to ensure that the flame spreads rapidly through the combustion chamber before a new “charge” enters. In an HCCI engine, there is no need for a quickly spreading flame because combustion occurs throughout the combustion chamber. As a result, combustion temperatures can be lower, so emissions of nitrogen pollutants are negligible. The fuel is spread in low concentrations throughout the cylinder, so the soot emissions from fuel-rich regions in diesels are not present. Perhaps most important, the HCCI engine is not locked into having just enough air to burn the available fuel, as the SI engine is. When the fuel coming into an SI engine is reduced to cut power, the incoming air must also be constrained, a major source of wasted energy.

Image Courtesy of MIT

However, it is difficult to control exactly when ignition occurs in an HCCI engine. And if it does not begin when the piston is positioned for the power stroke, the engine will not run right. “It's like when you push a kid on a swing,” said Professor William H. Green, Jr., of the Department of Chemical Engineering. “You have to push when the swing is all the way back and about to go. If you push at the wrong time, the kid will twist around and not go anywhere. The same thing happens to your engine.” According to Green, ignition timing in an HCCI engine depends


on two factors: the temperature of the mixture and the detailed chemistry of the fuel. Both are hard to predict and control. So while the HCCI engine performs well under controlled conditions in the laboratory, it is difficult to predict at this time what will happen in the real world. Green, along with Professor Wai K. Cheng of the Department of Mechanical Engineering and colleagues in MIT's Sloan Automotive Laboratory and MIT's Laboratory for Energy and the Environment, has been working to find the answer. A large part of their research has utilized an engine modified to run in either HCCI or SI operating mode. For the past two years, the group has been studying the engine's behavior as the inlet temperature and type of fuel are changed. Not surprisingly, the range of conditions suitable for HCCI operation is far smaller than the range for SI mode. Variations in temperature had a noticeable but not overwhelming effect on when the HCCI mode worked. Fuel composition had a greater impact, but it was not as much of a showstopper as the researchers expected.

Using the results of their engine tests as a guide, the researchers developed an inexpensive technique that should enable a single engine to run in SI mode but switch to HCCI mode whenever possible. A simple temperature sensor determines whether the upcoming cycle should be in SI or HCCI mode (assuming a constant fuel). To estimate potential fuel savings from the mode-switching scheme, Andreae determined when an SI engine would switch into HCCI mode under simulated urban driving conditions. Over the course of the simulated trip, HCCI mode operates about 40 percent of the time.
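The per-cycle decision described above can be sketched as a simple selector. The temperature window and load limits below are hypothetical placeholders, not values reported by the MIT group; they only illustrate the shape of the logic.

```python
# Toy per-cycle mode selector: switch to HCCI only inside a
# temperature window where spontaneous ignition is controllable,
# and only at part load.  Thresholds are illustrative assumptions.

HCCI_T_MIN = 60.0   # hypothetical minimum inlet temperature, deg C
HCCI_T_MAX = 120.0  # hypothetical maximum inlet temperature, deg C

def select_mode(inlet_temp_c: float, load_fraction: float) -> str:
    """Pick the combustion mode for the upcoming cycle.

    HCCI is usable only at part load and within a narrow inlet
    temperature window; everywhere else, fall back to spark ignition.
    """
    if 0.2 <= load_fraction <= 0.6 and HCCI_T_MIN <= inlet_temp_c <= HCCI_T_MAX:
        return "HCCI"
    return "SI"

# A few simulated urban-drive samples: (inlet temperature, load fraction).
cycles = [(90, 0.3), (90, 0.8), (40, 0.3), (110, 0.5), (130, 0.4)]
modes = [select_mode(t, load) for t, load in cycles]
hcci_share = modes.count("HCCI") / len(modes)
```

With these made-up samples two of five cycles qualify for HCCI; the real controller would evaluate this every cycle from the sensor reading.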

The researchers estimate that the increase in fuel efficiency would be a few miles per gallon. “That may not seem like an impressive improvement,” said Green. “But if all the cars in the US today improved that much, it might be worth a million barrels of oil per day, and that's a lot.”
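Green's "million barrels a day" claim can be checked with back-of-envelope arithmetic. The fleet size, annual mileage and baseline fuel economy below are rough assumptions for illustration, not figures from the MIT study.

```python
# Back-of-envelope check of the "million barrels a day" figure.
# Fleet size, annual mileage, and baseline/improved fuel economy
# are assumed round numbers, not values from the study.

GALLONS_PER_BARREL = 42

def daily_barrels_saved(n_cars, miles_per_year, mpg_old, mpg_new):
    """Barrels of gasoline saved per day if every car improves."""
    gallons_old = n_cars * miles_per_year / mpg_old
    gallons_new = n_cars * miles_per_year / mpg_new
    return (gallons_old - gallons_new) / GALLONS_PER_BARREL / 365

# ~230 million light vehicles, ~12,000 miles/year, 25 -> 28 mpg
saved = daily_barrels_saved(230e6, 12_000, 25, 28)
```

Under these assumptions the saving works out to several hundred thousand barrels per day, the same order of magnitude as the quoted figure.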

Provided via EurekAlert!


Berkeley Lab’s Ultraclean Combustion Technology For Electricity Generation Fires Up in Hydrogen Tests

An experimental gas turbine simulator equipped with an ultralow-emissions combustion technology called LSI has been tested successfully using pure hydrogen as a fuel, a milestone that indicates a potential to help eliminate millions of tons of carbon dioxide and thousands of tons of NOx from power plants each year. The LSI (low-swirl injector) technology, developed by Robert Cheng of the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, recently won a 2007 R&D 100 award from R&D magazine as one of the top 100 new technologies of the year.

The LSI holds great promise for its near-zero emissions of nitrogen oxides, gases that are emitted during the combustion of fuels such as natural gas during the production of electricity. Nitrogen oxides, or NOx, are greenhouse gases as well as components of smog.

The Department of Energy’s Office of Electricity Delivery and Energy Reliability initially funded the development of the LSI for use in industrial gas turbines for on-site (i.e. distributed) electricity production. The purpose of this research was to develop a natural gas-burning turbine using the LSI’s ability to substantially reduce NOx emissions.

Cheng, Berkeley Lab colleague David Littlejohn, and Kenneth Smith and Wazeem Nazeer from Solar Turbines Inc. of San Diego adapted the low-swirl injector technology to the Taurus 70 gas turbine, which produces about seven megawatts of electricity.

A cutaway of the Taurus 70

A prototype of the low-swirl injector. Fuel flows through the openings of the center channel. This simple design creates the low-swirl flow, with lower emissions of NOx the result.


The team’s effort garnered them the R&D 100 honor. The team is continuing LSI development for carbon-neutral renewable fuels available from landfills and from industrial processes such as petroleum refining and waste treatment. “This is a kind of rocket science,” says Cheng, who notes that these turbines, used to produce electricity by burning gaseous fuels, are similar in operating principle to the turbines that propel jet airplanes.

DOE’s Office of Fossil Energy is funding another project in which the LSI is being tested for its ability to burn syngas (a mixture of hydrogen and carbon monoxide) and hydrogen fuels in an advanced Integrated Gasification Combined Cycle (IGCC) plant called FutureGen, which is planned to be the world’s first near-zero-emissions coal power plant. The FutureGen plant is intended to produce hydrogen from gasification of coal and to sequester the carbon dioxide generated by the process. The LSI is one of several combustion technologies being evaluated for use in the 200-plus-megawatt utility-scale hydrogen turbine that is a key component of the FutureGen plant.

The collaboration between Berkeley Lab and the National Energy Technology Laboratory (NETL) in Morgantown, WV, recently achieved the milestone of successfully test-firing an LSI unit using pure hydrogen as its fuel. Because the LSI is a simple and cost-effective technology that can burn a variety of fuels, it has the potential to help eliminate millions of tons of carbon dioxide and thousands of tons of NOx from power plants each year.

How the LSI works

The low-swirl injector is a mechanically simple device with no moving parts. It imparts a mild spin to the gaseous fuel-air mixture, causing the mixture to spread out; the flame is stabilized within this spreading flow just beyond the exit of the burner. Not only is the flame stable, it also burns at a lower temperature than that of conventional burners. The production of nitrogen oxides is highly temperature-dependent, so the lower flame temperature reduces NOx emissions to very low levels.
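The temperature dependence mentioned above is very steep: the thermal (Zeldovich-mechanism) NO formation rate scales roughly as exp(-Ta/T) with an activation temperature Ta of roughly 38,000 K. The sketch below illustrates that scaling; the flame temperatures are illustrative values, not measurements from the LSI tests.

```python
# Thermal (Zeldovich) NO formation scales roughly as exp(-Ta / T)
# with a very large activation temperature Ta, so a modest drop in
# flame temperature cuts NOx sharply.  Ta is the commonly quoted
# ~38,000 K value; the flame temperatures below are illustrative.
import math

T_ACTIVATION = 38_000.0  # K, activation temperature for N2 + O -> NO + N

def relative_no_rate(t_flame_k: float, t_ref_k: float = 2200.0) -> float:
    """Thermal-NO rate relative to a reference flame temperature."""
    return math.exp(-T_ACTIVATION / t_flame_k) / math.exp(-T_ACTIVATION / t_ref_k)

# A lean, low-swirl flame running ~200 K cooler than a conventional one:
ratio = relative_no_rate(2000.0)
```

Under these assumed temperatures the cooler flame forms thermal NO several times more slowly, which is the mechanism behind the low emissions the article describes.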

“The LSI principle defies conventional approaches,” says Cheng. “Combustion experts worldwide are just beginning to embrace this counter-intuitive idea. Principles from turbulent fluid mechanics, thermodynamics, and flame chemistry are all required to explain the science underlying this combustion phenomenon.”

Natural gas-burning turbines with the low-swirl injector emit an order of magnitude less NOx than conventional turbines. Tests at Berkeley Lab and Solar Turbines showed that burners with the LSI emit 2 parts per million of NOx (corrected to 15% oxygen), more than five times less than conventional burners. A more significant benefit of the LSI technology is its ability to burn fuels ranging from natural gas to hydrogen, and the relative ease of incorporating it into current gas turbine designs: extensive redesign of the turbine is not needed. The LSI is being designed as a drop-in component for gas-burning turbine power plants.

Source: Berkeley Lab

Robert Cheng views an LSI flame. He is touching the burner, demonstrating that it stays cool because the flame is completely lifted from its body.


Rensselaer Polytechnic Institute

Beyond batteries: Storing power in a sheet of paper

Researchers turn everyday paper into resilient, rechargeable energy storage device

Researchers at Rensselaer Polytechnic Institute have developed a new energy storage device that easily could be mistaken for a simple sheet of black paper. The nanoengineered battery is lightweight, ultra thin, completely flexible, and geared toward meeting the trickiest design and energy requirements of tomorrow’s gadgets, implantable medical equipment, and transportation vehicles. Along with its ability to function in temperatures up to 300 degrees Fahrenheit and down to 100 below zero, the device is completely integrated and can be printed like paper. The device is also unique in that it can function as both a high-energy battery and a high-power supercapacitor, which are generally separate components in most electrical systems. Another key feature is the capability to use human blood or sweat to help power the battery. Details of the project are outlined in the paper “Flexible Energy Storage Devices Based on Nanocomposite Paper” published Aug. 13 in the Proceedings of the National Academy of Sciences.

The resemblance to paper is no accident: more than 90 percent of the device is made up of cellulose, the same plant material used in newsprint, loose leaf, lunch bags, and nearly every other type of paper. Rensselaer researchers infused this paper with aligned carbon nanotubes, which give the device its black color. The nanotubes act as electrodes and allow the storage devices to conduct electricity. The device, engineered to function as both a lithium-ion battery and a supercapacitor, can provide the long, steady power output of a conventional battery as well as a supercapacitor’s quick burst of high energy. The device can be rolled, twisted, folded, or cut into any number of shapes with no loss of mechanical integrity or efficiency. The paper batteries can also be stacked, like a ream of printer paper, to boost the total power output.

“It’s essentially a regular piece of paper, but it’s made in a very intelligent way,” said paper co-author Robert Linhardt, the Ann and John H. Broadbent Senior Constellation Professor of Biocatalysis and Metabolic Engineering at Rensselaer. “We’re not putting pieces together – it’s a single, integrated device,” he said. “The components are molecularly attached to each other: the carbon nanotube print is embedded in the paper, and the electrolyte is soaked into the paper. The end result is a device that looks, feels, and weighs the same as paper.”
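Stacking sheet cells follows ordinary battery arithmetic: series connections add voltage, parallel connections add capacity. The per-sheet figures below are invented for illustration; the article does not quote cell specifications.

```python
# Stacking paper cells: series connections add voltage, parallel
# connections add capacity.  Per-sheet figures are assumptions for
# illustration only -- the article gives no cell specifications.

SHEET_VOLTAGE = 2.0     # volts per sheet (assumed)
SHEET_CAPACITY = 0.05   # amp-hours per sheet (assumed)

def stack(n_series: int, n_parallel: int):
    """Return (voltage, capacity_Ah, energy_Wh) for a stack of sheets."""
    voltage = n_series * SHEET_VOLTAGE
    capacity = n_parallel * SHEET_CAPACITY
    return voltage, capacity, voltage * capacity

# A "ream" wired as 6 sheets in series, 10 such strings in parallel:
v, cap, wh = stack(n_series=6, n_parallel=10)
```

With the assumed figures this 60-sheet stack delivers 12 V and 0.5 Ah, i.e. 6 Wh, showing how stacking boosts total output.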

The creation of this unique nanocomposite paper drew from a diverse pool of disciplines, requiring expertise in materials science, energy storage, and chemistry. Along with Linhardt, authors of the paper include Pulickel M. Ajayan, professor of materials science and engineering, and Omkaram Nalamasu, professor of chemistry with a joint appointment in materials science and engineering. Senior research specialist Victor Pushparaj, along with postdoctoral research associates Shaijumon M. Manikoth, Ashavani Kumar, and Saravanababu Murugesan, were co-authors and lead researchers of the project. Other co-authors include research associate Lijie Ci and Rensselaer Nanotechnology Center Laboratory Manager Robert Vajtai. The researchers used ionic liquid, essentially a liquid salt, as the battery’s electrolyte. It’s important to note that ionic liquid contains no water, which means there’s nothing in the batteries to freeze or evaporate. “This lack of water allows the paper energy storage devices to withstand extreme temperatures,” Kumar said. Along with use in small handheld electronics, the paper batteries’ light weight could make them ideal for use in automobiles, aircraft, and even boats. The paper also could be molded into different shapes, such as a car door, which would enable important new engineering innovations.

A sample of the new nanocomposite paper developed by researchers at Rensselaer Polytechnic Institute.

Infused with carbon nanotubes, the paper can be used to create ultra-thin, flexible batteries.


“Plus, because of the high paper content and lack of toxic chemicals, it’s environmentally safe,” Shaijumon said. Paper is also extremely biocompatible, and these new hybrid battery/supercapacitors have potential as power supplies for devices implanted in the body. The team printed paper batteries without adding any electrolytes, and demonstrated that naturally occurring electrolytes in human sweat, blood, and urine can be used to activate the battery device. “It’s a way to power a small device such as a pacemaker without introducing any harsh chemicals – such as the kind that are typically found in batteries – into the body,” Pushparaj said.

The materials required to create the paper batteries are inexpensive, Murugesan said, but the team has not yet developed a way to inexpensively mass-produce the devices. The end goal is to print the paper using a roll-to-roll system similar to how newspapers are printed. “When we get this technology down, we’ll basically have the ability to print batteries and print supercapacitors,” Ajayan said. “We see this as a technology that’s just right for the current energy market, as well as the electronics industry, which is always looking for smaller, lighter power sources. Our device could make its way into any number of different applications.”

The team of researchers has already filed a patent protecting the invention. They are now working on ways to boost the efficiency of the batteries and supercapacitors, and investigating different manufacturing techniques. "Energy storage is an area that can be addressed by nanomanufacturing technologies and our truly inter-disciplinary collaborative activity that brings together advances and expertise in nanotechnology, room-temperature ionic liquids, and energy storage devices in a creative way to devise novel battery and supercapacitor devices," Nalamasu said.

Provided via EurekAlert!


Environment


Duke University

Experiment suggests limitations to carbon dioxide 'tree banking'

While 10 years of bathing North Carolina pine tree stands with extra carbon dioxide did allow the trees to grow more tissue, only those pines receiving the most water and nutrients were able to store significant amounts of carbon that could offset the effects of global warming, scientists told a national meeting of the Ecological Society of America (ESA). These results from the decade-long Free Air Carbon Enrichment (FACE) experiment in a Duke University forest suggest that proposals to bank extra CO2 from human activities in such trees may depend on the vagaries of the weather and large scale forest fertilization efforts, said Ram Oren, the FACE project director.

"If water availability decreases to plants at the same time that carbon dioxide increases, then we might not have a net gain in carbon sequestration," said Oren, a professor of ecology at Duke's Nicholas School of the Environment and Earth Sciences. "In order to actually have an effect on the atmospheric concentration of CO2, the results suggest a future need to fertilize vast areas," Oren added. "And the impact on water quality of fertilizing large areas will be intolerable to society. Water is already a scarce resource."

In a presentation delivered by Heather McCarthy, Oren's former graduate student, eight scientists working at the FACE site reported on the daily administrations of 1.5 times today's CO2 levels and how they have changed carbon accumulation in plants growing there.

The Department of Energy-funded FACE site consists of four forest plots receiving extra CO2 from computer-controlled valves mounted on rings of towers, and four other matched plots receiving no extra gas. Trees in the loblolly pine-dominated forest plots that were treated produced about 20 percent more biomass on average, the researchers found. But since the amounts of available water and nitrogen nutrients varied substantially from plot to plot, using averages could be misleading. "In some areas, the growth is maybe 5 or 10 percent more, and in other areas it's 40 percent more," Oren said. "So in sites that are poor in nutrients and water we see very little response. In sites that are rich in both we see a large response."
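Oren's point that "using averages could be misleading" can be made concrete with an area-weighted calculation. The 5 to 40 percent growth responses come from the quotes above; the split of land among poor, medium and rich sites is an assumption for illustration only.

```python
# Why a plot-average growth response can mislead: the quoted
# responses range from ~5% on nutrient/water-poor sites to ~40%
# on rich sites.  The land-fraction split is an assumed example,
# not data from the FACE experiment.

responses = {"poor": 0.05, "medium": 0.20, "rich": 0.40}
land_fraction = {"poor": 0.6, "medium": 0.3, "rich": 0.1}  # assumed

# Gain expected if extra CO2 were applied across this mix of land:
area_weighted_gain = sum(responses[k] * land_fraction[k] for k in responses)

# Naive average over plot types, ignoring how much land is in each:
plain_average = sum(responses.values()) / len(responses)
```

If poor sites dominate the landscape, the realistic area-weighted gain falls well below the simple average of the plot responses, which is the limitation the researchers describe.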

The researchers found that extra carbon dioxide had no effect on what foresters call "self thinning" -- the tendency of less-successful trees to die off as the most-successful grow bigger."We didn't find that elevated CO2 caused any deviation from this standard relationship," said McCarthy, now a postdoctoral fellow at the University of California, Irvine. Also unchanged by the CO2 enrichment were the proportions of carbon atoms that found their way to various components of plant systems -- wood, leaves, roots and underlying soil. Only a few of those components will store carbon over time, noted Oren and McCarthy. "Carbon that's in foliage is going to last a lot shorter time than carbon in the wood, because leaves quickly decay," McCarthy said. "So elevated CO2 could significantly increase the production of foliage but this would lead to only a very small increase in ecosystem carbon storage."

Provided via EurekAlert!


University of Newcastle upon Tyne

Ceramic tubes could cut greenhouse gas emissions from power stations

Greenhouse gas emissions from power stations could be cut to almost zero by controlling the combustion process with tiny tubes made from an advanced ceramic material, engineers claim. The material, known as LSCF, has the remarkable property of being able to filter oxygen out of the air. By burning fuel in pure oxygen, it is possible to produce a stream of almost pure carbon dioxide, which has commercial potential for reprocessing into useful chemicals.

LSCF is not a brand-new material - it was originally developed for fuel cell technology - but engineers at Newcastle University, in collaboration with Imperial College London, have developed it for potential use in reducing emissions from gas-fired power stations, and possibly coal- and oil-fired electricity generation as well.

Conventional gas-fired power stations burn methane in a stream of air, producing a mixture of nitrogen and greenhouse gases, including carbon dioxide and nitrogen oxides, which are emitted into the atmosphere. Separating the gases is not practical because of the high cost and large amount of energy needed to do so. However, the LSCF tubes would allow only the oxygen component of air to reach the methane gas, resulting in the production of almost pure carbon dioxide and steam, which can easily be separated by condensing out the steam as water. The resulting stream of carbon dioxide could be piped to a processing plant for conversion into chemicals such as methanol, a useful industrial fuel and solvent.

The new combustion process has been developed and tested in the laboratory by Professor Ian Metcalfe, Dr Alan Thursfield and colleagues in the School of Chemical Engineering and Advanced Materials at Newcastle University, in collaboration with Dr Kang Li in the Chemical Engineering Department at Imperial College London. The research has been funded by the Engineering and Physical Sciences Research Council (EPSRC).
Details of the research and development project were published on 3 August 2007 simultaneously in two technical publications, Materials World and The Chemical Engineer.

The LSCF tubes look like small, stiff drinking straws and are permeable to oxygen ions (individual oxygen atoms carrying an electrical charge). Crucially, LSCF is also resistant to corrosion and decomposition at typical power station operating temperatures of around 800°C. When air is blown around the outside of the tubes, oxygen passes through the tube wall to the inside, where it combusts with methane gas being pumped through the centre of the tubes. The oxygen-depleted air, which consists mainly of nitrogen, can be returned to the atmosphere with no harmful effects on the environment, while the carbon dioxide can be collected separately from the inside of the tubes after combustion.

An alternative would be to control the flow of air and methane so that only partial combustion took place. This would produce 'synthesis gas', a mixture of carbon monoxide and hydrogen, which can easily be converted into a variety of useful hydrocarbon chemicals.

The tubes of LSCF, which stands for Lanthanum-Strontium-Cobalt-Ferric Oxide, have been tested successfully in the laboratory, and the design is attracting interest from the energy industry. The Newcastle team is now carrying out further tests on the durability of the tubes to confirm its initial finding that they could withstand the conditions inside a power station combustion chamber for a reasonable length of time. Although it has not yet been attempted, it should be possible to assemble a power station combustion chamber from a large number of the tubes, with space between them for air to circulate. In theory the technology could also be applied to coal- and oil-fired power stations, provided the solid and liquid fuels were first converted into gas.
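The two operating modes described above follow directly from standard combustion stoichiometry, which can be written out as a short sketch. The reactions and molar masses are textbook values; the helper names are my own.

```python
# Stoichiometry of the two operating modes described above.
#   Full oxy-combustion:  CH4 + 2 O2   -> CO2 + 2 H2O
#   Partial oxidation:    CH4 + 1/2 O2 -> CO  + 2 H2   (syngas)
# After condensing out the steam, full combustion leaves an
# essentially pure CO2 stream.  Molar masses are standard values.

M_CH4, M_CO2 = 16.04, 44.01  # g/mol

def oxy_combustion(moles_ch4: float) -> dict:
    """Moles of O2 consumed and CO2/H2O produced by full combustion."""
    return {"O2": 2 * moles_ch4, "CO2": moles_ch4, "H2O": 2 * moles_ch4}

def co2_per_kg_methane() -> float:
    """Kilograms of CO2 captured per kilogram of methane burned."""
    return M_CO2 / M_CH4

products = oxy_combustion(1.0)
```

Each kilogram of methane burned this way yields roughly 2.7 kg of CO2, all of it in a capturable stream once the water is condensed out, which is the point of feeding the tubes pure oxygen.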
This operation is simple in theory but would add to the cost and complexity of running a power station. Government statistics suggest that the UK energy industry produces over 200 million tonnes of carbon dioxide per year, which is more than one-third of the country's total carbon dioxide emissions. Professor Metcalfe said: 'The cheapest way to dispose of waste carbon dioxide from combustion is to release it into the atmosphere. We have been doing this since humans first discovered how to make fire.' 'The technology we have developed may provide a viable alternative, although whether it is economical to introduce it will depend largely upon the carbon credit system that Governments operate in the future.'

Provided via EurekAlert!


Nano-boric acid makes motor oil more slippery

One key to saving the environment, improving our economy and reducing our dependence on foreign oil might just be sitting in the medicine cabinet. Scientists at the U.S. Department of Energy's Argonne National Laboratory have begun to combine infinitesimal particles of boric acid (known primarily as a mild antiseptic and eye cleanser) with traditional motor oils in order to improve their lubricity and, by doing so, increase energy efficiency.

Ali Erdemir, senior scientist in Argonne's Energy Systems Division, has spent nearly 20 years investigating the lubricious properties of boric acid. In 1991, he received an R&D 100 award, widely considered the "Oscar of technology," for showing that microscopic particles of boric acid could dramatically reduce friction between automobile engine parts. Metals covered with a boric acid film exhibited coefficients of friction lower than that of Teflon, making Erdemir's films the slickest solids in existence at that time.

"Ali was looking at large, micron-sized particles," said George Fenske, who works alongside Erdemir at Argonne. "He was just sprinkling boric acid onto surfaces." But driven by a conviction that he could fashion boric acid into an even better lubricant, Erdemir continued to chase the ultimate frontier: a perfectly frictionless material. Glimpsing the potential of nanotechnology, Erdemir went smaller, 10 times smaller, and was astonished by the behavior of much thinner boric acid films. "If you can produce or manufacture boric acid at the nanoscale, its properties become even more fantastic," he said. Reducing the size of the particles to as tiny as 50 nanometers in diameter (less than one-thousandth the width of a human hair) solved a number of old problems and opened up a number of new possibilities, Erdemir said.
In previous tests, his team had combined the larger boric acid particles with pure poly-alpha-olefin, the principal ingredient in many synthetic motor oils. While these larger particles dramatically improved the lubricity of the pure oil, within a few weeks gravity had started to separate the mixture. By using smaller particles, Erdemir created a stable suspension of boric acid in the motor oil. In laboratory tests, these new boric acid suspensions have reduced by as much as two-thirds the energy lost through friction as heat.

The implications for fuel economy are not hard to imagine, Erdemir said. "You're easily talking about a four or five percent reduction in fuel consumption," he said. "In a given day, we consume so many millions of barrels of oil, and if you can reduce that number by even one percent, that will have a huge economic impact." Argonne is currently in talks with materials and lubricant manufacturers to bring boric acid technology to market, Erdemir said. While these new additives need to pass a battery of environmental and safety tests, they will probably be available within two years.

In his first experiments with boric acid, Erdemir demonstrated that the compound not only proved an effective lubricant but was also every industrial technologist's dream: it came from naturally abundant minerals, was cheap to manufacture, and posed no health hazards or environmental threats. Boric acid owes its lubricious properties to its unique natural structure. The compound consists of a stack of crystallized layers in which the atoms within each layer adhere tightly to each other. The layers themselves, however, are stacked relatively far apart, so the intermolecular bonds between them, called van der Waals forces, are comparatively weak. When stressed, the compound's layers smear and slide over one another easily, like a strewn deck of playing cards. The strong bonding within each layer prevents direct contact between sliding parts, lowering friction and minimizing wear.
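The jump from "two-thirds less friction loss" to "four or five percent less fuel" can be sanity-checked with one line of arithmetic. The share of fuel energy lost to friction below is an assumed figure for illustration; the article does not state it.

```python
# Rough consistency check of the quoted numbers: if friction in the
# engine and drivetrain consumes ~7% of fuel energy (an assumed
# figure -- the article does not give this share), then cutting
# those losses by two-thirds saves a few percent of total fuel.

FRICTION_SHARE = 0.07       # assumed fraction of fuel energy lost to friction
FRICTION_REDUCTION = 2 / 3  # reduction observed in the laboratory tests

fuel_saving = FRICTION_SHARE * FRICTION_REDUCTION  # fraction of total fuel
```

With a 7% friction share the result lands in the "four or five percent" range Erdemir quotes; a larger assumed share would push the estimate higher.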


Until recently, most of Erdemir's work in boric acid lubrication had been restricted to motor oils, principally because of the relative bulk of the larger particles. The move to the nanoscale, however, has opened up other possible uses of the chemical. Through a simple chemical reaction, nano-boric acid can be transformed into a liquid relative of boric acid that has shown potential to increase fuel lubricity. Using this liquid analog of solid boric acid as a fuel additive on a large scale could greatly benefit the environment, both because it would help to increase fuel efficiency and because it would replace existing fuel lubricants that are potentially harmful to the environment, Erdemir said.

Most fuels, especially diesels, contain some sulfur and other special chemical additives to boost lubricity. When burned, however, some of these additives, along with the sulfur, may cause harmful emissions and acid rain, and the lack of a suitable alternative complicates efforts to cut sulfur content. Substituting liquid boric acid for sulfur-containing additives preserves the health of the car as well as that of the environment: sulfur exhaust gradually coats the surface of a car's catalytic converter, the part that helps to reduce the toxicity of a car's emissions, until the converter becomes so choked with sulfur that it can no longer process any more exhaust.

Even though he has just begun to unleash the potential of boric acid, Erdemir believes that nanoscale synthetic compounds may prove to be even more effective lubricants. "The next step is to use the basic knowledge that we have gained out of this particular compound to come up with more exotic compounds that will work even better," he said.


Life Sciences


Stanford University Medical Center

Flip of genetic switch causes cancers in mice to self-destruct

Killing cancerous tumors isn't easy, as anyone who has suffered through chemotherapy can attest. But a new study in mice shows that switching off a single malfunctioning gene can halt the limitless division of tumor cells and turn them back to the path of their own planned obsolescence. The surprising possibility that a cell's own natural mechanism for ensuring its mortality could be used to vanquish tumors opens the door to a new approach to developing drugs to treat cancer patients, according to Dean Felsher, MD, PhD, at the Stanford University School of Medicine. Felsher is the senior author of the study published in the July 30 online version of the Proceedings of the National Academy of Sciences. "Our research implies that by shutting off a critical cancer gene, tumor cells can realize that they are broken and restore this physiologic fail-safe program," said Felsher.

Cancer can be notoriously resistant to medical treatment. Not only do cancer cells proliferate uncontrollably, they somehow circumvent the mechanism that causes normal cells to die when they get old or malfunction. That makes cancer cells effectively immortal unless doctors manage to squelch them. The gene Felsher's team studied produces a protein called Myc, which promotes cell division. A mutation of the gene causes cells to overproduce the protein, prompting perpetual cell division and tumor growth. By turning off the mutated gene, the researchers found that not only did uncontrolled cell division cease, but the cells also reactivated a normal physiological mechanism, called senescence, that makes it possible for a cell to eventually die. "What was unexpected was just the fact that cancer cells had retained the ability to undergo senescence at all," said Felsher. Cancer researchers had long thought the senescence process had to be irreversibly disrupted for a tumor to develop. The researchers worked with a series of mice engineered to have Myc-triggered cancers of the liver, blood or bones, along with a specially constructed version of the Myc gene that they could switch off by feeding the mice antibiotics. When the mice were fed doses of the drugs, the tumors invariably ceased growing and then diminished, with some disappearing over the course of just a few days. Although Felsher's lab had previously shown that mouse tumors diminished and disappeared when Myc was switched off, they hadn't been sure how the process actually worked. Historically, most research involving genetic methods of battling cancer cells has focused on reactivating genes called tumor-suppressor genes, which are generally overcome by a proliferating cancer. No one had explored the idea that senescence might play a key role in diminishing tumors. Felsher described senescence as acting like a fail-safe mechanism to stop cancer.
When a cell detects a deleterious mutation, it launches the senescence process, resulting in the permanent loss of the cell's ability to proliferate and thus halting any cancer. "In order to become tumor cells, those cells have to overcome senescence," said Chi-Hwa Wu, postdoctoral researcher in Felsher's lab and first author of the study. Wu had the inspiration to explore whether the sudden diminishment they had observed in the tumors might be due to the reactivation of some latent remnant of the trigger for senescence. Through a series of experiments looking at enzymes associated with the senescence process, as well as some molecular markers, Wu confirmed her suspicion. And not only was senescence occurring in cells that had been thought to be incapable of it, the process was reactivated in all the different tumors they studied. Consider it a cell version of the Jekyll-and-Hyde transformation. "It's sort of like Mr. Hyde realizing that there's something wrong with him and then being able to put himself back into his normal state," Felsher said. In addition to the deepened understanding of how the process of senescence works, Felsher and Wu see a lot of potential for new approaches to treating cancer, beyond the traditional tactic of trying to kill cancer cells directly. "This work implies that maybe part of the strategy should involve figuring out how to get the cancer cells to just be allowed to do what they originally wanted to do anyway, which is to not be proliferating endlessly and growing uncontrolled." The next step for the team is to see how well the approach works in human cancer cells. "And we're also trying to figure out what the mechanism is," Felsher said. "What are the molecular mechanisms of this, so that we can figure out how to better treat cancer?"

Provided via EurekAlert!


American Institute of Physics

Electric fields have potential as a cancer treatment

Experiments slow cancer cell division, brain tumor progression

Low-intensity electric fields can disrupt the division of cancer cells and slow the growth of brain tumors, suggest laboratory experiments and a small human trial, raising hopes that electric fields will become a new weapon for stalling the progression of cancer. The research, performed by an international team led by Yoram Palti of the Technion-Israel Institute of Technology in Haifa, is explained in the August issue of Physics Today.

In the studies, the research team uses alternating electric fields that jiggle electrically charged particles in cells back and forth hundreds of thousands of times per second. The electric fields have an intensity of only one or two volts per centimeter. Such low-intensity alternating electric fields were once believed to do nothing significant other than heat cells. However, in several years' worth of experiments, the researchers have shown that the fields disrupt cell division in tumor cells placed on a glass dish (in vitro).

After intensively studying this effect in vitro and in laboratory animals, the researchers started a small human clinical trial to test its cancer-fighting ability. The technique was applied to ten human patients with recurrent glioblastoma multiforme (GBM), a form of brain cancer with a very low survival rate. All the patients had had their earlier tumors treated by other methods, but the cancer had started to recur in all cases. Fitting the patients with electrodes that applied 200 kHz electric fields to the scalp at regular intervals for up to 18 hours per day, the researchers observed that the brain tumors progressed to advanced stages much more slowly than usual (taking a median time of 26 weeks), and sometimes even regressed. The patients also lived considerably longer, with a median survival time of 62 weeks. While no control group existed, the results compared favorably to historical data for recurrent GBM, in which the time for tumor progression is approximately 10 weeks and the typical survival time is 30 weeks. In addition, 3 of the 10 patients were still alive two years after the electrode therapy started. These results were announced in a recent issue of the Proceedings of the National Academy of Sciences. The Physics Today article explains these results in terms of the physical mechanisms that enable the electric fields to affect dividing cancer cells. In vitro, the electric fields were seen to have two effects on the tumor cells. First, they slowed down cell division: cells that ordinarily took less than an hour to divide were still not completely divided after three hours of exposure to a 200 kHz electric field. Another group, consisting of Luca Cucullo, Damir Janigro and their colleagues at the Cleveland Clinic, slowed cell division by applying electric fields with a much lower frequency of just 50 Hz. In addition, this protocol demonstrated the ability to decrease the intrinsic drug resistance of the cells.
What causes cell division to slow down? In the 200-kHz case, the electric fields hamper the formation and function of a key cell structure known as the mitotic spindle. The spindle is composed of cell components known as microtubules. The microtubules in turn contain components that have a high electric dipole moment, meaning a large separation of opposite electric charges. Parts of the mitotic spindle are therefore greatly influenced, and apparently disrupted, by an electric field. The second effect of the 200 kHz fields is that they sometimes disintegrated the daughter cells just before they split off from their partners. The dividing cells sometimes self-destruct because a high-electric-field region develops between the two daughter cells. This leads to a large slope, or gradient, in the electric field from each daughter cell to this region. This gradient may rip organelles (cell structures) and macromolecules (such as proteins) from the scaffolding of the cells. The alternating electric fields are believed to have similar effects in the human glioblastomas. In contrast, the electric-field treatment poses little danger to normal brain tissue, because healthy brain cells do not divide; the electric fields were only observed to have disruptive effects on dividing cells.

Alternating electric fields affect tumor cells by (a) slowing their division time from under one hour to more than three hours. The fields also (b,c) disintegrate cells in the later stages (Image credit: Physics Today)

Based on the success of their initial human study, the researchers are working on another human clinical trial, this time with a control group receiving chemotherapy. The researchers are also investigating the possibility of combining the electric-field therapy with low-dose chemotherapy.
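The field-gradient mechanism invoked above can be made concrete with a back-of-the-envelope calculation. In one dimension, a dipole of moment p sitting in a nonuniform field feels a net force F = p·dE/dx. The numbers below are purely hypothetical, chosen only to illustrate the scaling; the article quotes no dipole moments or field gradients:

```python
# One-dimensional force on an electric dipole in a nonuniform field,
# F = p * dE/dx -- the kind of pull the article invokes for organelles and
# macromolecules caught in the field gradient between dividing daughter cells.
DEBYE = 3.33564e-30  # conversion factor: 1 debye in coulomb-meters

def dipole_force(p_debye, field_gradient):
    """Force in newtons on a dipole of p_debye debye in a gradient of V/m^2."""
    return p_debye * DEBYE * field_gradient

# Hypothetical example: a macromolecule with a 1,000-debye moment in a
# gradient of 1e7 V/m^2 (both numbers are illustrative assumptions):
print(f"{dipole_force(1000, 1e7):.2e} N")  # -> 3.34e-20 N
```

The point of the sketch is only that the disruptive force grows linearly with both the dipole moment and the field gradient, which is why the spindle's strongly polar microtubules are the natural target.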

Provided via EurekAlert!


Weizmann Institute of Science

Weizmann Institute scientists discover a control mechanism for metastasis

Metastasis is the main cause of cancer death. A team of scientists at the Weizmann Institute of Science has now revealed new details about the mechanisms controlling metastasis of breast cancer cells. Their findings, published recently online in Nature Cell Biology, add significantly to the understanding of metastasis and may aid, in the future, in the development of anti-cancer drugs. For a cell such as a cancer cell to migrate, it first must detach itself from neighboring cells and the intercellular material to which it is anchored. Before it can do this, it receives an order from outside the cell saying: 'prepare to move.' This signal takes the form of a substance called a growth factor, which, in addition to controlling movement, can activate a number of processes in the cell, including division and differentiation. The growth factor attaches to a receptor on the cell membrane, initiating a sequence of changes in the cellular structure. The cell’s internal skeleton – an assembly of densely-packed protein fibers – comes apart, and the protein fibers then form thin threads on the outside of the cell membrane that push the cell away from its neighbors. In addition, a number of protein levels change: some proteins get produced in higher quantities and others in lower ones. To understand which proteins are modulated by the growth factor, and the nature of the genetic mechanisms involved in cancer cell migration, a team of researchers pooled their knowledge and resources. The team was headed by Prof. Yosef Yarden of the Weizmann Institute’s Biological Regulation Department and included his research group, Tal Shay, a student in the group of Prof. Eytan Domany of the Physics of Complex Systems Department, and Prof. Gideon Rechavi of the Chaim Sheba Medical Center at Tel Hashomer.

To begin with, the team mapped all of the genetic changes that take place in the cell after the growth factor signal is received. As they sifted through the enormous amount of data they had collected, including details on every protein level that went up or down, one family of proteins stood out. Tensins, as they’re called, are proteins that stabilize the cell structure. But to the scientists' surprise, the levels of one family member rose dramatically while, at the same time, the levels of another dropped. Despite the familial similarity, the team found a significant difference between them. The protein that drops off has two arms: one arm attaches to the protein fibers forming the skeleton, and the other anchors itself to the cell membrane. This action is what stabilizes the cell’s structure. The protein that increases, on the other hand, is made up of one short arm that only attaches to the anchor point on the cell membrane. Rather than providing structural support, this protein acts as a kind of plug, blocking the anchor point and allowing the skeletal protein fibers to unravel into the threads that push the cells apart. The cell is then free to move and, if it’s a cancer cell, to metastasize to a new site in the body. In experiments with genetically engineered cells, the scientists showed that the growth factor directly influences the levels of both proteins, and that these, in turn, control the cells’ ability to migrate. Blocking production of the short tensin protein kept cells in their place, while overproduction of this protein plug increased their migration. Next, the scientists carried out tests on tumor samples taken from around 300 patients with inflammatory breast cancer, a rare but swift and deadly form of the disease that is associated with elevated growth factor levels. The scientists found a strong correlation between high growth factor activity and levels of the 'plug' protein.
High levels of this protein, in turn, were associated with cancer metastasis to the lymph nodes – the first station of migrating cancer cells as they spread to other parts of the body. In another experiment, the scientists examined the effects of drugs that block the growth factor receptors on the cell surface. In patients who received these drugs, the harmful 'plug' proteins had disappeared from the cancer cells. Prof. Yarden: 'The mechanism we identified is clinically important. It can predict the development of metastasis and possibly how the cancer will respond to treatment.' This discovery may, in the future, aid in the development of drugs to prevent or reduce the production of the unwanted protein, and thus prevent metastasis in breast or other cancers.

Provided via EurekAlert!


Florida State University

FSU chemists using light-activated molecules to kill cancer cells

A key challenge facing doctors as they treat patients suffering from cancer or other diseases resulting from genetic mutations is that the drugs at their disposal often don’t discriminate between healthy cells and dangerous ones -- think of the brute-force approach of chemotherapy, for instance. To address this challenge, Florida State University researchers are investigating techniques for using certain molecules that, when exposed to light, will kill only the harmful cells.

Igor V. Alabugin is an associate professor of chemistry and biochemistry at FSU. He specializes in a branch of chemistry known as photochemistry, in which the interactions between atoms, small molecules and light are analyzed. “When one of the two strands of our cellular DNA is broken, intricate cell machinery is mobilized to repair the damage,” he said. “Only because this process is efficient can humans function in an environment full of ultraviolet irradiation, heavy metals and other factors that constantly damage our cells.” However, a cell that sustains so much damage that both DNA strands are broken at the same time eventually will commit suicide -- a process known as apoptosis. “In our research, we’re working on ways to induce apoptosis in cancer cells -- or any cells that have harmful genetic mutations -- by damaging both of their DNA strands,” Alabugin said. “We have found that a group of cancer-killing molecules known as lysine conjugates can identify a damaged spot, or ‘cleavage,’ in a single strand of DNA and then induce cleavage on the DNA strand opposite the damage site. This ‘double cleavage’ of the DNA is very difficult for the cell to repair and typically leads to apoptosis.” What’s more, the lysine conjugates’ cancer-killing properties are manifested only when they are exposed to certain types of light, thus allowing researchers to activate them at exactly the right place and time, when their concentration is high inside of the cancer cells, Alabugin said. “So, for example, doctors treating a patient with an esophageal tumor might first inject the tumor with a drug containing lysine conjugates,” he said. 
“Then they would insert a fiber-optic scope down the patient’s throat to shine light on the affected area.” The light exposure would activate the drug, leading to double-strand DNA damage in the cancerous cells -- and cell death -- for as much as 25 percent to 30 percent of the cells in the tumor, at a rate that rivals in efficiency any of the highly complex and rare DNA-cleaving molecules produced by nature, Alabugin said -- and, perhaps just as importantly, avoids damage to healthy cells. For tumors located deeper within the body, he pointed to other studies showing that a pulsed laser device can be used to penetrate muscle and other tissues, thereby activating the drugs using near-infrared beams of light.

As proof of principle to the idea that lysine conjugates possess anti-cancer activity, Alabugin collaborated with cancer biologist Dr. John A. Copland of the Mayo Clinic College of Medicine in Jacksonville, Fla. In their tests, several of the molecules demonstrated little effect upon cultured cancer cells -- in this case, metastatic human kidney cancer cells -- without light, but upon phototherapy activation killed more than 90 percent of the cancer cells with a single treatment. Future work will include demonstrating anti-cancer activity in an animal model. Successful completion of the preclinical studies then could lead to clinical trials with human patients.

Provided via EurekAlert!


Rice University

New cancer weapon: nuclear nanocapsules

Nanotubes packing powerful alpha-emitters could target lone cancer cells

HOUSTON, Aug. 23, 2007 – Rice University chemists have found a way to package some of nature's most powerful radioactive particles inside DNA-sized tubes of pure carbon -- a method they hope to use to target tiny tumors and even lone leukemia cells.

"There are no FDA-approved cancer therapies that employ alpha-particle radiation," said lead researcher Lon Wilson, professor of chemistry. "Approved therapies that use beta particles are not well-suited for treating cancer at the single-cell level because it takes thousands of beta particles to kill a lone cell. By contrast, cancer cells can be destroyed with just one direct hit from an alpha particle on a cell nucleus." The study's results are available online and slated to appear in an upcoming issue of the journal Small.

In the study, Wilson, Rice graduate student Keith Hartman, University of Washington (UW) radiation oncologist Scott Wilbur and UW research scientist Donald Hamlin, developed and tested a process to load astatine atoms inside short sections of carbon nanotubes. Because astatine is the rarest naturally occurring element on Earth -- with less than a teaspoon estimated to exist in the Earth's crust at any given time -- the research was conducted using astatine created in a UW cyclotron.

Astatine, like radium and uranium, emits alpha particles via radioactive decay. Alpha particles, which contain two protons and two neutrons, are the most massive particles emitted as radiation. They are roughly 7,300 times more massive than the electrons emitted in beta decay -- the type of radiation most commonly used to treat cancer. "It's something like the difference between a cannon shell and a BB," Wilson said. "The extra mass increases the amount of damage alpha particles can inflict on cancer cells." The speed of radioactive particles is also an important factor in medical use. Beta particles travel very fast; this, combined with their small size, gives them significant penetrating power. In cancer treatment, for example, beams of beta particles can be created outside the patient's body and directed at tumors. Alpha particles move much more slowly, and because they are also massive, they have very little penetrating power: they can be stopped by something as flimsy as tissue paper. "The unique combination of low penetrating power and large particle mass makes alpha particles ideal for targeting cancer at the single-cell level," Wilson said. "The difficulty in developing ways to use them to treat cancer has come in finding ways to deliver them quickly and directly to the cancer site."
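The cannon-shell-versus-BB comparison can be checked directly from the particle masses. A minimal sketch using CODATA constants (the values below are standard reference data, not figures from the article):

```python
# Ratio of the alpha particle's mass to the electron's, from CODATA values.
M_ALPHA = 6.6446573450e-27     # alpha particle mass in kg
M_ELECTRON = 9.1093837139e-31  # electron mass in kg

ratio = M_ALPHA / M_ELECTRON
print(round(ratio))  # -> 7294
```

The ratio is what gives the alpha particle its destructive momentum at the single-cell scale, and equally what limits its range to a few cell diameters.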

In prior work, Wilson and colleagues developed techniques to attach antibodies to carbon fullerenes like nanotubes. Antibodies are proteins produced by white blood cells. Each antibody is designed to recognize and bind only with a specific antigen, and doctors have identified a host of cancer-specific antibodies that can be used to kill cancer cells. In follow-up research, Wilson hopes to test the single-celled cancer targeting approach by attaching cancer-specific antibodies to astatine-loaded nanotubes.

One complicating factor in any astatine-based cancer therapy will be the element's short, 7.5-hour half-life. In radioactive decay, the term half-life refers to the time required for half of any quantity of a substance to decay away. Because of astatine's brief half-life, any treatment must be delivered promptly, before the particles lose their potency.
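Exponential decay makes the delivery-time constraint concrete: after each 7.5-hour half-life, half of the remaining astatine is gone. A minimal sketch, taking the article's 7.5-hour figure at face value:

```python
# Fraction of an astatine sample still undecayed after t hours, given the
# ~7.5-hour half-life quoted in the article.
def fraction_remaining(t_hours, half_life_hours=7.5):
    return 0.5 ** (t_hours / half_life_hours)

print(round(fraction_remaining(7.5), 3))   # -> 0.5 (one half-life)
print(round(fraction_remaining(24.0), 3))  # -> 0.109 (about 11% left after a day)
```

In other words, a dose prepared at the cyclotron loses nearly 90 percent of its activity within a day, which is why loading the nanotubes and delivering them must be fast.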

Provided via EurekAlert!


Massachusetts Institute of Technology

MIT creates 3-D images of living cell

A new imaging technique developed at MIT has allowed scientists to create the first 3D images of a living cell, using a method similar to the X-ray CT scans doctors use to see inside the body.

The technique, described in a paper published in the Aug. 12 online edition of Nature Methods, could be used to produce the most detailed images yet of what goes on inside a living cell without the help of fluorescent markers or other externally added contrast agents, said Michael Feld, director of MIT's George R. Harrison Spectroscopy Laboratory and a professor of physics. “Accomplishing this has been my dream, and a goal of our laboratory, for several years,” said Feld, senior author of the paper. “For the first time the functional activities of living cells can be studied in their native state.” Using the new technique, his team has created three-dimensional images of cervical cancer cells, showing internal cell structures. They've also imaged C. elegans, a small worm, as well as several other cell types.

The researchers based their technique on the same concept used to create three-dimensional CT (computed tomography) images of the human body, which allow doctors to diagnose and treat medical conditions. CT images are generated by combining a series of two-dimensional X-ray images taken as the X-ray source rotates around the object. “You can reconstruct a 3D representation of an object from multiple images taken from multiple directions,” said Wonshik Choi, lead author of the paper and a Spectroscopy Laboratory postdoctoral associate. Cells don't absorb much visible light, so the researchers instead created their images by taking advantage of a property known as refractive index. Every material has a well-defined refractive index, which is a measure of how much the speed of light is reduced as it passes through the material. The higher the index, the slower the light travels. The researchers made their measurements using a technique known as interferometry, in which a light wave passing through a cell is compared with a reference wave that doesn't pass through it. A 2D image containing information about refractive index is thus obtained. To create a 3D image, the researchers combined 100 two-dimensional images taken from different angles. The resulting images are essentially 3D maps of the refractive index of the cell's organelles. The entire process took about 10 seconds, but the researchers recently reduced this time to 0.1 seconds. The team's image of a cervical cancer cell reveals the cell nucleus, the nucleolus and a number of smaller organelles in the cytoplasm. The researchers are currently in the process of better characterizing these organelles by combining the technique with fluorescence microscopy and other techniques.
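The interferometric measurement at the heart of the technique reduces to an optical path difference: light crossing a region whose refractive index exceeds the surroundings by Δn over a thickness d is delayed by Δn·d, i.e. by a phase of 2πΔn·d/λ. A minimal sketch with illustrative numbers (the index contrast and cell thickness below are assumptions for the example, not values from the paper):

```python
import math

# Phase delay (radians) of light crossing a transparent object, relative to
# the reference wave in an interferometer: phi = 2*pi * delta_n * d / lambda.
def phase_delay(delta_n, thickness_m, wavelength_m):
    return 2 * math.pi * delta_n * thickness_m / wavelength_m

# A 10-micrometer-thick cell with an index contrast of 0.03, probed at 633 nm
# (all three numbers are illustrative assumptions):
print(round(phase_delay(0.03, 10e-6, 633e-9), 2))  # -> 2.98 radians
```

Each 2D interferogram thus encodes the index-weighted thickness along one viewing direction; combining 100 such views from different angles is what lets the tomographic reconstruction recover the full 3D index map.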

“One key advantage of the new technique is that it can be used to study live cells without any preparation,” said Kamran Badizadegan, principal research scientist in the Spectroscopy Laboratory and assistant professor of pathology at Harvard Medical School, and one of the authors of the paper. With essentially all other 3D imaging techniques, the samples must be fixed with chemicals, frozen, stained with dyes, metallized or otherwise processed to provide detailed structural information. “When you fix the cells, you can't look at their movements, and when you add external contrast agents you can never be sure that you haven't somehow interfered with normal cellular function,” said Badizadegan. The current resolution of the new technique is about 500 nanometers, but the team is working on improving it. “We are confident that we can attain 150 nanometers, and perhaps higher resolution is possible,” Feld said. “We expect this new technique to serve as a complement to electron microscopy, which has a resolution of approximately 10 nanometers.”

Provided via EurekAlert!