Immune cells attracted to nooks in this porous, biodegradable polymer implant are stimulated to attack cancer. Credit: Omar Ali
1. A polymer implant signals cells to combat cancer. This was already covered at Nextbigfuture in January 2009, but we now have a photo and more information, and it is an important development to track.
The research was published as Infection-mimicking materials to program dendritic cells in situ in Nature Materials.
A new implant attracts immune cells and exposes them to molecules that stimulate them to attack cancerous tumors. When tested in mice that normally die of cancer within 25 days, the implants allowed 90 percent of the mice to survive. Similar experimental therapies based on transplanting immune cells are only about 60 percent effective.
Why it matters: The implants could eventually be used to treat human cancers that don’t respond to other therapies, and they could also be used to treat immune disorders such as type 1 diabetes and arthritis. Other approaches that involve stimulating immune cells haven’t proved successful in clinical trials. Those techniques require the cells to be removed from the body and then reimplanted; many are damaged in the process and die, while survivors often fail to trigger attacks on cancerous tumors. The new implant stimulates cells inside the body, without subjecting them to stressful procedures.
2. Microscopists have given aberration-corrected transmission electron microscopy the power to reveal atomic structures with unprecedented precision. It is now up to materials scientists to use this power for extracting physical properties from microscopic atomic arrangements.
Why are the images that appeared in the literature for so long not necessarily atomically resolved? Resolving a structure with atomic resolution means that the information must be entirely local on the atomic level. Any change in the position or occupancy of an atomic site in the sample must show up in the image as an individual signal localized only at the corresponding atomic position. In this stringent sense, apart from a few favourable cases, only the images obtainable in modern aberration-corrected instruments match these standards. The concept is illustrated below.
a–c, Images corresponding to different widths of aperture in the back focal plane (f) of the microscope’s objective lens selecting the beams that contribute to image formation. In the sequence a to c, the sample contains a microhole consisting of an empty atom column. With the smallest aperture (the full circle in f), only the transmitted and four diffracted beams are used to form the image. This situation corresponds to classical uncorrected electron microscopy where wider apertures cannot be tolerated because of the effect of lens aberrations. Superficially, a appears to show an atomic lattice, but this bears little resemblance to the real lattice imaged in c with the widest aperture setting (dashed line in f). Atomic resolution is only provided in c. Note the artefacts in b taken with the intermediate aperture size (dash-dotted line in f). Two ‘atomic maxima’ appear on both sides of the hole. In a the hole appears, unrealistically, to be a more intense maximum. Images d and e, corresponding to the smallest and the largest aperture size, respectively, show simulations for a 30% reduction in occupancy of the atomic position marked with an arrow in e. To reveal this reduction in occupancy, the large aperture used in an aberration-corrected instrument is essential.
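The aperture effect described in the caption can be mimicked in one dimension with a Fourier low-pass filter. The sketch below is purely illustrative and is not from the paper: a lattice of Gaussian "atom" peaks with one column missing is "imaged" by keeping only the spatial frequencies inside an aperture. With a narrow aperture (transmitted beam plus first diffraction order, as in classical uncorrected microscopy) the missing column is smeared and partially filled in; with a wide aperture it shows up as a sharp, local absence. All numbers (lattice spacing, peak width, aperture radii) are assumptions chosen for the demo.

```python
import numpy as np

# Toy 1-D model: a lattice of Gaussian "atom" peaks with one column
# missing, imaged by keeping only frequencies inside an aperture.
# All parameters are illustrative, not taken from the paper.
N, spacing, width = 256, 16, 1.2
hole = 120                                   # position of the empty column
x = np.arange(N)
signal = np.zeros(N)
for p in range(8, N, spacing):
    if p != hole:
        signal += np.exp(-(x - p) ** 2 / (2 * width ** 2))

def image(aperture):
    """Low-pass the signal: keep only frequency bins inside the aperture."""
    spectrum = np.fft.fft(signal)
    k = np.minimum(np.arange(N), N - np.arange(N))   # symmetric bin index
    spectrum[k > aperture] = 0.0
    return np.fft.ifft(spectrum).real

narrow = image(17)    # transmitted beam + first diffraction order only
wide = image(120)     # wide aperture, as in an aberration-corrected scope

atom = hole - spacing  # a real atom column next to the hole
print(f"narrow aperture, hole/atom contrast: {narrow[hole] / narrow[atom]:.2f}")
print(f"wide aperture,   hole/atom contrast: {wide[hole] / wide[atom]:.2f}")
```

Under the narrow aperture the vacancy's signature, which lives in the frequencies between the diffraction spots, is cut off, so the "image" no longer reports the local structure faithfully.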
An investigation using Gaussian regression analyses revealed that such position measurements can be carried out with a precision of better than 5 picometres (pm) (at a 95% confidence level). Such precision is far superior to that of any other microscopic technique, including the scanning transmission electron microscope or even the scanning tunnelling microscope. The standard objection to such extremely high precision is that it cannot be achieved with a microscope whose Rayleigh (point) resolution is 70 pm. However, resolution and precision are two distinct physical quantities. Whereas resolution is defined as the minimum separation at which two optically broadened intensity peaks can just be distinguished in the image, the distance between two well-isolated peaks, each fitted for example by a two-dimensional Gaussian, can be measured with a precision more than an order of magnitude higher. It would be a real pity to turn down the extraordinary opportunities offered by aberration-corrected electron microscopy because of this unfortunately common misunderstanding.
3. New algorithms double flash capacity without shrinking transistor size.
Researchers at Toshiba and SanDisk, a maker of flash memory devices in Milpitas, CA, have built a 64-gigabit chip that holds four bits of data per memory cell, twice as much as the cells in conventional chips. The companies expect the chips to go into production within the first half of 2009.
Why it matters: To increase the amount of data that can be stored in memory chips, engineers typically shrink the transistors that make up the individual memory cells. However, as transistors get smaller, their reliability tends to decrease because they generate more heat and leak more electrical current. While SanDisk researchers are still exploring ways to make transistors smaller without compromising reliability, the new approach makes it possible to store more data without shrinking transistors.
4. Thomas Edison famously said that genius is 1 percent inspiration and 99 percent perspiration. This ratio is being radically shifted by advanced automation and machine learning.
Tools such as 3-D modeling and digital prototyping have already taken much of the grunt work out of invention. With the advent of genetic programming and other machine learning techniques, however, software stands poised to take over higher-level aspects of invention as well; these “genies” could conjure up products on the basis of nothing more than general human “wishes.” The U.S. Patent and Trademark Office has already granted patents to products designed largely by software; in his book, Plotkin cites the Oral-B CrossAction Toothbrush as an example.
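The evolutionary search underlying these "genies" can be illustrated with a toy genetic algorithm. In the hedged sketch below, candidate "designs" are bitstrings, the human "wish" is a fitness function, and selection, crossover and mutation iteratively improve the population; real genetic programming of the kind used for products like the CrossAction toothbrush evolves program or design trees rather than bitstrings, and all parameters here are illustrative.

```python
import random

random.seed(1)

# Toy genetic algorithm: the "wish" is encoded as a fitness function
# (here: similarity to a target bitstring), and evolution does the
# perspiration. Real genetic programming evolves program/design trees.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0]

def fitness(ind):
    return sum(a == b for a, b in zip(ind, TARGET))

def evolve(pop_size=40, generations=200, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break                              # wish fulfilled
        survivors = pop[: pop_size // 2]       # selection: keep top half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TARGET))
            child = a[:cut] + b[cut:]          # crossover
            child = [1 - bit if random.random() < mutation_rate else bit
                     for bit in child]         # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", len(TARGET), "wish bits satisfied")
```

The point of the toy is the division of labor: the human specifies only the goal (the fitness function), and the search procedure generates and refines candidate solutions without further guidance.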
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.