Pages

2018-02-18

Intersection of robotics and nanotechnology enables targeted killing of cancerous tumors

Chinese-American nanoscience made a Great Leap Forward through the collaborative effort of the National Center for Nanoscience and Technology (NCNST) in Beijing and the Center for Molecular Design and Biomimetics at Arizona State University’s Biodesign Institute: in a first in vivo murine study, autonomous nanorobots proved to be intelligent delivery vehicles capable of causing complete tumor regression within a few days. The DNA nanorobots exemplify the novel drug delivery methods that have always been a fundamental strength of nanotechnology: a thrombin-loaded DNA sheet is programmed to fold in on itself like origami and then, upon encountering a molecular trigger, to deploy its thrombin payload at the targeted site like a tiny machine. By delivering thrombin into tumor-associated blood vessels, the nanorobots induced clotting that cut off the tumor’s blood supply within 24 hours, causing tumor cell shrinkage and necrosis. Most notably, clotting did not occur in healthy tissues outside the programmed targets. Significantly, in a control study of side effects in pigs, healthy tissues likewise remained unaffected. Once fully tested and developed for human use, the technology could obviate the need for most chemotherapy regimens as well as for targeted drugs, because eliminating the blood supply of tumor cells alone yields far more precise results.

Now for the real hurdle: overcoming opposition to approval for human use by vested interests in the multi-billion-dollar chemotherapy and radiation therapy industry. Luckily, and quite significantly, this technology did not originate in Lobbyland, and following very recent reforms of the Chinese drug and device approval process, chances are that Chinese approval of nanorobot therapy will come far faster, securing East Asia’s foothold in the future of cancer therapy. That would, of course, happen not a moment too soon, given the explosion of cancer rates in China, largely due to severe carcinogenic environmental pollution in heavily industrialized parts of the country that already experience a wide array of consequences of limited environmental regulation, held back in favor of rapid and profitable industrialization. But the more interesting observation is that forum shopping to defeat inefficient bureaucracies is gaining ground in science, technology and startup environments, just as it did in litigation, taxation, treaty shopping and many other areas: market players vote with their feet on the quality, efficiency and stimulative effects of regulation, and pass value judgment on its overall utility.

2018-02-01

Going against the grain

Philippe Legrain is one of the most attractive independent thinkers at the LSE – one might label him a contrarian with at least some good cause – in the populist universe that gave us Brexit. What I find most amusing is that his almost trivial economics tends to leave opponents with next to no plausible logic to contradict him. His recent scholarship expounds richly on the obvious: why immigration boosts GDP. Only in an era of mass stultification through grotesquely emotionalized nationalist arguments, pitched to a base blissfully unaware of the lessons of history and without comprehensible basis in economic fact, could this sustain a “controversial” intellectual agenda over the years, as it has in Legrain’s case while he argued for a “European Spring.”


Now, it is hardly a revolutionary idea that doing the right thing from a humanitarian perspective turns out to yield dividends and is actually good business. What is a welcome change, however, is to voice such substantiated Merkelism at a time of Realpolitik, when the pendulum of irrational fears and anger swings in the opposite direction in so many places.


In Refugees Work: A Humanitarian Investment That Yields Economic Dividends, a paper that appears to be the first comprehensive international study of its kind, Legrain shows how refugees contribute to advanced economies, and finds that they double the host society’s investment in them over a period of just five years. Would that the same could be said about each dollar invested in aging natives. Refugees contribute economically as workers of all skill levels, entrepreneurs, innovators, taxpayers, consumers and investors. The “diversity dividend” is, in fact, remarkable: more than three in four patents filed in 2011 at the top ten patent-generating U.S. universities had at least one foreign-born inventor, while in Britain, migrants were found to be almost twice as likely to start a business as locals. The most entrepreneurial migrants in Australia are refugees. One-third of recent refugees in Sweden are college graduates, and two-thirds of those have skills that match current graduate job vacancies.
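A back-of-the-envelope check of Legrain’s headline figure: doubling the host society’s investment over five years implies an annualized return of roughly 15 percent. The sketch below is my arithmetic, not a number from the paper:

```python
# Implied annualized return if an investment doubles in five years:
# solve (1 + r)^5 = 2 for r. This is illustrative arithmetic only.
implied_annual_return = 2 ** (1 / 5) - 1

print(f"{implied_annual_return:.1%}")  # ~14.9%
```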


While it would certainly appear that Legrain falls victim to his own propaganda and pitches his findings as actual solutions, without applying the same strict scrutiny to the cost side of his cost-benefit analysis, there is no question that many if not all of the most successful synthetic societies in modern history leveraged economic contributions made by refugees: the United States, Australia, New Zealand, Canada, Israel, but also Germany and several South American nations come to mind. In turn, refusal to quickly integrate and invest in refugees has proved disastrous in the case of Arab nations with regard to Palestinians, and across Africa with its numerous displacements in the wake of seismic political shifts. Yet even prolific generators of refugees can end up beneficiaries: remittances from abroad to Liberia, for example, amount to 18.5% of its GDP. Some of Legrain’s findings are fallacious on their face, or, rather, reflect trivialities: for example, the “discovery” that recognition of foreign qualifications ought to be streamlined, since it costs only £25,000 to train a refugee doctor to practice in the U.K., while it costs over £250,000 to “mint” a new British physician. If only social engineering were that simple, and could afford to ignore blowback from professions and market segments facing competition from immigrants. Refugees are not, and cannot be, the sole priority and consideration in balancing social interests.


Among the interesting findings of Legrain’s study is that the U.S. is more successful than the EU at getting refugees to work: its greater initial investment results in a rate of employment higher than that of people born in the U.S., with earnings rising sharply over time while reliance on social assistance declines rapidly. If, as is plausible, the first priority should be to get refugees to work early, then granting asylum seekers the right to work while their claims are being reviewed, as is done in Canada or Sweden, but not in the U.S., is an act of simple pragmatism, not of principle. Legrain is right that policy ought to combine the active assistance of the Swedish model with the job and entrepreneurial opportunities of U.S. practice. Refugees ought to be resettled where there are jobs, not in areas where cheap housing is available but jobs aren’t, and the same is true of rigid labor markets privileging insiders at the expense of outsiders, not to mention stifling entrepreneurship. While government assistance for refugees ought to be generous, prompt and wide-ranging initially, open-ended welfare provisions not only create a moral hazard but also have, on balance, a negative impact. Serious analytic examination of the economic benefits of diversity, initiated inter alia by Legrain, is far from over – it has barely begun to demonstrate its predictable wealth of results.

2018-01-13

The Thiel Fellowship vs. the virtuoso principle in professional education

Thoughts about contrarian tipping points for conventional wisdom

Citius, altius, fortius – faster, higher, stronger. The venerable Olympic motto has left the sports arena and turned us all into valiant gladiators. It is a race we are inevitably bound to lose – against machines and cyborgs as much as against human mutants who undisputedly excel at quantitative criteria but have contributed precious little to the qualitative quantum leaps that make up the ultimate sum total of evolution.

At the same time, we are witnessing the unprecedented olympification of all areas of life. The day cannot be far when every aspect of our existence, everyone and everything with which we come in contact, will be ‘rated.’ From credit ratings to U.S. News & World Report to Zagat, it is the equivalent of cameras in the courtroom: people start acting for the camera, and for the ranking, to the detriment of any other priority or independent substantive consideration. I am not saying it is all bad – witness body cameras for police, and think of transparency in public companies – but, once a tipping point is reached, it does create powerful opportunities for contrarian approaches: think leveraged buyouts and private equity, think investment in distressed assets, think off-balance-sheet items, think emphasizing criteria the jury didn’t. It is the very essence of market forces, as opposed to Marxist planned economies, to bestow rewards for what has not been planned for, what went unpredicted and, hence, unrated.

I challenge you to name a dozen Wunderkinder, child or youthful prodigies who later won significant recognition or even just awards for lifetime achievement. There was Mozart, yes. Now, keep going!

It would just be willful blindness to overlook the evidence: our time is witnessing a spectacular rise in extremely successful college dropouts. In fact, a striking share of billionaires falls into that category – whatever value one might wish to accord a high net worth, they must be doing something right. Think of Bill Gates, Steve Jobs, Larry Ellison, Frank Lloyd Wright, Buckminster Fuller, James Cameron, Mark Zuckerberg, Tom Hanks, Harrison Ford, Lady Gaga, Tiger Woods, Roman Abramovich, Ben Affleck, André Agassi, Christina Aguilera, Paul Allen, Woody Allen, Brooke Astor (AND her husband!). Why should I get carpal tunnel syndrome typing honorable mentions for the legions of those sore “underachievers” when you can read the “hall of shame” of college dropouts all by yourself?

The rest of us make up the ever-growing pool of wannabe Olympians – measured by purely quantitative criteria, because these are so wonderfully indisputable. Yes, they are indisputable because they are also almost completely meaningless, and it is, after all, impossible to argue persuasively with nonsense. Our standardized tests relate to testing relevant aptitude much as the ability to jump 5 yards would predict fitness for admission to a police academy: something just tells you that neither Sherlock Holmes nor Inspector Jacques Clouseau was capable of doing it – or needed to. But it is such a fair, objective, and, above all, non-discriminatory criterion! Who cares if Sherlock Holmes doesn’t make the cut-off: we know in our guts that, just like Jessica Fletcher, he will ultimately not need a police badge to solve the darndest cases. It gets a little trickier in licensed professions, though. What distorts these tests is that, because of the substantial numbers of test-takers, a good part will excel both at relevant and at (the painstakingly measured) irrelevant criteria. Of course, it is the former that the test administrators and their clients will shamelessly claim credit for.

Standardized tests, the panacea of “selective institutions of higher learning,” have shunted deep and slow thinkers aside to make way for those who excel at quick access to memorized data and at measurable, standardized solutions. Were Galileo’s insights or Newton’s apple measurable? That’s not to say a lawyer or a doctor ought not be able to think quickly on her feet, but only very rarely does it matter. The other times, it matters mostly for quantitative productivity: that is to say, it indicates one’s ability to resolve déjà vu problems quickly and with minimal reflection. Speed is an entirely useless criterion in the quest for resolving novel, cutting-edge challenges. But then, junior-level professionals are not even expected to tinker with those, and by the time they reach a level where actual thinking may be not only permitted but required as part of their job description, they can be trusted to have learned all the tricks of simulating creativity.

There is another consequence to standardized tests, and it has thoroughly corrupted higher education: no matter how cleverly they are designed, all standardized tests inevitably become “learnable.” While this gives rise to profitable cottage industries concerned with test-prepping, it once again skews test results in favor of those able to afford it. If only learning for the test overlapped with learning for life – but it doesn’t. There is, quite simply, no utility whatsoever in one’s ability to retain, for a few weeks at most, inordinate volumes of arcane SAT or GRE vocabulary otherwise ably provided by Merriam-Webster. That, or speed-reading, is how we select who should be granted a license to grow their mind? Really?

And this is also exactly why financially quantifiable success based on qualitative innovation has for some time been the domain of the college dropout. The very same principle applies, mutatis mutandis, to professionals who, by implication of our licensing laws, cannot have dropped out of college but have faced a very similar dilemma at the level of graduate and professional schools.

Jobs that enable one to repay one’s educational loans within a tolerable time and with tolerable lifestyle compromises are restricted to graduates of roughly a dozen professional schools nationwide in each field. That’s where employers of a certain caliber recruit – with a few notable but statistically not terribly significant exceptions. The rest cannot find jobs that will keep them out of quasi-poverty after debt service. How do we identify these gatekeeper schools? Quite easily: they all come ranked, mostly by U.S. News & World Report (and a few others whose raison d’être rests on not being U.S. News and on offering alternative but rarely more meaningful ranking criteria). Oh, ranking! What a genius business model, based on a delusional distortion of the Olympic motto. The self-fulfilling prophecy of yet another substantively unsupportable racket.

It is also the reason why an unprecedented cult of youth has taken hold, ignoring the fact that excellence is not confined to the 18-28 age bracket. But the combination of acceptable debt service and retirement savings is.

Once again, all these conclusions point us to a search for the road less traveled: because the beaten path holds very little promise, except an early death by hamster wheel.

If you add to that the fact that many employers of licensed junior professionals have concluded that professional schools have taught their entry-level arrivals few things of value, this, in essence, reduces their academic diplomas to an expensive (and meaningless) kind of intelligence test that comes with decades-long indebtedness. What follows this exercise is another decade of sorely needed professional training, this time under practical mentors who set real-world priorities, i.e., not “philosophy of…”, “doctrine of…”, “concept of….” but actual solutions that hold up to the tempest of practical challenges.

Well, one thoroughly fascinating alternative concept is the Thiel Fellowship. Whatever one may choose to think of the political and other meanderings of its founder, it is what only a contrarian could come up with. Not concerned with speed-reading, paper chases or other quantitatively measurable academic caprioles that standardized tests are so fond of, it provides fellows with $100,000 in grant money for dropping out of college and pursuing real-life implementation of an idea judged worthy of the effort and sacrifice. Peter Thiel, one of the most creative entrepreneurs of recent years, a chess master, co-founder of PayPal and early funder of Facebook, will strike many as a living contradiction of his own ideas, since he is, after all, a graduate of Stanford (a B.A. in Philosophy and a J.D. from Stanford Law School). But I see no contradiction here at all – rather a need for vigorous generalization. Under Thiel’s approach, no ranking of the outcomes is necessary. The thing speaks for itself, in the language of Justice Potter Stewart: we know success when we see it. No need to make presumptuous ‘decisions’ on whether robotics, biotech or the next Big Data app is more important for the future of a generation about to be spawned. Let’s not forget, after all, what ranking has really become about: the nod of the ranker – and no other merit, whatever effort is made to cloak subjective outcomes in a costume of objectivity.

While a junior assistant, paper-pusher or suitcase-bearer might well be judged by his or her “efficiency” at menial practical tasks that lend themselves to quantitative scoring, that standard should not apply to an employer that has the development of people’s significant potential on its agenda.

It took Einstein seven years to incubate Special Relativity and then ten more for General Relativity. It is fair to assume that no scientific funding organization would have patiently supported this process with grants (and, in fact, none did – long-haired Albert supported himself working at the patent office in Bern, passed over for ‘promotion’ twice “until he fully mastered machine technology”).

Yes. I know. Not every seven years spent brooding over a bizarre idea will result in anything even vaguely resembling greatness. Ask any PhD in Mathematics. In fact, the percentage of this happening will be lousier than the survival rates of start-up companies. It is equally true that there are few alternatives to such brooding other than a DARPA-like approach.

Part of the regulatory madness that makes taking companies private so attractive is the insanity of measuring strategic management performance by quarterly reports and prospectuses. A systemic prescription for failure and abuse? You bet. But one devilishly perplexing temptation is inherent in quantitative scoring that has even moderate acceptance: controversies and justified criticism aside, right or wrong, numbers will be relied upon for selection and allocation processes, only because they are so easy to use, and, more importantly, ‘because they are there.’

To force an individualized, qualitative, subjective look at the individual or decision in question would require considerably greater expense of time and depth of analysis and comparison. Besides, it would invariably give rise to challenges on grounds of fairness, comparability, discrimination and similar objections – simply because such obvious if irrelevant points of attack are inherent in any non-quantitative review. But do markets care what’s ‘fair’ or ‘objective’? Back to Sherlock Holmes, Jacques Clouseau, or Jessica Fletcher: res ipsa loquitur and nothing else.

Just as religions can be shown to have been molded, if not outright designed, to suit the interests of clerics rather than of their respective flocks, the stark realities of higher education beg a similar question: are substance and processes designed at all to serve the interests not only of students but of prospective employers – whose overarching interest must obviously be a maximum of innovative, creative and productive staffers?

Take law school as a classic example: past a single year of common core – the elementary foundational classes – it matters not whether there is any utility to class selection in the following years. On the contrary, giving maximum weight to speculative, theoretical and abstract classes is encouraged by faculty who argue that this is students’ ‘last opportunity to acquaint themselves with questions of a philosophical nature, not dictated by utility.’ A quick aside – does real life reward anything at all that does not rise to standards of utility? When did it ever? But it is, of course, so much easier to teach an abstract class of notions, concepts, and obscure yet controversial doctrines than to present something that bears the almost objectionable aura of black-letter law. Employers have long given up on expecting graduates to have learned useful things in law school beyond the mere ability to think, and are entirely prepared to start what they consider meaningful education from zero once post-sheepskin reality settles in.

Today, most higher education is a compelling example of systemic failure, especially when one takes into account the total cost to society of its irrational, inaccurate and inefficient selection mechanisms. At a national credit risk in excess of $1 trillion, student debt has become the next perfectly foreseeable financial albatross, just as S&L guarantees, the Dot-Com bubble and subprime mortgages were – long before evidence thereof was considered ‘reliable’ enough by special interests that favored the status quo. Was each an unforeseeable, unpredictable crisis? Permit me not to comment. What happened was, in fact, inevitable – only its exact timing remained somewhat uncertain.

A functional – although perhaps not formal – equivalent of the Thiel Fellowship will be of utmost importance for the future of professional education. Almost alone among civilized nations, the U.S. affords the incomprehensible luxury of requiring professional education to begin at the graduate rather than at the undergraduate level. It cannot be the purpose of tertiary education to make up for the sometimes indisputable weaknesses of secondary education by mandating an extremely costly four extra years rather than weeding out its victims by aggressively substantive admissions testing. How completing an arcane and often entirely unrelated undergraduate major makes for better lawyers may be an argument for another time and place. However, any meaningful incentive that causes effects similar to those encouraged by the Thiel Fellowship has to find very fertile ground. It would revolutionize professional education as we know it, by stripping it of trappings that matter nowhere outside of academia and that are a direct violation of carpe diem from the macroeconomic perspective of allocating national resources where they yield their most significant returns.

2017-12-01

Winning Formula M

Whoever thought that Bernie Ecclestone’s departure from Formula One would be the end of an era had absolutely zero perspective on nanotechnology: there’s plenty of room at the bottom, as nanotech’s earliest and, to this day, greatest visionary Richard Feynman declared at Caltech in 1959 – at a time when not only many members of today’s six nano-racing teams but also their academic mentors had not yet been born.

For there is an emergent dimension of molecular racing, and its first event took place in Toulouse, on a Grand Prix race track created by CEMES, the Centre d’Élaboration de Matériaux et d’Etudes Structurales of the Centre National de la Recherche Scientifique, on April 28-29, 2017. It is not an experimental playing field for silly geeks who never outgrew their dad’s garage: just as video games produced considerable collateral benefits for the “real world” (think drone technologies, or remote surgery), the purpose of molecular machines is to eventually perform useful work on an atomic scale. Nanorobots will be able to manipulate individual atoms or molecules, manufacture amazing new materials, or transport substances in a heretofore unknown targeted fashion.

The six race cars starting in Toulouse were 1-2 nm in size and had to be propelled by minimal electric impulses along a defined track of 150 nm that included two chicanes with 120-degree angles. The winning team completed the course in 90 minutes, temporarily reaching speeds of up to 300 nm per hour – an amazing pace in light of the fact that the organizers had set 18 hours, not 90 minutes, as the performance limit.
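The reported figures can be sanity-checked with a few lines of arithmetic (a sketch; the numbers are those quoted above, the variable names are mine):

```python
# Sanity-check the reported NanoCar Race figures.
track_length_nm = 150        # defined track, including two chicanes
winning_time_h = 90 / 60     # 90 minutes
organizers_limit_h = 18      # performance limit assumed by the organizers

avg_speed = track_length_nm / winning_time_h          # average nm per hour
limit_speed = track_length_nm / organizers_limit_h    # pace implied by the limit

print(f"winning average speed: {avg_speed:.0f} nm/h")        # 100 nm/h
print(f"organizers' implied pace: {limit_speed:.1f} nm/h")   # 8.3 nm/h
print(f"speed-up over the assumed limit: {avg_speed / limit_speed:.0f}x")  # 12x
```

The 100 nm/h average sits comfortably below the peak of 300 nm/h quoted above, as one would expect for a course with chicanes.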

The winner was neither Mercedes nor Ferrari but a joint venture between Rice University in Houston, TX, and the University of Graz, Austria. Over the two racing days, they moved their cars by scanning tunneling microscope over a distance of 1,000 nm, or 1 micrometer. The runner-up mustered only about one-sixth of their speed. Division of labor proved useful: while the team at Rice produced the molecules, the team at Graz focused on “training,” i.e., examining them under the scanning tunneling microscope, observing their motion, and “driving” them in the race. While the Rice chemists benefited from decades of experience in producing an exceptionally well-suited molecule, Graz advanced the mechanics of manipulation by optimizing the interaction between molecule and “road” surface. Among the sponsors of competing teams were also three major automotive manufacturers: Peugeot, Toyota and Volkswagen. The rules permitted molecular race cars to be propelled (i) by tunneling electrons passing through them; (ii) by light; or (iii) by nano-magnetic effects. Molecule cars were allowed to be manipulated mechanically to reach the two gold adatoms of the starting line. While it may take a while for nano-race cars to command a significant TV following and commercial ratings, part of their fascination is that the action is, indeed, observable – and it will pave the way for infinitely greater sophistication, as development times may shrink at a speed comparable to miniaturization itself.

2017-11-01

Edible Nanofibers: A Substantially More Cost-Effective Delivery System for Iron

Iron deficiency anemia is a growing public health problem, above all in developing countries. About 2 billion individuals, or about 30 percent of the world’s population, are anemic, many due to iron deficiency, which can lead to developmental disorders in children. The pharmaceutical industry has developed a range of products for the iron fortification of foods, to provide the body with the element central to oxygen transport through the bloodstream. But cost, bioavailability and other factors impose limitations. Ferrous sulfate, currently a standard treatment for humans, changes the color and taste of foodstuffs in undesirable ways and shows harmful side effects.

Recent nanotechnology research at ETH Zurich pursued a different avenue: edible nanofibers based on whey protein are loaded with iron nanoparticles and delivered as a food supplement in liquid or powder form. Experiments with rats showed that stomach enzymes dissolve the whey protein nanofibers completely, while the acidic conditions of the stomach release the load of iron nanoparticles with greatly improved bioavailability. The study also tested for the risk of harmful accumulation of nanofibers or nanoparticles within the organism, since whey protein fibers had never before been used in foodstuffs; not a single indication of accumulation or changes in organs was found. That notwithstanding, additional studies will be needed to establish safety for human consumption. Cost-benefit analysis measuring bioavailability and resorption of the iron nanoparticles shows a substantial improvement in nutritional supplementation at a particularly low cost of ingredients and manufacturing processes.

2017-10-01

Nanovibrations: Controlling the Material Properties of Crystals

Salts and comparable structures consist of ions. In the solid state, they form ion lattices. Vibrations of ion crystals account, inter alia, for heat transfer and the propagation of sound. Controlling these vibrations may enable control of the material properties of crystals. Conceivable innovative applications include thermoelectric nanoelements that could render human body heat usable as a power supply for wearable electronic devices, or wafer-thin structures to create sound-proof chambers.

Now, a collaboration between Rutgers University, a center of nanobiology, and the University of Graz, Austria, relying on latest-generation electron microscopy, has made ion lattice vibrations visible for the first time at ultra-high atomic and energetic resolution in both the spatial and the spectral dimension, surpassing earlier experiments. The ion lattice of a crystal nanocube was made to vibrate by targeting it with an electron beam. Depending on the spot on the cube hit by the electrons, different vibration modes can be triggered. The stronger the triggered vibration, the more energy is drawn from the electron beam. The individual modes of vibration can then be determined from the measured loss of energy.

By scanning the electron beam across the entire sample, vibrations of the ion lattice could be mapped with topological resolution on the nanometer scale and at very high frequencies in the terahertz range. The phenomena observed under the electron microscope were computer-simulated, which showed for the first time how ions vibrate at different locations on the cube. This may open the door to revolutionary developments in steering sound and heat with heretofore unknown precision.
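For readers wondering how a measured energy loss translates into a terahertz vibration frequency: the conversion follows from Planck’s relation E = h·f. A minimal sketch, using an illustrative 50 meV sample value that is not taken from the experiment described above:

```python
# Convert an energy loss measured in an electron energy-loss spectrum (meV)
# into a vibration frequency (THz) via E = h * f.
PLANCK_EV_S = 4.135667696e-15  # Planck constant in eV*s (CODATA value)

def energy_loss_to_thz(energy_mev: float) -> float:
    """Frequency in THz corresponding to an energy loss in meV."""
    energy_ev = energy_mev * 1e-3
    return energy_ev / PLANCK_EV_S / 1e12

# Hypothetical 50 meV loss, in the range typical of lattice vibrations:
print(f"{energy_loss_to_thz(50):.2f} THz")  # 12.09 THz
```

Losses of tens of meV thus map directly onto the terahertz frequencies mentioned above.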

2017-09-03

Printing Life

It has been some time since 3D printers entered our collective consciousness as useful tools for more than toys and demonstration objects. Even firearms, and certainly synthetic prosthetic limbs, can now be 3D-printed. As ageing societies face increasing shortages of donor organs for transplantation, the use of 3D printing in medicine, although perhaps among the most obvious and compelling applications, is new – but likely to disrupt synthetic biology as few tools have done before. Small wonder it won first prize in 2016 at the international Genetically Engineered Machine (iGEM) competition, an MIT creation. The method enables printing intact tissues, and potentially even entire organs, using 3D bioINK tissue printing technology.

While printing biological material such as cartilage is already established state of the art, printing complex cell tissue still presented notable challenges. They were resolved by printing layers of living cells with a 3D printer into a biocompatible matrix in a petri dish. In the past, hydrogels were used to supply a gelatin-like structure that is only later populated by cells. This “scaffolding” complicates printing and creates unnatural coherence between cells.

Instead, the students at two Munich universities, Ludwig-Maximilian University and the Technical University of Munich, developed a proprietary “biological ink,” similar to a two-component adhesive, to print living cells directly in 3D. Its main component is biotin, also known as vitamin H or B7, which is loaded onto the cell surface. The second component, streptavidin, is a protein that binds biotin and thus provides the biochemical adhesive proper. In addition, large proteins were equipped with biotin groups in order to create cross-linking structures. When a suspension of these cells is “printed” into a concentrated solution of the protein components, they form the requisite 3D structure, and the bioINK tissue printer builds up layers of scalable, formable tissue of living cells, ready for transplant.

2017-08-01

Shepherding Innovation: Two Very Different Models

Innovation as a socio-intellectual phenomenon also reflects the multitude of ways to skin a cat.

In absolute terms, Switzerland has long been recognized as the world’s most innovative country. Of course, the criteria one picks can markedly alter the results: if you count patents per capita, Eindhoven, the domicile of Philips, is the innovation capital of the world, followed by San Diego. Israel does not even figure on that list.

The secret to Swiss success has been reliance on cheap and abundant capital, including foreign investment, highly selective skilled immigration, and two world-renowned Federal Institutes of Technology (in Zurich and Lausanne). Over 60 percent of R&D expenditures come from the private sector. The country tops the World Economic Forum’s Global Competitiveness Report, the EU’s Innovation Union Scoreboard, the Global Innovation Index, and patent applications in Europe. But despite comparatively very low taxes for an industrialized country, Switzerland is neither an entrepreneurship hub – its innovation is driven by large and well-established companies, not startups – nor is it known for easy access to venture capital or IPOs. To an extent, it is fair to call the Swiss model of innovation establishment-driven. As such, it is extremely successful and sustainable by any standard.

On the other hand, Israel, a.k.a. Start-Up Nation, a.k.a. Silicon Wadi, a country roughly the size of New Jersey, is world champion at churning out technology at a feverish pace with far more limited resources and infrastructure. It is also world champion in R&D expenditures, clocking 4.3 percent of GDP, almost half of it from foreign investors. It has been called the best country to found a startup and the worst to keep it alive. But it is also the world’s leading model for public-private partnership in innovation.

Take Yissum, the technology transfer vehicle of Hebrew University. Established in 1964 as a wholly-owned subsidiary of the university, it accounts for almost 10,000 patents and 120 spin-offs. When a patent is registered, the inventor-scientists take 40 percent of the patent revenue, 20 percent goes to their lab, and 40 percent to the university. This revenue covers one-tenth of Hebrew University’s research budget. Long-term research cooperation exists with several multinational enterprises that have established over 320 R&D centers. The downside of market orientation is equally obvious: applied research is prioritized over foundational research, further exacerbating the latter’s lag.

More than 8,000 startups were created in Israel in the last decade. They employ 500,000 individuals. In 2016 alone, some 1,400 new companies were founded. Even though, like everywhere else, the vast majority of startups do not survive, there are at any given time some 6,000 operational startups. The country has no choice but to try new things: its domestic market is too small, and a foothold in foreign markets requires products the consumer has not yet realized a need for.

Venture capital finance also follows its own model (although only $4.8 bn was raised in 2016). Terra Venture Partners, a private business development fund, operates in an environment Silicon Valley can only dream of: every shekel invested by the fund is matched with six shekels by the state.

The government’s Israel Innovation Authority, by now an agency independent of the Ministry of Economics, funds 2,000 projects from all walks of life, with the exception of foundational and military research. Select projects are funded with up to 85 percent of their budget requirements, university research with up to 90 percent. The Israel Innovation Authority recovers 35 percent of its investments on average; if it recovered more, the agency would conclude that it had taken insufficient risk. But if a funded company is sold abroad, it must repay three times the amount received.
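The Authority’s terms lend themselves to simple arithmetic. The following sketch merely encodes the figures quoted above; the function names and the one-million-shekel example budget are illustrative assumptions, not anything published by the agency.

```python
# Back-of-the-envelope model of the funding terms described above:
# up to 85% of a project's budget is funded, ~35% of investments are
# recovered on average, and a company sold abroad repays 3x the grant.

def max_grant(budget: float, cap: float = 0.85) -> float:
    """Largest grant available for a project of the given budget."""
    return budget * cap

def expected_recovery(total_invested: float, rate: float = 0.35) -> float:
    """Average amount the agency recoups across its portfolio."""
    return total_invested * rate

def repayment_on_foreign_sale(grant: float, multiple: float = 3.0) -> float:
    """Amount due if a funded company is sold abroad."""
    return grant * multiple

grant = max_grant(1_000_000)             # 850,000 for a 1M-shekel project
print(grant)
print(expected_recovery(grant))          # roughly 297,500 back on average
print(repayment_on_foreign_sale(grant))  # 2,550,000 due on a foreign sale
```

The asymmetry is the point: the state absorbs most of the downside, but a successful exit abroad pays the grant back several times over.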

Not surprisingly, and very much based on the factors described above, the international advantage of Switzerland lies in health and material sciences, while Israel has become a focal point in data sciences, alternative energy, and natural resource substitution.

2017-07-01

The Short-lived Fallacy of Biometrics

When MasterCard last year introduced, in twelve countries, a feature identifying the payer via fingerprint scan or selfie, it took one further step toward abandoning the deeply flawed concept of user-chosen passwords and PINs. Considering the deplorable state of imaginative solutions – the globally dominant password being 123456 and the most-used PIN being 1234 – the move seemed long overdue. Another contributor to security breakdowns is the mushrooming number of “different” password requirements no one can seriously be expected to remember, particularly in combination with a multitude of user names and ever-simpler “forgot password” functions.
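The underlying weakness is easy to quantify. A short sketch – standard combinatorics, not specific to any of the schemes mentioned here – compares the brute-force search spaces involved:

```python
import math

def search_space_bits(alphabet_size: int, length: int) -> float:
    """Entropy in bits of a uniformly random string over the alphabet."""
    return length * math.log2(alphabet_size)

print(search_space_bits(10, 4))   # 4-digit PIN: ~13.3 bits (10,000 guesses)
print(search_space_bits(10, 6))   # 6-digit numeric password: ~19.9 bits
print(search_space_bits(62, 8))   # 8-char mixed-case alphanumeric: ~47.6 bits

# "123456" and "1234" are, of course, among the very first guesses in any
# dictionary attack - their effective entropy is close to zero.
```

Even these numbers flatter real users: they assume random choice, which the dominance of 123456 shows is exactly what people do not do.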

HSBC has additionally enabled identification via voice recognition software that verifies some 100 unique speech characteristics – among them speed, vocal tract shape and nasal tone – and is said to work even when the user suffers from a cold. Wells Fargo and a range of other banks have enabled log-in via retina scan. Canadian start-up Nymi authenticates individuals through their pulse, taken by a wearable prototype interacting with near-field communication terminals.

While banks and fintechs may be right in concluding that this increases safety beyond passwords, there is no question that biometrics will inaugurate just another round in the perpetual arms race between security and illegitimate access.

Its limitations are increasingly obvious.

At the 2014 Chaos Communication Congress, the hacker Starbug, a.k.a. Jan Krissler, showed that a photograph of German minister of defense Ursula von der Leyen, taken with a single-lens reflex camera from a distance of three meters, sufficed to reproduce her thumbprint with VeriFinger, a graphics software tool.

Researchers at Michigan State University developed a simple method to print images of fingerprints on an ordinary printer at a resolution sufficient to fool fingerprint readers, unlocking smartphones and completing transactions via Apple Pay.

The ACLU has shown that selfie scans depend heavily on lighting conditions and may be thrown off by a change of hairstyle, ageing or weight changes.

Background noises and recording issues may interfere with identification by voice recognition.

When hackers accessed 40,000 accounts at Britain’s Tesco Bank and withdrew funds from 9,000 of them, the monetary loss of £2.5 million was the least of it: the incident highlighted what a compromise of biometric databases would mean. While one may change a password with a minimum of fuss, not quite the same can be said for getting a new fingerprint or face – here, the means of identification itself is compromised, potentially permanently. It turns out that biometrics may be worse than passwords – and hackers are still, notoriously, at least one step ahead of the game.
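The asymmetry can be made concrete. A minimal sketch, using Python’s standard hashlib, of why a password credential can survive a breach while a biometric cannot: the salted hash can be rotated at will, the fingerprint cannot.

```python
import hashlib
import os
import secrets

# A password credential can be rotated after a breach: pick a new secret
# and a new salt, and the leaked record becomes worthless.
def store(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(candidate, digest)

salt, digest = store("old-password")
assert verify("old-password", salt, digest)

# After a breach: rotate. The old record is simply discarded.
salt, digest = store("new-password")
assert not verify("old-password", salt, digest)

# A biometric template has no equivalent of this rotation step: the
# underlying "secret" (the finger, the face) cannot be reissued once leaked.
```

The rotation in the middle is the step biometric systems structurally lack, which is the whole argument of the paragraph above.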

2017-06-17

Bow-tying DNA

Building on research by Caltech’s Paul W. K. Rothemund, it has long been known that DNA folds up into nanoscale shapes and patterns. The rope-like genome of mammals is folded into loops, which enables even distant regions to come into contact; enhancers, for instance, amplify and activate genes positioned far away on the DNA thread. The most probable explanation is a precise folding-back process that brings enhancers into contact with the “right” genes. By a commonly accepted hypothesis, it works like this: a ring of cohesin molecules surrounds the DNA thread at a random location, and the thread is pulled through the ring until it reaches a “thick” spot that acts as a stopper. This thickening is caused by a protein named CTCF that attaches itself to the DNA, thereby targeting distant DNA sections for direct contact.
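The extrusion hypothesis can be captured in a toy model: cohesin loads at a random spot, and the thread is reeled through until a CTCF stopper is hit on each side. The positions and site coordinates below are illustrative, not biological data.

```python
# Toy model of loop extrusion: a cohesin ring loads at load_pos on a
# thread of the given length, and extrusion proceeds in both directions
# until a CTCF-bound "thick spot" (or the thread's end) halts it.

def extrude(ctcf_sites: set[int], load_pos: int, length: int) -> tuple[int, int]:
    """Return the (left, right) anchors where extrusion halts."""
    left = right = load_pos
    while left > 0 and left not in ctcf_sites:
        left -= 1
    while right < length - 1 and right not in ctcf_sites:
        right += 1
    return left, right

sites = {20, 80}                 # two CTCF stoppers on a 100-unit thread
print(extrude(sites, 50, 100))   # cohesin loaded between them -> (20, 80)
```

However randomly the ring loads between the two stoppers, the loop always ends up anchored at the same pair of CTCF sites, which is how distant but specific DNA sections are brought into direct contact.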

Experiments with murine cells showed that cohesin does indeed move along the DNA thread over long distances, with transcription – the “reading” of DNA information – acting as the engine. The process is likely powered by RNA polymerase, the enzyme that carries out transcription.

2017-05-01

Molecular-sized ball bearings

It turns out that functionality we know from everyday life also exists at the molecular nano-scale. Mass spectrometer measurements showed some time ago that exactly thirteen boron atoms can form a particularly stable configuration called a magic cluster. It has a flat structure consisting of two concentric rings: an inner ring of three boron atoms and an outer ring of ten.

Some years ago, the theoretical chemist Thomas Heine, now at the University of Leipzig, predicted that these two rings could rotate against each other almost without friction and without affecting the overall stability of the molecule in any way. The result is a molecular-sized ball bearing that permits virtually frictionless counter-rotation of the atomic rings.
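Why the rotation should be nearly frictionless can be illustrated numerically: because 3 and 10 share no common divisor, the total interaction energy of two rigid rings only varies at the 30th harmonic of the rotation angle, whose amplitude is tiny for any smooth pair potential. The toy model below assumes illustrative radii and a generic 1/distance interaction, not the actual boron physics.

```python
import math

# Two rigid concentric rings of 3 and 10 point "atoms". The total pairwise
# energy as a function of the inner ring's rotation angle is computed on a
# grid; its corrugation (peak-to-peak variation relative to the mean) shows
# how flat the rotational energy landscape is.

def ring_energy(theta: float, n_in: int = 3, n_out: int = 10,
                r_in: float = 1.0, r_out: float = 2.0) -> float:
    """Sum of 1/distance interactions with the inner ring rotated by theta."""
    total = 0.0
    for i in range(n_in):
        a = theta + 2 * math.pi * i / n_in
        for j in range(n_out):
            b = 2 * math.pi * j / n_out
            d = math.sqrt(r_in**2 + r_out**2
                          - 2 * r_in * r_out * math.cos(a - b))
            total += 1.0 / d
    return total

energies = [ring_energy(2 * math.pi * k / 1000) for k in range(1000)]
corrugation = (max(energies) - min(energies)) / (sum(energies) / len(energies))
print(f"relative energy variation over a full turn: {corrugation:.2e}")
# The variation is many orders of magnitude below the mean energy: the
# rings rotate against each other over an essentially flat landscape.
```

Replacing 3 and 10 with commensurate ring sizes (say, 3 and 9) would make the corrugation dramatically larger, which is why the specific geometry of the B13 cluster matters.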

Proving this prediction was not without challenges and succeeded only through spectroscopic measurements with a free-electron laser at the Fritz Haber Institute in Berlin. Commercially available lasers are unsuitable for this proof because it requires extremely intense laser radiation in a narrowly defined wavelength range. By measuring the infrared spectrum and performing accompanying quantum-mechanical calculations, it was possible to prove the functionality of the molecular ball bearing.

This is one of the first practical indications that quantum effects may be put to targeted use as part of the functionality of molecular systems. Even though applications lie in the distant future, their promise and potential are immense. It thus comes as no surprise that the 2016 Nobel Prize in Chemistry was awarded for discoveries in the area of molecular machines.