
2013-11-26

Cyber Warfare as a Superficially Tempting Low-Level Engagement Strategy

At first sight, and against certain counterparties, cyber warfare has appeared to be, and has proved to be, a phenomenally low-cost and low-risk tool of adversarial foreign policy. But even as questions mount about the reciprocal exposure of the most heavily digitized knowledge- and data-based economies and societies, it has become clear that the genie will never return to its bottle. Strategic, legal, and political questions will not be dodged much longer. The very advantages of cyber warfare may easily, and all too quickly, be turned against a first mover, especially one as vulnerable as a highly digitized industrial state. Its suitability for asymmetric warfare increases its attractiveness to non-state actors. And one of its arguably greatest potential uses, crippling an enemy’s economic functionality by disrupting its payment systems, has regularly been vetoed in the interest of the integrity of the global system. The philosophy underlying the Nuclear Non-Proliferation Treaty, as well as the treaties banning chemical and biological weapons, may therefore provide an even stronger rationale for an understanding among major and even mid-size powers to ensure mutual non-aggression by digital electronic means.

2013-10-20

The Bionic Man – Paradigm of the Great Taboo?

I read a most interesting narrative of the world’s first “bionic man,” assembled from prosthetic body parts and artificial organs donated by laboratories around the world. While it is interesting to note that the world’s first “bionic man” cost almost $1 million to build, anyone who has priced out building a car from spare parts purchased at retail knows that this amount is actually an incredible bargain, explicable only by the fact that the “bionic man” is by no means complete but represents only two-thirds of the anatomy of a living human. Still, it does contain an artificial heart, blood, lungs, windpipe, pancreas, spleen, kidney and circulatory system, some brain functions, and prosthetic parts of sensory organs. There is little doubt that the number of human “spare parts” will in time grow to approach the original asymptotically, likely in direct proportion to geometrically rising cost. And let’s not forget that the component artificial body parts were, well, donated. It was the cost of assembly alone that amounted to a cool million.

After some breathless accounts of technological details, the anecdote reaches the intangible points of the experiment:

“The bionic man brings up some ethical and philosophical questions: Does creating something so humanlike threaten notions of what it means to be human? What amount of body enhancement is acceptable? And is it wrong that only some people have access to these life-extending technologies?

The access issue is especially troublesome, [Roboticist Rich Walker of Shadow Robot Co. in England] said. ‘The preservation of life and quality of life has become basically a technical question and an economic question.’”

True. But is it possible that Walker overlooked the fact that life extension has always been a technical and economic question?

2013-09-16

Crowdfunding: Removing the power of organized finance

It is a fairly fascinating proposition to have projects funded not by a commercial lender but by prospective customers or by the general public. Crowdfunding collects capital from stakeholders of all sorts in the global village. It brings manufacturers and consumers closer. Almost any purpose can be served – and financed: innovative e-products, free software, movies, records, video games, scholarly research, social projects, journalistic projects, political campaigns, start-up funding, disaster relief and other charities, to name just a few.

Crowdfunding platforms are mushrooming, differentiated by project categories, target groups, or geographical regions. One platform does not fit all: some finance creative projects, others fundamental research, still others social causes. Kickstarter.com, for example, one of over 500 known crowdfunding platforms, has worked well for projects launched from the U.S., the U.K. and Canada, such as the Pebble digital watch that interacts with iPhones and Android phones: it sought to raise $100,000 and got over $10,200,000.

Other crowdfunding successes include Amanda Palmer (sought $100,000 in funding for an album, raised $1,200,000), Spike Lee (raised $1,250,000 for a movie), and Chris Roberts (raised $2,000,000 to revive space simulation in video games). Chris Roberts and Cloud Imperium Games also hold the overall record, now pegged at over $17.6 million.

As a method, crowdfunding is still in its nascent stage and virtually unregulated in many if not most jurisdictions, with the early exception of the U.S. Jumpstart Our Business Startups Act (JOBS Act), which requires platforms for investment crowdfunding to register as broker-dealers with the SEC. Backers are therefore protected only by conventional securities law, such as the Howey test (Securities and Exchange Commission v. W. J. Howey Co., 328 U.S. 293 (1946)), and by general concepts of fraud. Intellectual property protection for ideas disclosed to the public in the course of a crowdfunding effort is lacking as well, unless patent, copyright and trademark applications are filed early. The World Intellectual Property Organisation (WIPO) promotes a novel standard of protection embedded in creative works under the concept of a “creative bar code.”

To avoid the staggering and often suffocating cost of compliance with securities regulation, crowdfunding that does not serve conventional investment purposes needs to devote particular attention to devising a reward structure that does not fall under the Howey test for an “expectation of profits” that “depends solely on the efforts” of a promoter or third party. This is relatively easy in the case of charitable funding, or of rewards consisting in the enjoyment of completed creative works such as motion pictures, video games or records.

Intense media campaigns, usually through social media channels, are typical of crowdfunding. Communication with backers (investors), including responses to their feedback and a constant stream of updates, is essential. The generally small amount of funds provided per backer results in a significant increase in communication intensity per dollar raised. Crowdfunding is service-intensive because it is not based on ratings, and purely passive major investors content with quarterly or annual reports are rare. The learning curve on successful projects is often described as steep.

However, the operating model is just the latest high-tech application of a classic: the “low-margin, high-volume” idea. Although the high-volume aspect is still debatable by comparison with institutionally raised finance, it is not low relative to the needs of the projects on which crowdfunding has been tested thus far. This is largely because it has primarily been used for purposes that had previously not been thought of as “bankable” in the first place, or that could not be expected to provide an adequate monetary reward.

One of the important advantages of crowdfunding may be its lack of need for investor protection, since contributions can often be structured differently from traditional equity investments whenever the rewards can be shown to avoid the Howey test. Most backers would disagree that their contributions, often borne of enthusiasm or emotional interest, are a fool’s tax paid with no expectation of reward; yet those contributions are typically made in amounts whose loss the backers may safely be expected to absorb, much as in the case of mass charitable donations.


But precisely because typical projects are often best described as innovative or cutting-edge, their traditional profitability is almost always in a gray zone. The relative lack of regulatory burdens may be key to getting such projects funded in the first place, and it will be important to avoid the stifling burden of “protective safeguards” designed to stop fraud and abuse but prone to hobbling the viability of many projects. Especially outside crowdfunded equity investment, this classical dilemma may be sidestepped by relying on the speed with which news of mishaps travels in the internet community, and on a combination of user sophistication and diligence sharpened by digestible losses. That is the price to be paid by people who wish to “fund what matters to them” outside conventional profit expectations as their only variable to maximize. Crowdfunding expands democracy in the financial sector by removing the power of large financial institutions, industry associations and government sponsors to decide what does or does not get funded. It will be up to consumers, in their capacity as voters, to determine whether they will be deprived of this tool under the pretext of “investor protection” for the sole benefit of the monopoly traditionally held by institutional finance – turning business into a domain reserved for entities “too large to fail,” a code word for being a taxpayer-protected part of the establishment.

2013-08-28

Crowdsourcing, Scientific Method and Intellectual Property


The consequences of digital networking for our ways and means of processing complex information are only beginning to emerge. Yet one can already see with great clarity that digital networking will change not only the type of problems that may be addressed, but also the methods used and the way credit for solutions is assigned. Concepts of intellectual property will never be the same. By acknowledging the substantial and often critical contribution of others to the evolution of thoughts, ideas, questions and solutions, we are led to depart from a “star system” that glorifies individual genius and contribution toward a more realistic acknowledgment of multiple credits for a potentially vast number of contributors, without whom certain problems could be answered only by engaging vastly greater resources of time and funding.

In science, crowdsourcing means outsourcing research and development tasks to a mass of voluntary but sometimes unaware users, in some instances through “games” that superficially serve an entertainment purpose. Crowdsourcing works particularly well if scientific knowledge can be transferred to an application in so elegant a manner that users need not understand it.

With crowdsourcing, individual leadership and ingenuity take on a different dimension and purpose: the role becomes more managerial, and the emphasis shifts to harnessing the intellectual resources of the masses and finding a quid pro quo that permits accessing them. In a digitally networked world, it reflects “open innovation,” a changed view of the scientific process, one that anticipates the participation of as many individuals as possible in research and development as an increasingly natural form of an efficient division of labor. This is especially true of superficially tedious routine work. Zooniverse is a good example: it enables laymen to analyze cell tissue for cancer research, categorize galaxies, or sort through weather records in 19th-century marine log entries for purposes of climate research. Sometimes tasks outsourced to the masses of users are rewarded monetarily, for example on Amazon Mechanical Turk or Crowdflower.

Relying on the contributions of many is hardly new in human endeavors: the pyramids, the Panama Canal and Neil Armstrong’s moon walk each engaged a collaborative effort of approximately 100,000 individuals. Crowdsourcing, that is, relying on the intellectual resources of internet users, may enable the involvement of 100 million or more. Duolingo, a platform offering language-learning resources in English, French, German, Italian, Spanish and Portuguese, is free but has an ulterior motive: by practicing, students “translate the Internet,” especially Wikipedia, into the languages they aim to learn.

In 2008, David Baker at the University of Washington created a three-dimensional “puzzle” named Foldit, a “game with a purpose,” the purpose being to fold proteins in three dimensions, a task that requires immense computing resources but somehow comes a lot easier to humans. At least to some of them: among the 100,000 Foldit aficionados worldwide who play regularly, some particular talents turned out to be 13 years old and intuitively performing tasks that push supercomputers to their limits. Fifty years of molecular biology are packed into Foldit – but users only need to turn their models in a variety of directions on their computer screens.

Protein structures may be conceived as networks, and it is conceivable that users could be tasked with changing protein networks in a way that strips them of the characteristics they exhibit in cancerous cells, thereby inaugurating a breakthrough in cancer therapy. If this concept proves as successful as Foldit, it could yield a hundredfold greater output.

There is no shortage of “citizen science” projects: MIT seeks to enable users to “map” the brain through Eyewire. The University of Munich has created Artigo, which runs a competition between users to provide keywords for cataloging archived works of art. With Geo-wiki, the International Institute for Applied Systems Analysis addressed a notorious deficiency in the automated analysis of aerial photographs for classifying land with a view to potential ethanol production; this in turn inspired the creation of computer games designed to draw a broader user base. Recaptcha, since acquired by Google, is based on the reverse application of captcha, the technology used to authenticate human users online by having them identify distorted signs or words. Recaptcha was designed by Luis von Ahn at Carnegie Mellon University to involuntarily harness the resources of 750 million computer users worldwide to digitize 2.5 million books a year that cannot be machine-read.

Game design needs to be based on reward and recognition of performance. To date, this is typically achieved when different users arrive at identical solutions. Needless to say, this creates a risk of rewarding congruent nonsense, an outcome for which non-trivial remedies have yet to be designed. In spite of such shortcomings, game results can still improve data quality.
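
To make the “identical solutions” criterion concrete, one simple scheme is an agreement threshold: a crowd answer is accepted only when a minimum number of independent users converge on it, and everything else is routed to expert review. The sketch below is purely illustrative; the task identifiers, labels and thresholds are hypothetical and not drawn from any particular platform.

```python
from collections import Counter

def aggregate_answers(submissions, min_votes=3, min_agreement=0.6):
    """Accept a crowd answer for a task only when enough independent users agree.

    submissions: dict mapping task_id -> list of answers from distinct users.
    Returns a dict mapping task_id -> accepted answer, or None when there is
    no consensus (agreement reduces, but does not eliminate, the risk that
    many users converge on the same wrong answer).
    """
    accepted = {}
    for task_id, answers in submissions.items():
        if not answers:
            accepted[task_id] = None
            continue
        top_answer, top_votes = Counter(answers).most_common(1)[0]
        agreement = top_votes / len(answers)
        if top_votes >= min_votes and agreement >= min_agreement:
            accepted[task_id] = top_answer
        else:
            accepted[task_id] = None  # no consensus: route to expert review
    return accepted

# Hypothetical galaxy-classification tasks with volunteer labels.
labels = {
    "galaxy-001": ["spiral", "spiral", "spiral", "elliptical"],
    "galaxy-002": ["spiral", "elliptical", "irregular"],
    "galaxy-003": ["elliptical"] * 5,
}
print(aggregate_answers(labels))
# {'galaxy-001': 'spiral', 'galaxy-002': None, 'galaxy-003': 'elliptical'}
```

Weighting users by their historical accuracy, rather than counting raw votes, is one of the obvious refinements, though even that cannot rule out a confident majority agreeing on the same mistake.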

It is easily imaginable that open innovation will eventually require a revolutionary change in the protection and reward of intellectual property thus created. Among the difficulties it presents are the relative anonymity of the web, the small size of individual contributions, and the random, haphazard, or playful nature of at least some, if not most, of those contributions. But similar challenges have already been resolved in the design of the class action system: there, the benefits to individual plaintiffs are also typically too small to justify pursuit by traditional means, and the reward largely accrues to the organizers of the effort. Yet the social purpose of class actions, namely the disgorgement of a mass tortfeasor’s profits, may well be compared to the creation of another social good in the form of R&D resulting from, or at least significantly augmented by, a large number of only marginally interested contributors.

Collective reasoning and collaborative creativity may yet ring in an era of division of labor and profit by a mass collective that is organized not along political ideology, but around the opportunities and incentives created by networked technology and pooled human talent.

2013-07-06

Why would you (still) believe Wikipedia?

And so it would seem that the time has come to reverse myself on my seal of approval for Wikipedia as a source of authoritative knowledge: Wikipedia’s article on “The Bicholim Conflict” of 1640-41, also referred to as “The Goan War,” has been shown to be a hoax. After five and a half years of misleading its readers, the article was taken down by the editors, who had been unable to source-cite it. Because, surprisingly enough, Wikipedia is, in fact, source-cited. And it may take as long as five years – or indeed forever – for its (unpaid and anonymous) editors to get around to conducting proper factual checks on articles posted by volunteers. This process would also explain the nearly four-year-long survival of a fictitious Indonesian island, Bunaka, and the digital lifespan of eight years and one month of Gaius Flavius Antoninus, a supposed conspirator in the assassination of Julius Caesar.

But do incidents like these really discredit the vast depository of free knowledge that is Wikipedia? (Yes, it is free, dear Britannica.) Is peer-reviewed and professionally edited information always reliable? The most prominent counterexample is the prestigious journal Science. On at least two occasions, Science had to retract already published, thoroughly vetted and peer-reviewed articles. These retractions came after scrutiny by the ultimate peer reviewer – the scientific community – which had called into question the groundbreaking research on which the articles reported. One of the articles linked Chronic Fatigue Syndrome to a xenotropic murine leukemia virus, promising a path to treatment for millions of patients. Science withdrew the article after the research results cited therein could not be replicated, and the authors partially retracted some of their findings. Another case dealt with a report of the first human embryonic stem cells created using a novel cloning technique called somatic cell nuclear transfer (SCNT). Following allegations of fraud, a committee of scientists was called upon to verify the substantive findings, and the article was retracted, while the article’s author and his lead researchers were fired from Seoul National University. This is not the end of the story, though – three years later, another article, in Cell Stem Cell, partly exonerated the condemned author by showing that he was not a fraud, but had simply been wrong. While the author of the retracted article did not accomplish what he claimed – to create stem cells via SCNT – he did achieve another significant breakthrough without even realizing it: he had created human stem cells via parthenogenesis. Even if his real discovery creates a greater promise for finding a cure for diseases such as Parkinson’s, the human cost of this comedy of errors cannot possibly be overlooked – after all, the names of the scientists involved in the “scandal” are now surrounded by ignominy, their scientific appointments terminated, and even those who later unearthed their actual discovery find themselves hesitant to side with the condemned authors.

It is true that Wikipedia lacks a rigorous peer review system. In fact, many of its articles that fall through the cracks of its public verification process do so because they are written on topics that are not important enough to attract sufficient attention. How many people are likely to look up on Wikipedia the name of an island that does not even exist? But the “proper” academic peer review process is not infallible or without fault, either. Just the fact that something is printed – be it on paper or on a computer screen – does not support a conclusion that it necessarily contains absolute truth. Still, if a source is consistently reliable, we may safely assume it will be so also in the instant case. Hence, the judicial standard of “general acceptance” – the one extensively discussed in Frye v. United States, 293 F. 1013 (D.C. Cir. 1923) and in Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993) – still stands with regard to Wikipedia.

2013-06-02

Nano Hazards



Synthetic nanomaterials form part of a gigantic emerging market worldwide, with expected growth rates of about 23 percent. Only a few years ago, nanomaterials were viewed as barely out of science fiction, with highly promising applications but also novel risks. To date, no labeling requirements exist that would alert consumers to potential near- or long-term hazards to the environment, even though the European Union has a directive on cosmetic labeling that will enter into force in July 2013. Some nanomaterials cannot be degraded naturally or filtered and recycled by waste processing plants; some involve risks similar to asbestos, and others may facilitate the development of bacterial resistance against the very antibacterial properties for which they are currently used in hospitals.

Absent mandatory labeling and registration, consumers cannot determine today whether a product contains nanomaterials. In most applications, nanoparticles are bound in other compounds, and in that form they are hardly problematic and almost never pose health hazards. But the same cannot be said about production processes and waste disposal. Nanoparticles can be suspended in air, breathed in, and can enter the bloodstream. They can also penetrate various sensitive areas of the environment. Little is known about the dispersion, behavior, and chemical qualities of aging and disintegrating nanoparticles. The benefits of nanotechnology are seldom in dispute – the question is how to assess and balance benefits and risks appropriately, so that the hygienic, protective, energy, weight, or physical advantages are not offset by unacceptable long-term environmental hazards.

2013-05-31

Nanobiotechnology: Pandora’s genie is pushing the cork



As a market, bionanotechnology is projected to grow worldwide at a rate of 28%. No further explanation is needed for why the field is increasingly considered “the future of everything,” even if its potential for raising concerns is seldom overlooked and no FDA regulations exist to date.

As one can always safely assume with interdisciplinary areas, the father of all things is a dispute over terminology – in this case, the distinction, if any, between nanobiotechnology and bionanotechnology. Although plenty of ink is being spilled on it, the distinction hardly matters: if nanobiotechnology, as a Lilliputian version of biotechnology, takes concepts and fundamentals directly from nanotechnology to biotechnological use, bionanotechnology derives its concepts from mechanics, electricity, electronics, optics, and biology, and translates structural and mechanistic analysis of molecular-level biological processes into specific synthetic applications of nanotechnology. It is not a distinction without a difference, but it matters little because of one principal characteristic of all nanotechnology: at the molecular and submolecular scale, biochemistry, biophysics, biology, and all other forms of human inquiry converge. Multidisciplinarity is thus inherent.

How to prevent math from sinking in



U.S. industry groups and politicians keep ringing alarm bells dating back to the Sputnik days: we need more scientists! In fact, we need 10,000 more engineers each year, and 100,000 new STEM teachers! That, and higher scores on standardized tests in math and science, is supposed to ensure the country’s technological and economic supremacy.

Well, research comes to the rescue to explain why we have so few science, technology, engineering, and mathematics (STEM) graduates.
  
First of all, math is really, really bad for some people. Merely showing them a math textbook has been shown to provoke fear, and the brain regions activated in those instances are the same ones responsible for the perception of physical pain. We surely cannot expose innocent children to such unpleasant experiences. And does it really matter that the brain regions associated with such pain are closer to those responding to fear of social rejection than to those relating to fear of physical threat? Even if the response is simply conditioning received in school by instilling the fear of appearing dumb to one’s peers and teachers, are we to abandon much-studied math anxiety as a proximate cause of innumeracy in our society?

2013-05-17

The case of synthetic biology

Few people are even aware of what the concept of synthetic biology represents, and yet it has already become a cutting-edge focus of major research efforts and teaching. Synthetic biology purports, in essence, to create useful creatures through engineering methods. It uses multi-purpose components taken from nature’s building blocks. Organisms not heretofore seen in nature may be capable of producing fuels, complex chemicals, or novel pharmaceuticals, but also computer circuits based on biological structures. Can we out-nature nature and surpass evolution? Impressive steps in that direction have already been made. Optimization and fine-tuning of naturally occurring enzymes no longer makes front-page news.

In fact, synthetic biology’s visions open a Pandora’s box of unlimited possibilities pointing to the big bang of a multi-trillion-dollar industry: tailor-made bacteria that identify and destroy toxins, produce fuels formerly known as fossil, render planets like Mars or Venus suitable for human habitation, or let tables and chairs grow out of the soil without needing humans to manufacture them from timber. The starting point of synthetic biology is the identification of “bio bricks” and a three-dimensional “DNA printer” that transforms a customized genome sequence into a new, reprogrammed bacterium or complex organism. Harvard researcher George Church has already presented a prototype he calls MAGE (Multiplex Automated Genome Engineering), a device that indeed translates relatively short DNA sequences, via several intermediary steps, into molecules that it inserts directly into monocellular organisms.

Our competition with nature may be won by sidestepping considerations that play a central role in evolution, which is, after all, a contest for survival of the fittest. But what if the effort required to produce a substance bears no relation to the survival of the individual or of the species producing it? Natural organisms need to observe a balance of priorities and interests; synthetically engineered life does not.

2013-05-16

Nature’s self-plagiarism rewards life forms by sidestepping evolution

Red algae, fruit flies, and humans may have more in common than expected.
 
Red algae are among the oldest manifestations of life. They have been around for well over a billion years and will arguably endure for as long as this planet remains habitable for gene-based life. Galdieria sulphuraria, a stunningly successful variant of this extremophilic life form, has been found in the boiling-hot springs of Yellowstone National Park, in the acidic waste-water drains of mine shafts where it is exposed to extreme levels of heavy metals, and in some of the highest saline concentrations on Earth. Its metabolism is extremely adaptable as well: at times it “forages” by photosynthesis, while at other times it devours a wide array of bacteria in its immediate vicinity, growing either photoautotrophically or heterotrophically on over fifty carbon sources.

How is this possible? Simple: Galdieria, a microbial eukaryote, has concluded that it is not worth treading the arduous path of evolution just to reinvent the wheel. It plagiarizes.

Since nature runs neither a patent office nor a copyright register, Galdieria figured out a simple way to pay the successfully adapted organisms in its environment the sincerest form of flattery: it copies them by way of horizontal gene transfer. By absorbing genes outside of sexual transmission and across species barriers, this alga has adopted at least five percent of its protein-coding genes from organisms in its environment. Analyses of the Galdieria genome show that it has assumed the heat resistance of archaea, while it took its resistance to heavy metals such as mercury and arsenic from bacteria that had developed the corresponding transport proteins and enzymes.

While Galdieria sulphuraria presents a fascinating example of the unexpected effects of crossing the species barrier, other examples are even more relevant to us: a fragment of human DNA was found in Neisseria gonorrhoeae, the bacterium that causes one of the oldest human scourges among sexually transmitted diseases. Such a horizontal transfer of genes from the highly developed human life form to a bacterium constitutes a huge jump. Studies concluded that the absorption of human DNA by Neisseria gonorrhoeae must have happened quite recently on evolution’s timeline. It also helps explain the bacterium’s high adaptability and its persistence throughout human history.

Genetic transfer can also work from lower to higher species. For example, genetic material of a bacterial parasite called Wolbachia is found in 70 percent of the world’s invertebrates. But there exists at least one species, the fruit fly Drosophila ananassae, that contains the entire genetic material of Wolbachia and continues to replicate it as its own. Here, the genetic transfer clearly benefits the parasite, not its genetic host.

In the human genome, as many as 223 of some 23,000 genes appear to have been acquired directly from bacteria by horizontal transfer, through incidents such as bacterial infections. Those genes are present only in prokaryotes and in humans, having skipped entirely all intermediate life forms such as invertebrates.

Recent studies have questioned the idea of gene transfer between humans and lower species. Analysis of a larger spectrum of non-human genomes reduced the number of human genes serving as potential proof of gene transfer to fewer than fifty, and implied that even that remaining set might be disqualified by further research. However, considering the staggering scale at which the non-human genomes held in databases are contaminated with the DNA of the scientists handling the samples, any such identification needs to be carefully screened for false positives.

In view of the fact that a healthy human body contains trillions of microorganisms, ten times as many as it contains human cells, and that the 3.3 million microbial genes present overwhelmingly dwarf the human genome’s 23,000, it is conceivable that some horizontal gene transfer would indeed occur. Current research mapping the human microbiome is considered as important as the Human Genome Project, since the 10,000 microbial species present in the human body, mostly in its gastrointestinal tract, play a critical role in the very survival of the human species. Microbes are responsible for digesting our food and for producing vitamins and anti-inflammatories needed for our immune response, but they also need to be taken into consideration when devising treatments for human diseases, and not only those of a bacterial or viral nature.

The next time we venture into the great outdoors and feel like we need to protect ourselves from the “dirt and contamination” ubiquitous outside of our aseptic homes, maybe we should consider that we are, in fact, an integral part of nature, and that its microbes are part of us in far more ways than one.

2013-04-30

Bloodless: Micro- and Nano-Surgery


Micro- and nanotechnology have started to revolutionize the one medical specialty that always held the closest of analogies to a mechanistic view of the human body: surgery.

The principles are readily obvious:
  1. The smaller the invasive mechanical tools, the smaller the required incision, the lower the need for anesthesia, and the lower the impact of overall operative trauma on the organism and the risk of infection.
  2. The smaller the invasive mechanical tools, the more likely it is that surgical (that is, mechanical) approaches can be relied on to perform the work previously entrusted to biochemical (that is, pharmaceutical) intervention with its inevitable, often extensive, and often numerous side effects, including the need to rely on complex and uncontrollable therapeutic mechanisms deemed “irrational” from a scientific perspective and generally subsumed under the notion of placebo effects.
  3. The smaller the invasive mechanical tools, the closer they come to the molecular level in the category of nanotechnology, the more the distinction between surgical and pharmacological intervention blurs. With one critical distinction, though: conventional wisdom has it that a surgical tool needs to remain under the surgeon’s control at all times whereas the molecules of any pharmacological substance by definition operate on a stochastic basis once released into the human body.
  4. The smaller the invasive mechanical tools, the lower, or even nonexistent, the need for blood transfusions becomes. For patients who are Jehovah’s Witnesses, avoiding transfusion may be a religious tenet, while for Muslims transfusions may be limited by religious concerns (such as the faith of the donor); aside from such considerations, the procurement and management of an adequate blood supply is not without substantial risks even in present times.
  5. Insertion of micro- and nano-technological mechanical tools into human vessels and body cavities entails certain challenges that can be summarized as issues of “command and control,” not unlike remote tactical direction exercised by a battlefield commander. This involves not only the direction of the tool with precision and accuracy but also fundamentally revolutionized imaging technology in order to maintain orientation, overview, and indeed overall systemic control over the operation.

It is also no surprise that technology developed for medical uses in small, tight spaces very often has multiple uses, sometimes even military, security and industrial ones.

With cardiovascular, cerebrovascular and ischemic disease, but also cancer, as the primary killers in the U.S. and worldwide, accounting for about half of all deaths, the potential market for developing and advancing related technologies probably rivals the customer base of the pharmaceutical industry. Even if expertise continues to become far more ubiquitous through telemedicine, including remote surgery and videoconferencing, the cost of surgical intervention will most likely, at most times and even in the very long term, exceed the cost of a pharmaceutical solution – but not if the therapy requires maintenance drugs that must be taken for extended periods or even a lifetime, or if it needs to be weighed by comparative effectiveness. Another factor that requires a paradigm shift in how medical care is treated by accounting standards is the increased promptness with which patients can go home and back to work.

Minimal procedures adapt to the size and location of the problem. Open-heart surgery becomes necessary in far fewer instances, and there is hope for similar developments wherever the human body permits access through vessels or body cavities in some, however indirect and roundabout, way. But an extra hour or two of the surgical team’s time can shorten recovery periods by weeks or months, and reduce mortality risks attributable to distinguishable but still surgery-related factors such as infections, hemorrhaging, or the delayed effects of structural damage to tissue during surgery.

Methods based on catheters were originally developed in cardiology to address leaky heart valves, arrhythmias including atrial fibrillation (a major stroke risk), and atrial septal defects. Catheters are also the basis of balloon angioplasty and of the placement of drug-eluting and, more recently, dissolvable stents, while catheter-based zapping of nerves in the renal region found to drive hypertension promises a cure for that condition. Lysis of blood clots in the brain is thus far based primarily on pharmacological methods but may eventually be taken over by nanotechnology.

The more technology advances, the clearer it becomes how many challenges remain unresolved: vascular plaques and blood clots call for maintenance of vascular walls and cardiac valves not unlike that of pipe drains. With growing knowledge of dietary and environmental impact on our vascular system over time, and recognition of the insufficiency of “diet and exercise,” the systemic nature of this problem and its effect on all organs becomes as obvious as the insufficiency of a solution based on pharmaceuticals alone. While medication may be effective in slowing the process down, it has rarely if ever been shown to reverse it, which would be necessary to begin speaking of a “cure” in the sense of turning back the clock by a few decades – as would be necessary to create a significant impact on human life expectancy and quality of life.

Minimally invasive microsurgery and surgical nanotechnology are by no means limited to cardiac and vascular issues. Laparoscopic and endoscopic methods have taken over increasingly complex tasks in the abdominal cavity and along the entire digestive tract. Fallopian tubes and ovaries are now operated on under imaging magnification factors of 15-30x with heretofore unknown microthread material. Similar developments are transforming surgical intervention in otorhinolaryngology and, where it is perhaps most noticed by the public, in ophthalmology.

But arguably the most challenging frontier is neurosurgery. Interventions in the central nervous system as well as at peripheral nerves become increasingly possible as our tools begin to resemble a fine brush more than a sledgehammer. Operating inside the intact skull and in the extremely tricky anatomy of the spine requires miniaturization of operating tools as well as miniaturization of imaging, remote guidance, and drainage systems. This becomes increasingly accessible with continuing advances in material science, microfiber-optics and IT-supported imaging and control systems. By reducing the size of the solution to the size of the problem, surgical intervention may live up to its original mechanistic view of the human body that has been denied throughout so many centuries by mystics and spiritualists. And yet, in the molecular dimension, the mechanistic view of the body may yet surprise us with a truly spectacular renaissance. 

The Death of Science

What happens when you make a Christian Scientist the chairman of the Committee on Science, Space and Technology of the House of Representatives? Well, he might just start believing that he really is a scientist, and is qualified to review the scientific peer review process. Strike that: to overhaul the scientific review process in its entirety.

Remember that guy, Stalin? He was on to something. The state was not going to sponsor, or even tolerate, imperialistic bourgeois free-roaming research. Fast forward to 2013: Stalin is gone, the Soviet Union is gone, Lysenkoism is gone (look up Lysenkoism, or, better yet, suppressed research in the Soviet Union). Now the US is allegedly the sole remaining superpower, and it makes it plenty clear: The US is not going to sponsor any politically incorrect or otherwise subversive and not directly “useful” research.

The first incredible step happened as recently as March 2013. Among the 600-something pages of legislation keeping the government from shutting down was a neatly snuck-in amendment saving “the American people” – gasp! – $11 million (that’s less than the cost of one good old F-16, or 8% of one F-35). This brilliant idea suppressed not only ‘wasteful and inessential spending.’ It also took care of politically incorrect intellectuals. In particular, the bill eliminated the source of 95% of funding for political science studies, i.e., the funding by the National Science Foundation, unless such studies are deemed by the NSF director to be “relevant to national security or U.S. economic interests.” Out with unproductive research, make yourselves useful and contribute to the rising glory of The World’s Superpower!

It appeared promising enough as a precedent, and so, only a month later, we have another brilliant proposal: why only political science? Why not subject National Science Foundation’s entire $6.9 billion budget to a test of political and economic usefulness? How about having “every NSF grant application include a statement of how the research, if funded, ‘would directly benefit the American people’”? The Committee on Science, Space and Technology would surely be happy to verify whether this basic criterion has been properly applied, just like its chairman demanded records of the peer review process on such useless and questionable research as “The International Criminal Court and the Pursuit of Justice.” Did putting a man on the moon somehow “benefit the American people”?

My math friends will certainly not be happy. How do you justify with utility a line of research that, by definition, is abstract and ‘pure,’ as opposed to ‘applied’? Are they to change their specialties altogether, so they can justify their research with military or computer security applications? Are they to abandon entire areas of mathematics that, by any conceivable stretch of one’s imagination, cannot be said at the time it is developed – or even ever – to “directly benefit the American people”? And just how much will all this social engineering of scientific research save? Current NSF awards in algebra and number theory total some $111 million (much less than one F-22 at $143 million). Geometric analysis costs the NSF $72 million, and topology, a ground-breaking area of pure mathematics since the late nineteenth century, a mere $66 million. By comparison, the F-35 program costs $396 billion (that's billion, not million), with an additional low estimate of $1.1 trillion in maintenance and servicing costs. It is seven years behind schedule and 70% over cost estimate. A single F-35, of which several may be expected to crash during testing, training, or accidents over time, is expected to “directly benefit the American people” to the tune of an out-of-pocket price tag of $137 million.

If Congress is so concerned with not ‘wasting money’ on research, how about extracting dollars where they are made by relying on purportedly “not directly useful” research? In particular, academic publishers charge university libraries exorbitant prices, up to $40,000 per journal, so academics may gain actual access to the very same research that the NSF and universities originally sponsored. This research is paid for by grants and universities. The editors and peer reviewers are certainly not paid by the journals – they are considered volunteers, honored to serve science. Currently, even formatting is done by authors and editors. So where exactly is the investment of publishers such as Elsevier? Their entire ‘investment’ is spent on paper and distribution. And, in the case of online access, server space. Harvard announced last year that it can no longer afford to pay extortionate prices for scholarly journals costing its libraries $3.5 million a year. Considering that even small university libraries would still have to spend hundreds of thousands of dollars a year to stay abreast of scientific developments, Congress could recover at least part of, if not more than, the money dedicated to research today by tapping into this gold mine that is currently virtually monopolized by a handful of commercial publishers whose substantive contribution to the scientific process and its funding is exactly zero.

And such an obvious and logical move by Congress might actually turn American science back from its path to a hospice. The day this country ceases to be the world’s leading producer, by a mile, of intellectual property and scholarship is the day it will effectively cease to be a superpower. One might envision a lively, controversial discussion with the populist congressional budget-cutters on how that “directly benefits the American people.” America is built on leadership by ideas, after all. Research, even superficially ‘useless’ fundamental research, is utterly indispensable for attracting top talent and thus for obtaining top results. It is a kind of ‘trickle-down economics’ in which talent and results from fundamental and ‘useless’ research (such as pure mathematics!) eventually find uses that change the world and how we see it. One cannot expect a scientist applying for a grant to present a clear and convincing view of the utility of his research before the discovery is made, years and additional grants down the road. Ignoring this reality surely meets one of the many definitions of insanity.

2013-04-26

Will Separatist Secession Strengthen Europe by Tearing History Apart?

Lest we forget: the last two millennia of history, along with dozens of languages and dialects, have bequeathed on virtually every remote corner of Europe at least theoretical desires for autonomy and independence, and the fermentation of “regionalist” challenges to the faits accomplis of the nation state has now reached new heights. The European Union, Nobel Peace laureate of 2012, is currently faced with votes on no fewer than four attempts at an orderly secession: in Scotland, Flanders, Catalonia, and the Basque Country. And there are many more to come if any of those succeed. None of the regions currently aspiring to statehood has expressed any desire to leave the EU or to become a tax haven. Quite the contrary: all aspire to renewed membership, and Scotland’s movement now even proposes to join the Euro. While the Scottish independence movement is supported by only a minority, a ¾ supermajority backs independence in Catalonia, and in the Basque Country the very recent vote of October 21, 2012 produced a verifiable and recognized 64% separatist majority – almost ⅔.

2013-04-25

Wittgenstein, Modeling and the Notion of Logical Space

Many attempts at understanding Wittgenstein climax in the wistful prayer that he had himself observed the closing proposition of his Tractatus logico-philosophicus: “Whereof one cannot speak, thereof one must be silent.” And indeed, he never published another philosophical book in his lifetime. Deliberately speaking about logical space without the aid of formulaic language tempts one to reminisce about that mantra. In the terse, minimalistic language of the Tractatus, the facts in logical space are the world. In his ontology, the world is determined by the facts, and by these being all the facts, for the totality of facts determines both what is the case and all that is not the case. Mindful that “the world is all that is the case,” “the totality of facts, not of things,” Wittgenstein had arrived at the notion of logical space by reading Bertrand Russell, Georg Cantor and Gottlob Frege. The term does not figure prominently in the text of the Tractatus, but it does play a major role conceptually: logical space is, of course, a notion from set theory, an analogy to physical space modeled on Boltzmann’s and Hertz’s idea of a multi-dimensional space of physical possibility, comprising any system of relations that have the same or similar logical properties. The logical space of the Tractatus is nothing other than the set of all possible worlds, of which the real world is merely one element: the only collection of facts in logical space whose elements are, without exception, facts. Hence logical space is infinite. Logical form, then, represents the possible within logical space. Wittgenstein’s method is that of logical atomism, in which every possible statement about a complex object consisting of parts can be reduced to the sum of statements about its parts.
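
For readers who want a formal gloss, one common set-theoretic reading of these remarks can be sketched as follows; this is an illustrative reconstruction in modern notation, not Wittgenstein’s own formalism.

```latex
% Illustrative reconstruction (not Wittgenstein's own notation).
% S denotes the set of elementary states of affairs; a possible world assigns
% to each state of affairs the value 1 (it obtains) or 0 (it does not).
% \llbracket / \rrbracket come from the stmaryrd package.
\begin{align*}
  \mathcal{L} &= \{0,1\}^{S}
    && \text{logical space: the set of all possible worlds}\\
  W &\in \mathcal{L}
    && \text{the actual world is one element of logical space}\\
  \llbracket p \rrbracket &\subseteq \mathcal{L}
    && \text{the sense of a proposition: the worlds in which it is true}\\
  \llbracket p \wedge q \rrbracket &= \llbracket p \rrbracket \cap \llbracket q \rrbracket
    && \text{logical atomism: complex propositions as truth-functions of elementary ones}
\end{align*}
```

On this reading, the actual world’s only privilege is that it is the one element of logical space picked out by the totality of facts.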

2013-03-07

Imaginary and Real Limits of Science and Cognition

Two books, William Byers’ How Mathematicians Think and John D. Barrow’s Impossibility, at first glance seem to have nothing in common: the former deals with the process of creating mathematics, while the latter discusses the outer limits of science. But closer inspection quickly reveals that the two works are eerily similar in content and ideas, and while their authors seem to disagree on particular points of their respective exposés (such as the ordered relation between mathematics and logic), neither presents a particularly objective or dispassionate outlook on science or mathematics itself.

The purported goal of How Mathematicians Think is to acquaint the reader with the fundamental creative processes of mathematics, in particular those based on ambiguity, contradiction, and paradox. But the increasingly technical approach of the book raises the question of its intended audience: mathematicians will find the content trivial and self-evident – they need not be reminded of rather basic mathematical concepts, nor be told how they do what they do – while non-mathematicians will find the text abstruse and forbidding, and so may never learn what the author really tried to convey: the notion of mathematics as a creative, albeit formal and rigorous, art.

Impossibility focuses more on science in general (including the social sciences); yet, unsurprisingly, mathematics still remains the focus of the author, John Barrow, a Cambridge mathematician himself. Here, too, the reader is faced with a plethora of scientific and mathematical curiosa. Many of them are identical to those used in Byers’ book, but, while unquestionably interesting, the wealth of graphs, pictures, theorems and equations still leaves the reader asking the classic question: “And the point is?”

Both authors aimed for a book about mathematics, or about science, without imposing much linear structure on their presentation. After briefly presenting the effect of the basic concepts of Platonism and constructivism on the philosophy of mathematics, Byers at long last discloses his real, if somewhat anticlimactic, purpose: a spiritual quest inspired by the school of Zen, in which he employs introspective musings on the nature of mathematics to fulfill his transcendental yearnings for spiritual completeness. Barrow, after disposing of the relevance of religion to science and mathematics, addresses himself to another of the speculative arts: philosophy.

Having understood the objectives both authors set forth on the first pages of their introductions, the reader is puzzled by pages upon pages of prolix elaboration, vividly and frequently interspersed with refreshing intermezzi of fun mathematical factoids that make the reading feel like strolling through a collection of curiously interesting oddities.

Sadly, science has long been misunderstood as one of the least creative fields of human activity – the popular image of a scientist portrays a person continuously engaged in trial and error, over and over, until the moment of random discovery. And yet, without creative thinking no discovery would be possible, no secret of nature unearthed. Encountering a book that deals not so much with science as with the philosophical aspects of science must be disconcerting for a reader unfamiliar with the subject matter. Still, such books provide a valuable glimpse into the workings of a scientific mind, an introduction to the kind of big questions that researchers may eventually reach.

And yet the approaches of Byers and Barrow both seem to be missing something centrally important to science and mathematics: dispassionate objectivity, uninfluenced by religious, spiritual, ideological or philosophical ideas that have nothing to do with science or mathematics itself, but everything to do with an individual need for soul-searching and for establishing a value structure. Whether taking refuge in the dicta of philosophers or in religious or spiritual traditions, a scientist does not thereby contribute to the knowledge pool as he purports to do; he merely limits his own potential by allowing himself to be weakened by a quest for the higher purpose and moral guidance that would appear to let him “dare to contribute to Pandora’s box of human knowledge.” Experience shows, however, that ‘thinkers’ and ‘gurus’ are among the worst compasses that may guide a scientist, since they lack both scientific understanding and objectivity. And aren’t thinkers almost universally known to be more apt to devote themselves to building a faithful base of followers than to helping individuals succeed and potentially surpass their erstwhile teachers? Ideally and in the abstract, the primary objective of scientists is to further science (without, of course, entirely neglecting to make a name for themselves in the process). By contrast, the purpose of a guru or philosopher is to shape the world according to his individual convictions and beliefs (without, again, neglecting to make a name for himself). This inherent conflict of interests is all too readily apparent, and while it is only natural that a scientist’s human side might feel an occasional need for spiritual or philosophical guidance, such crutches for the mind and soul should never be mistaken for scientific guidance, lest we obtain a generation of theology-bound scientists who either end up ostracized, if not burned at the stake, personally or vicariously through their books, or ready to close their eyes to evidence not in line with the dominant, and therefore “orthodox,” ideology. That was generally the seemingly eternal status quo of science before the Renaissance (and in some areas well into the 19th century) under various ideological and religious regimes of “truth in certitudes.”

Those capable enough to push the known outer limits of science usually have no time or resources to speculate and pontificate about “definitive” limits. That is best left to those with few ideas about science in the first place, such as philosophers and other intellectual speculators. The two books discussed here are an interesting exception to that common observation: both were written by practicing mathematicians who, for one reason or another, decided to stop on their path of discovery, look around, and speculate about where it may eventually take them and society as a whole. Evidently, the authors saw valid reasons to take time off from what they are gifted enough to pursue as a day job for the sake of a much less demanding occupation. But whatever the authors’ personal needs, their results should by no means be confused with authoritative findings of actual scientific exploration. Newton, for one, was right and visionary in a great many areas, but his opinions on alchemy should be taken, politely, with a huge grain of salt. Even mathematicians expounding on an emerging “philosophy of mathematics” are, if truth be told, not necessarily authorities. Unless the writer is one of those rare giants of thought, say, Leibniz or Russell, someone writing about the philosophy of science is frequently either not enough of a philosopher or not enough of a scientist to convince, or at least intrigue, a discerning reader. Attention-grabbing, catchy and conclusory statements, like Emil du Bois-Reymond’s famous “ignoramus et ignorabimus,” are much too often just plainly wrong.

The debate comes down to intellectual clarity and honesty about disclosing purpose and method, and to frequently reviewing one’s adherence to both. Epistemological concepts of truth, belief and understanding are critical to cognitive success, but they yield only partial results for the analysis of limits to knowledge and discovery. Still, one might argue, much to the chagrin of theorists, that those limits will remain congruent with the limits of randomness and accident itself, so long as accidental discovery and the recombination of existing knowledge into new discoveries remain possible. To date, no meaningful reason has been presented in the abstract as to why limits other than temporary, technology-dependent ones should exist, or why the human capacity for cognition should be limited by anything other than the capacity and processing-power limitations of the human brain – and of its man-made extensions.

Conceivably, theoretical limitations may exist in mathematics, where the finite human life span might not allow enough time to acquire the mathematical tools, and the skill in their use, required to push the limits of the common body of knowledge further out. It is easy to imagine that this might eventually apply even to very narrow and specific research topics. Yet, at the same time, it is overwhelmingly likely that the same state of cognitive evolution would not be limited to mathematics alone but would also yield equally improved technological support for information processing and analysis. That would once more level the playing field, keeping more or less constant and manageable the distance one generation must bridge between the legacy of knowledge handed down by its forefathers and the significant discoveries of its own, acquired in a however extended lifetime.

Another scenario, albeit one whose potential implications for the human race cannot be adequately assessed beyond imaginative science fiction, involves delegating mathematical research tasks to artificial intelligence under some mechanism of joint control. This may, in fact, represent the one true limit of science because it is inconceivable that such a mechanism would not eventually fail as power to control the process is usurped once the genie leaves the bottle at a point of greater-than-human intelligence known as a technological singularity, never to return. Such a moment would constitute an intellectual event horizon beyond which events can no longer be predicted or understood.