Saturday, November 28, 2009

ALH 84001: Discovery or Diversion?

Nasa is to announce on Monday, 30 November, "evidence of life on Mars" based on a reanalysis of the Antarctic meteorite ALH 84001.

Wikipedia gives us a general outline:

Allan Hills 84001 (commonly abbreviated ALH 84001) is a meteorite that was found in Allan Hills, Antarctica on December 27, 1984 by a team of US meteorite hunters from the ANSMET project. Like other members of the group of SNCs (shergottite, nakhlite, chassignite), ALH 84001 is thought to be from Mars. On discovery, its mass was 1.93 kg. It made its way into headlines worldwide in 1996 when scientists announced that it might contain evidence for microscopic fossils of Martian bacteria.

The Times has a hyped story on this:

Nasa scientists have produced the most compelling evidence yet that bacterial life exists on Mars.

It showed that microscopic worm-like structures found in a Martian meteorite that hit the Earth 13,000 years ago are almost certainly fossilised bacteria. The so-called bio-morphs are embedded beneath the surface layers of the rock, suggesting that they were already present when the meteorite arrived, rather than being the result of subsequent contamination by Earthly bacteria.

“This is very strong evidence of life on Mars,” said David McKay, a senior scientist at the Nasa Johnson Space Centre, who was part of the team of scientists that originally investigated the meteorite when it was discovered in 1984.

In a 1996 study of the sample, Dr McKay and others argued that the microfossils were evidence of life, but sceptics dismissed the claims, saying that similar-shaped structures might not be biological. The new analyses, the product of high resolution electron microscopy, make a strong case for the Allan Hills 84001 meteorite having carried Martian life to Earth. The microscopes were focused on tiny magnetite crystals present in the surface layers of the meteorite, which have the form of simple bacteria. Some argued that these could be the result of a carbonate breaking down in the heat of the impact.

The abstract of the paper reads:

The Martian meteorite ALH84001 preserves evidence of interaction with aqueous fluids while on Mars in the form of microscopic carbonate disks. These carbonate disks are believed to have precipitated 3.9 Ga ago at the beginning of the Noachian epoch on Mars, during which both the oldest extant Martian surfaces were formed, and perhaps the earliest global oceans. Intimately associated within and throughout these carbonate disks are nanocrystal magnetites (Fe3O4) with unusual chemical and physical properties, whose origins have become the source of considerable debate. One group of hypotheses argues that these magnetites are the product of partial thermal decomposition of the host carbonate. Alternatively, the origins of magnetite and carbonate may be unrelated; that is, from the perspective of the carbonate the magnetite is allochthonous. For example, the magnetites might have already been present in the aqueous fluids from which the carbonates were believed to have been deposited. We have sought to resolve between these hypotheses through the detailed characterization of the compositional and structural relationships of the carbonate disks and associated magnetites with the orthopyroxene matrix in which they are embedded. Extensive use of focused ion beam milling techniques has been utilized for sample preparation. We then compared our observations with those from experimental thermal decomposition studies of sideritic carbonates under a range of plausible geological heating scenarios. We conclude that the vast majority of the nanocrystal magnetites present in the carbonate disks could not have formed by any of the currently proposed thermal decomposition scenarios. Instead, we find there is considerable evidence in support of an alternative allochthonous origin for the magnetite unrelated to any shock or thermal processing of the carbonates.

This suggests a biological connection, insofar as geological processes seem to be eliminated. The paper has been online since June, and one might wonder whether this is a 'good news' counterbalance to the Climategate problem.

Saturday, November 21, 2009

Climate Mafia, Mediocre Mathematicians and the Mind Projection Fallacy

The strength of a government depends on the people’s ignorance. Moreover, he said, the government is aware of this and would therefore always fight against the people’s education.


The release of emails and other data, exposing the mindset of climate scientists who resisted dissent by distorting or withholding datasets, will raise many questions; the main question that needs to be asked, however, is why global warming has stopped.

The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong.
- Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research and a lead author of the 2001 and 2007 IPCC Scientific Assessment of Climate Change

This is a clear case of the mind projection fallacy, (my own ignorance) = (Nature is indeterminate); see e.g. Jaynes, Probability Theory as Logic:

It seems that mankind has always been occupied with the problem of how to deal with ignorance. Primitive man, aware of his helplessness against the forces of Nature but totally ignorant of their causes, would try to compensate for his ignorance by inventing hypotheses about them. For educated people today, the idea of directing intelligences willfully and consciously controlling every detail of events seems vastly more complicated than the idea of a machine running; but to primitive man (and even to the uneducated today) the opposite is true. For one who has no comprehension of physical law, but is aware of his own consciousness and volition, the natural question to ask is not "What is causing it?", but rather "Who is causing it?"

The answer was to invent Gods with the same consciousness and volition as ourselves, but with the additional power of psychokinesis; one in control of the weather, one in control of the seas, and so on. This personification of Nature must have been going on for thousands of years before it started producing permanent written records, in ancient Egypt and Greece. It appears that the adult citizens of those times really believed very literally in all their local Gods. This oldest of all devices for dealing with one's ignorance is the first form of what we have called the "Mind Projection Fallacy".

One asserts that the creations of his own imagination are real properties of Nature, and thus in effect projects his own thoughts out onto Nature. It is still rampant today, not only in fundamentalist religion, but in every field where probability theory is used. Of course, we are not arguing against a scientist's practice of formulating hypotheses about what is happening in Nature. Indeed, we see it as the highest form of creativity, far transcending mere mathematical manipulative skill, to conceive the right hypothesis in any field, out of one's educated imagination. Copernicus, Newton, Faraday, Darwin, Mendel, Pasteur, Wegener, Einstein are our heroes for having done this. The difference between an imaginative scientist on the one hand, and primitive man and religious fundamentalists on the other, is that the scientist clearly recognizes the creations of his imagination as tentative working hypotheses to be tested by observation; and he is prepared to test and reject a hundred different hypotheses in order to find the right one.

Indeed, the interbreeding of ideas within a small community of very ordinary scientists has seen a field stagnate into a toxic cesspool that controls peer review, browbeats editors, and censors ideas, plausible or otherwise, as heresy.

Tommy Gold spoke of this at Nasa:

Another area where it is particularly bad is in the planetary sciences where NASA made great mistakes in the way in which they set up the situation. NASA made the grave mistake not only of working with a peer review system, but one where some of the peers (in fact very influential ones) were the in-house people doing the same line of work. This established a community of planetary scientists now which was completely selected by the leading members of the herd, which was very firmly controlled, and after quite a short time, the slightest departure from the herd was absolutely cut down. Money was not there for anybody who had a slightly diverging viewpoint. The conferences ignored him, and so on. It became completely impossible to do any independent work. For all the money that has been spent, the planetary program will one day be seen to have been extraordinarily poor. The pictures are fine and some of the facts that have been obtained from the planetary exploration with spacecraft - those will stand but not much else.

Indeed, the decrease in scientific diversity (interbreeding) is akin to that of an Appalachian mining town.

Clearly these limitations are skill-based, and one could conjecture that this Western phenomenon is a result of the antiscientific mathematical constraints outlined by Vladimir Arnold.

In the middle of the twentieth century a strong mafia of left-brained mathematicians succeeded in eliminating all geometry from the mathematical education (first in France and later in most other countries), replacing the study of all content in mathematics by the training in formal proofs and the manipulation of abstract notions. Of course, all the geometry, and, consequently, all relations with the real world and other sciences have been eliminated from the mathematics teaching.

Define the multiplication of natural numbers by the long multiplication rule. The commutativity of the multiplication (ab = ba) becomes then a difficult theorem, which one can however deduce logically from the definition. Forcing poor students to learn such proofs, the left-brained criminals had inevitably created the present negative opinion, of society and governments, of mathematics.

But the left-brained ill people have succeeded in breeding generations of mathematicians, who understand no other approach to mathematics and are able only to continue to teach it the same way. The aversion to mathematics of the ministers who have suffered through the humiliating teaching of this type in high school is a normal and healthy reaction.

Unfortunately, their aversion to mathematics is acting indiscriminately on all of it and can kill it completely. One of the dangerous trends is to eliminate the proofs from the high-school mathematics. The role of the proof for mathematics is similar to that for orthography or even calligraphy for poetry. A person, who had not mastered the art of the proofs in high school, is as a rule unable to distinguish correct reasoning from that which is misleading. Such people can be easily manipulated by the irresponsible politicians. Mass hypnosis and the disastrous social events may result.

The continuing dumbing down of mathematics and science is criticized by the Royal Society of Chemistry in its comments on UK examination standards.

The question ran: 'A solid cube has sides of length 5 cm. Work out the total surface area of the cube. State the units of your answer.'

Dr Pike said, 'The correct answer can be arrived at within seconds, by noting that the area of each face of the cube is 5 x 5, or 25 square centimetres, and there are six faces to a cube, so that the result is 150 square centimetres.'
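Dr Pike's arithmetic can be checked in seconds; a trivial Python sketch (mine, not from the RSC commentary):

```python
# Surface area of a cube with 5 cm sides: six identical square faces.
side = 5                      # cm
face_area = side * side       # 25 cm^2 per face
total_area = 6 * face_area    # six faces
print(total_area, "cm^2")     # 150 cm^2
```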

He added, 'Two important issues emerge from this. Firstly, schools in the independent sector are moving increasingly to International GCSEs (IGCSEs), which are seen as more demanding, and prepare pupils more effectively for A-Level. This is creating a two-tier system in education, since these are not recognised in the state sector, and is further attractive to schools because the curriculum is relatively stable, unlike for GCSEs where some observers see the incessant modifications as principally an income source for those involved in producing educational materials.

'Secondly, it raises the questions: Who is in charge of GCSEs? Who monitors? Who challenges? Why is it that, with dozens of agencies, authorities, boards, institutes and quangos, these extraordinary outcomes still surface?

Dr Pike said, 'Until we all get to grips with this fiasco, with some tough talking, this country risks sliding down the road to mediocrity'.

The last word returns to Tolstoy.

I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth, if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.

Tuesday, November 17, 2009

Random walks, or how models drown in a sea of mathematical theory

What I suggest is that in a persistent way, a system may exhibit historical behaviour, instead of recurrence.

David Ruelle, in a question to Yasha Sinai.

Another interesting paper from Demetris Koutsoyiannis shows the divergence of skill between modelers and mathematical physicists.

According to the traditional notion of randomness and uncertainty, natural phenomena are separated into two mutually exclusive components, random (or stochastic) and deterministic. Within this dichotomous logic, the deterministic part supposedly represents cause-effect relationships and, thus, is physics and science (the “good”), whereas randomness has little relationship with science and no relationship with understanding (the “evil”). We argue that such views should be reconsidered by admitting that uncertainty is an intrinsic property of nature, that causality implies dependence of natural processes in time, thus suggesting predictability, but even the tiniest uncertainty (e.g., in initial conditions) may result in unpredictability after a certain time horizon. On these premises it is possible to shape a consistent stochastic representation of natural processes, in which predictability (suggested by deterministic laws) and unpredictability (randomness) coexist and are not separable or additive components. Deciding which of the two dominates is simply a matter of specifying the time horizon of the prediction. Long horizons of prediction are inevitably associated with high uncertainty, whose quantification relies on understanding the long-term stochastic properties of the processes.


"Long horizons of prediction are inevitably associated with high uncertainty,"
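The sensitivity behind that sentence is easy to demonstrate with the logistic map x → 4x(1 − x), a standard toy example (my own illustration, not from Koutsoyiannis's paper): two trajectories started 10^-10 apart become completely decorrelated within a few dozen steps.

```python
# Two trajectories of the logistic map x -> 4x(1 - x), started
# 1e-10 apart. The separation roughly doubles each step (the map's
# Lyapunov exponent is ln 2), so after a few dozen iterations the
# initial agreement is gone: a hard horizon of predictability.
def logistic(x, n, r=4.0):
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

x0 = 0.3
for n in (5, 20, 50):
    gap = abs(logistic(x0, n) - logistic(x0 + 1e-10, n))
    print(n, gap)
```

The gap is negligible at n = 5 and of order one by n = 50, even though the law itself is perfectly deterministic.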

Indeed, as DK notes, citing the Kolmogorov-Chaitin problem of convergence-divergence due to slow-fast oscillators, the relaxation times of the oscillator may be hidden in the mists of time (as with spin glasses), or the randomness may simply be exhibiting historical instead of recurrent behavior, as rigorously proven by Yasha Sinai.

e.g. Ya. G. Sinai (1982), "Limit behaviour of one-dimensional random walks in random environments".
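Sinai's setting is simple to simulate; the sketch below is my own illustration, not Sinai's construction. Each site carries a fixed, randomly drawn bias, and Sinai proved that such a walker is trapped by its environment, spreading only on the scale (log n)^2, far slower than the sqrt(n) of an ordinary random walk.

```python
import random

# A walker on the integers where each site x carries a fixed,
# randomly drawn probability env[x] of stepping right (the quenched
# "random environment" of Sinai's 1982 setting). The environment is
# drawn once per site and then reused on every revisit.
random.seed(1)
env = {}  # site -> probability of stepping right

def step_prob(x):
    if x not in env:
        env[x] = random.uniform(0.1, 0.9)
    return env[x]

x, n = 0, 100_000
for _ in range(n):
    x += 1 if random.random() < step_prob(x) else -1
print("displacement after", n, "steps:", x)
```

For n = 100,000 the typical displacement is on the order of (log n)^2, roughly a hundred sites, rather than the several hundred expected of a simple symmetric walk.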

Some more examination of this paper and the previous post is in order.

Models and muddles: the divergence of science from theory.

It is inevitably necessary to think of all as contained within one nature; one nature must hold and encompass all; ... But within the unity There, the several entities have each its own distinct existence.

Nalimov, Faces of Science, p. 136

Models are abstractions of the real world (and are bounded by our understanding of nature's laws). Models are formally experiments or measurements that unfold our understanding of real-world processes. If a model or experiment is published that violates nature's laws and yet correlates with the experiment, we are left with a simple binary question (and answer): either the model (experiment) is incorrect, or the natural law is wrong.

One would think that this would be a significant barrier to publication in a journal such as Nature or Science, but this seems not to be the case.

Ontogenetic growth: models and theory

Anastassia M. Makarieva, Victor G. Gorshkov and Bai-Lian Li

We re-analyze the assumptions underlying two recently proposed ontogenetic growth models [Nature 413 (2001) 628; Nature 417 (2002) 70] to find that the basic relations in which these models are grounded contradict the law of energy conservation. We demonstrate the failure of these models to predict and explain several important lines of empirical evidence, including (a) the organismal energy budget during embryonic development; (b) the human growth curve; (c) patterns of metabolic rate change during transition from embryonic to post-embryonic stages; and (d) differences between parameters of embryonic growth in different taxa. We show how a theoretical approach based on well-established ecological regularities explains the observations where the formal models fail. Within a broader context, we also discuss major principles of ontogenetic growth modeling studies in ecology, emphasizing the necessity of ecological theory to be based on assumptions that are testable and to be formulated in terms of variables and parameters that are measurable.

Evidently a problem, but the continued repetition of the error in subsequent papers suggests a significant failure of the peer-review process.

e.g. Anastassia Makarieva:

The validity of the fundamental laws of nature and of good theories based on them has been tested on such a great amount of empirical data that it is a good theory that can tell you whether the empirical data are of good or bad quality, rather than the data tell you something about the theory. For this reason, good theories can be used for making predictions, like the existence of many elementary particles was predicted in theoretical physics prior to their actual discovery. How justified is the use of models for making predictions?

During model development the priority is given to reaching a satisfactory agreement between the data and the mathematical structure of the model. On the basis of the available sets of data points taken from the general ensemble of all empirical evidence, the modelers determine linear and non-linear correlations between the chosen measurable variables, including their temporal changes. The resulting time dependence of model variables allows one to make a forecast for the future. Such a forecast, however, is nothing but a limited extrapolation of what has been observed in the past. With changing the empirical datasets, the model structure and forecasts change. With the inclusion of ever growing amounts of observations the models become more and more complex, while their agreement with the available observations naturally improves. Thus, an ideal model ultimately comes as an exact and convenient, i.e. mathematically formalized, representation of all the available data. However, to the degree the model is a model and not a theory, it lacks predictive power, because of the obvious fact that it cannot be expected that the calibrations made on the basis of the known data will remain valid in the domain of predicted (i.e. still unknown) data. This is a conceptual, fundamental problem with the modeling approach. The universal laws of nature predict things.

Based on our own scientific expertise, we can illustrate the above points with specific examples of models that were judged to be most successful based on their agreement with the data and claimed derivability from a "universal" theory, yet were shown to confront the fundamental laws of nature. As one can see, the problem transcends the natural sciences as a whole. The biological model of organismal growth (West et al., 2001) misinterpreted the energy conservation equation and replaced it with one conflicting with the energy conservation law. Despite that, the model showed perfect agreement with the data. After the error was identified (Makarieva et al., 2004) it took the model's authors four years to explicitly admit it (Moses et al., 2008) and re-formulate the model. The re-formulated model, re-calibrated using the same data as the original (wrong) one, showed equally good agreement with the data and got equally well published (Hou et al., 2008). Thus, irrespective of conflicting with the energy conservation law or not, the model agreed with the data, was widely cited, and raised little concern in the reading audience.
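Makarieva's calibration point can be made concrete with a hypothetical toy example (mine, not hers): fit a cubic exactly through data generated by an exponential law. The "model" agrees perfectly inside the calibration range, yet fails badly outside it.

```python
import math

# "Truth" here is exponential growth; the "model" is a cubic fitted
# exactly through four calibration points (Lagrange interpolation),
# so agreement with the calibration data is perfect by construction.
xs = [0.0, 1.0, 2.0, 3.0]              # calibration range
ys = [math.exp(v) for v in xs]         # the observed "data"

def model(x):
    total = 0.0
    for i, xi in enumerate(xs):
        term = ys[i]
        for j, xj in enumerate(xs):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

print(model(2.0), math.exp(2.0))   # identical inside the range
print(model(8.0), math.exp(8.0))   # badly underestimates outside it
```

The calibrated curve is exact at every fitted point, yet at x = 8 it falls short of the true exponential by a large factor: the calibration does not remain valid in the domain of still-unknown data.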

One could reflect on Alvin Toffler's suggestion that the illiterate of the future will not be those who cannot read and write, but those who cannot learn and think.
