Wednesday, 30 May 2018

                                  ON THE CURRENT EMPIRICAL STATE OF NEUROSCIENCE

                                                 A Review of David Dunson's centennial talk 

                                                           by Thomas H. Leonard

THE FOLLOWING DISCUSSION GETS SCIENTIFIC

SUMMARY FOR THE LAYMAN: VIRTUALLY EVERY ATTEMPT TO OBJECTIFY BRAIN FUNCTION FROM STATISTICAL DATA AMOUNTS TO UTTER B.S.
There are of the order of 100 billion neurons in the human brain, and it is currently TECHNICALLY IMPOSSIBLE to observe their individual behaviour. The latest state-of-the-art magnetic resonance imaging (MRI) can, however, estimate the locations of the million or so white matter TRACTS in the human brain. These tracts are tightly bundled sets of neurons, and the full set of them is known as the CONNECTOME.

        None of this imaging has to date been able to adequately distinguish between mental disorders, or between 'points' on the autism-related spectrum, and any claims by neuroscientists or neuropsychiatrists to the contrary are entirely misleading. If you see an impressive-looking coloured picture of the brain, then it is very likely to have been based on quasi-science, i.e. gobbledygook.

       From a statistical viewpoint, it is in principle possible to relate an individual's connectome pattern to a variety of different factors, e.g. substance abuse, education, and presumed mental condition. It would, however, be necessary to make some pretty strong sampling distribution assumptions on a highly complex function space for the patterns of connectomes, and the model would also need to capture the 'regression' of these patterns on the explanatory variables.
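       To fix ideas, here is a minimal sketch, in Python, of the far simpler scalar version of such a regression, run on entirely synthetic data. The variable names, coefficients, and sample size here are all hypothetical, and a genuine connectome analysis would have a function-valued response rather than a single 'connectivity' number.

```python
# A toy illustration (NOT the HCP analysis): regress a scalar summary of
# "connectivity" on substance use and education, using synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 100                                    # sample size, as small as the talk's

# Hypothetical explanatory variables
substance_use = rng.binomial(1, 0.3, n)    # 1 = heavy use (hypothetical coding)
education = rng.normal(14, 2, n)           # years of schooling (hypothetical)

# Synthetic "connectivity" summary; a real analysis would instead need a
# sampling model on a high-dimensional function space of tract patterns.
connectivity = 5.0 - 1.5 * substance_use + 0.2 * education + rng.normal(0, 1, n)

# Ordinary least squares via numpy's least-squares solver
X = np.column_stack([np.ones(n), substance_use, education])
beta, *_ = np.linalg.lstsq(X, connectivity, rcond=None)
print("intercept, substance, education:", beta)
```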

       From a Bayesian perspective, it would then be necessary to select, fairly arbitrarily, a highly complex prior distribution for the swathe of unknown parameters in the model. Owing to these complexities, the prior-to-posterior computations would undoubtedly need to be attempted using some form of the Metropolis algorithm, a strange, in-vogue procedure introduced into Statistics by the 'Bayesian politicians' during the 1990s, and one which very rarely converges to the right answer, even under much simpler sampling and prior assumptions.
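       For the uninitiated, here is a bare-bones sketch of a random-walk Metropolis sampler, targeting a one-dimensional toy posterior. It is orders of magnitude simpler than anything a connectome model would demand, which is precisely the point: even in this trivial setting, the draws are only trustworthy after careful convergence checking.

```python
# Minimal random-walk Metropolis sampler on a toy 1-D target.
import math
import random

def log_posterior(theta):
    # Toy target: a standard normal log-density, up to an additive constant.
    return -0.5 * theta * theta

def metropolis(n_iter=10_000, step=1.0, theta=0.0):
    samples = []
    log_p = log_posterior(theta)
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)
        log_p_prop = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio).
        if random.random() < math.exp(min(0.0, log_p_prop - log_p)):
            theta, log_p = proposal, log_p_prop
        samples.append(theta)
    return samples

draws = metropolis()
print("sample mean:", sum(draws) / len(draws))  # near 0 for this toy target
```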

       Such were the general themes of an outstanding presentation by the world-leading Bayesian statistician Professor David Dunson of Duke University at the David Finney Centennial Lecture in Edinburgh on 29th May 2018. As he himself was very ready to clarify, his tentative analysis of the observations on fewer than 100 individuals, in an already heavily skewed, unreplicated sample, epitomised our concerns about the empirical state of neuroscience today.

      Any attempt at using Statistics to infer causality would of course, as always, be entirely fallacious, as David Hume, the great philosopher of the Scottish Enlightenment, clarified as early as the 18th century. Causality can never be inferred from correlation alone, because of the possible influence of any number of 'lurking' or 'confounding' variables. The claims in the penultimate sentence of Professor Dunson's synopsis (reproduced below) are therefore highly speculative, to say the least.
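      Hume's point can be made vivid with a toy simulation: in the sketch below, X and Y share a common cause Z but exert no causal influence on each other whatsoever, yet they emerge strongly correlated. All of the numbers are, of course, hypothetical.

```python
# Toy simulation of a 'lurking' variable: X and Y share a common cause Z
# but have no causal link, yet they end up strongly correlated.
import numpy as np

rng = np.random.default_rng(1)
z = rng.normal(size=10_000)                   # hidden confounder
x = z + rng.normal(scale=0.5, size=10_000)    # X caused by Z only
y = z + rng.normal(scale=0.5, size=10_000)    # Y caused by Z only

print("corr(X, Y) =", np.corrcoef(x, y)[0, 1])  # ~0.8, despite no X->Y effect
```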

        Arch-Bayesians need to become better acquainted with the rules of Statistics, as recounted by George Box, Stuart Hunter, the Wisconsin School, the 19th-century American philosopher Charles Peirce, the (albeit harmful eugenicist) Sir Ronald Fisher, and many others.




                                                DAVID DUNSON'S CENTENNIAL LECTURE


                                                                       SYNOPSIS


""""There have been parallel revolutions in recent years in technology for imaging of the human brain and in methods for analyzing high-dimensional and complex data. We are interested in exploiting and building on this technology motivated by interest in studying how people vary in their brain connection structure. White matter tracks in the human brain consist of tightly bundled sets of neurons that are geometrically aligned and act as highways for neural activity and communication across the brain. There are on the order of a million such tracts in a normal human brain, and their locations can be estimated using different types of magnetic resonance imaging (MRI) combined with state-of-the-art image processing. We refer to the set of tracts as the human brain “connectome.” The Human Connectome Project (HCP) collects data on connectomes, along with multiple behaviours and traits of each individual under study. We develop state of the art data science tools to study variation in connectomes, and the relationship with factors, such as substance use (alcohol, marijuana) and education. We find a significant relationship between brain connectivity and multiple factors, with high levels of substance use decreasing connectivity and education increasing connectivity. This talk is designed to be accessible to the general public, focusing on describing these amazing new data resources, analysis tools, and results, with a discussion on ongoing directions""""


                                                                         
                 
