Monday, June 25, 2018

selection biases


   selection bias; a common special case is survivorship bias, also called survival bias

 <---------------------------------------------------------------------------->

Mario Livio, Brilliant blunders, 2013                                       [ ]

pp.258-259
p.258
   Statisticians always dread selection biases.  These are the distortions of the results, introduced either by data-collecting tools or by the method of data accumulation.  Here are a few simple examples to demonstrate the effect.  Imagine that you want to test an investment strategy by examining the performance of a large group of stocks against twenty (20) years' worth of data.  You might be tempted to include in the study only stocks for which you have complete information over the entire twenty-year period.  However, eliminating stocks that stopped trading during this period would produce biased results, since these were precisely the stocks that did not survive the market. 
p.259
   During World War II, the Jewish Austro-Hungarian mathematician Abraham Wald demonstrated a remarkable understanding of selection bias.  Wald was asked to examine data on the location of enemy fire hits on bodies of returning aircraft, to recommend which parts of the airplanes should be reinforced to improve survivability.  To his superiors' amazement, Wald recommended adding armor to the locations that showed no damage.   His unique insight was that the bullet holes that he saw in surviving aircraft indicated places where an airplane could be hit and still endure.  He therefore concluded that the planes that had been shot down were probably hit precisely in those places where the persevering planes were lucky enough not to have been hit. 
p.259 
   Astronomers are very familiar with the Malmquist bias (named after the Swedish astronomer Gunnar Malmquist, who greatly elaborated upon it in the 1920s).  When astronomers survey stars or galaxies, their telescopes are sensitive only down to a certain brightness.  However, objects that are intrinsically more luminous can be observed to greater distances.  This will create a false trend of increasing average intrinsic brightness with distance, simply because the fainter objects will not be seen. 
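The Malmquist effect is easy to demonstrate with a toy flux-limited survey. All numbers here are invented for illustration (a lognormal luminosity distribution, uniform distances, and a flux cut of 1e-5 in arbitrary units); the apparent brightening with distance is produced entirely by the selection, since intrinsic luminosity was drawn independently of distance:

```python
import random

random.seed(2)

# Each star: (distance, intrinsic luminosity), drawn independently.
stars = [(random.uniform(10, 1000), random.lognormvariate(0, 1))
         for _ in range(50_000)]

FLUX_LIMIT = 1e-5                       # telescope sensitivity (flux = L/d^2)
observed = [(d, L) for d, L in stars if L / d**2 >= FLUX_LIMIT]

def mean_luminosity(sample, lo, hi):
    """Mean intrinsic luminosity of sample stars with lo <= distance < hi."""
    values = [L for d, L in sample if lo <= d < hi]
    return sum(values) / len(values)

near = mean_luminosity(observed, 10, 300)
far = mean_luminosity(observed, 700, 1000)
print(f"mean intrinsic L, near bin: {near:.2f}")
print(f"mean intrinsic L, far bin:  {far:.2f}")
```

The far bin comes out systematically more luminous: faint distant stars fall below the flux limit and never enter the sample, creating the false trend Livio describes.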
p.259 
   Brandon Carter pointed out that we shouldn't take the Copernican principle ── the fact that we are nothing special in the cosmos ── too far.  He reminded astronomers that humans are the ones who make observations of the universe; consequently, they should not be too surprised to discover that the properties of the cosmos are consistent with human existence.  For instance, we could not discover that our universe contains no carbon, since we are carbon-based life-forms.  Initially, most researchers took Carter's anthropic reasoning to be nothing more than a trivially obvious statement. 

   (Brilliant blunders: from Darwin to Einstein ─ colossal mistakes by great scientists that changed our understanding of life and the universe / Mario Livio.,  1. errors, scientific., Q172.5.E77L58  2013, 500─dc23, first Simon & Schuster hardcover edition May 2013, 2013, )
 <---------------------------------------------------------------------------->

  Lobster
  overfishing 
  Salmon run 
  biomass 

  https://en.wikipedia.org/wiki/Selection_bias
 <---------------------------------------------------------------------------->

Dorothy Leonard-Barton, Wellsprings of Knowledge, 1995                      [ ] 

p.167
    ...  Similarly, researchers who analyzed R&D (research and development) investment in 318 U.S. manufacturing firms argued that "the ability to evaluate and utilize outside knowledge is largely a function of the level of prior related knowledge." 71  Therefore, paradoxically, firms that already do some research in a field are best positioned to import related knowledge.  At the same time, owing to the selection biases described in Chapter 2, managers should be encouraged to consider technologies that are not already a core competency.

    (Leonard-Barton, Dorothy, copyright © 1995, HD30.2.L46 1995, 658.4'038——dc20)
(Wellsprings of Knowledge : building and sustaining the sources of innovation / Dorothy Leonard-Barton, 1. information technology——management, 2. information resources management, 3. management information systems, )
 <---------------------------------------------------------------------------->

Charles Perrow, Normal accidents : living with high-risk technologies, 1999 [ ]

p.318
   One important and unintended conclusion that does come from this work is the overriding importance of the context into which the subject puts the problem. 
p.318
The decisions made in these cases were perfectly rational; it was just that the operators were using the wrong contexts. Selecting a context (“this can happen only with a small pipe break in the secondary system”) is a pre-decision act, made without reflection, almost effortlessly, as a part of a stream of experience and mental processing. 
p.318
And defining the context is a much more subtle, self-steering process, influenced by long experience with trials and errors (much as the automatic adjustments made in driving or walking on a busy street). 

p.318
   For example, take the supposedly widespread failure to take the “base rate” into account (all the past events, such as the number of flights where there were no accidents, or the proportion of single people in a community).  
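Why the base rate matters is easiest to see in a short Bayes calculation. The numbers below are illustrative, not from Perrow: a rare failure (1 in 1,000) monitored by an alarm that is "99% accurate" in both directions:

```python
# Illustrative numbers: a rare failure and a 99%-accurate alarm.
base_rate = 0.001          # P(failure) -- the base rate
hit_rate = 0.99            # P(alarm | failure)
false_alarm = 0.01         # P(alarm | no failure)

# Bayes' rule: P(failure | alarm) = P(alarm | failure) P(failure) / P(alarm)
p_alarm = hit_rate * base_rate + false_alarm * (1 - base_rate)
p_failure_given_alarm = hit_rate * base_rate / p_alarm
print(f"P(failure | alarm) = {p_failure_given_alarm:.3f}")  # ~0.090
```

Even with a "99% accurate" alarm, roughly nine out of ten alarms are false, simply because non-failures vastly outnumber failures. Ignoring the base rate (the uneventful flights, in Perrow's example) makes the alarm seem far more informative than it is.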

   ( Normal accidents : living with high-risk technologies / Charles Perrow, 1. industrial accidents., 2. technology--risk assessment., 3. accident., HD7262  P55  1999, 363.1--dc21, 1999,  )
 <---------------------------------------------------------------------------->

Hans Rosling, Factfulness, 2018                                             [ ]

p.124   The deaths I do not see 

p.126
I can save more children if I improve the services outside the hospital.  I am responsible for all the child deaths in this district:  the deaths I do not see just as much as the deaths in front of my eyes. 

p.127
Organizing, supporting, and supervising basic community-based health care that could treat diarrhea, pneumonia, and malaria before they became life-threatening would save more lives ... 

p.127
train village health workers, vaccination, small health facilities that could be reached even by mothers who had to walk.

p.127
hundreds of anonymous dying children I could not see. 

p.127   Ingegerd Rooth
“In the deepest poverty you should never do anything perfectly. If you do you are stealing resources from where they can be better used.”

   (Hans Rosling with Ola Rosling and Anna Rosling Rönnlund, Factfulness : ten reasons we're wrong about the world ── and why things are better than you think, 155.9042  Rosling, 2018, ) 

 <---------------------------------------------------------------------------->
