Monday, November 20, 2017

1997... artificial intelligence... nanotech. The greater goal behind using a supercomputer to perfect chess might have been to make better computers.

They would need a computer powerful enough to go through the myriad of combinations, which is why they used a supercomputer. That was 20 years ago. You might not think nanotech exists; it does. They tried to attack me with some.



Synopsis

The book emphasizes Silver's skill, which is the practical art of building mathematical models using probability and statistics. Silver takes a big-picture approach to statistical tools, combining unique sources of data (e.g., timing a minor-league ballplayer's fastball with a radar gun) with historical data and principles of sound statistical analysis, principles that many pollsters and pundits with prominent media roles nonetheless violate. The book includes richly detailed case studies from baseball, elections, climate change, the financial crash, poker, and weather forecasting, and these topics illustrate different statistical principles. Weather forecasting, for example, introduces the idea of "calibration": how well forecast probabilities fit actual weather outcomes.

Much of the book concerns the need for better expressions of uncertainty in all statistical statements, reflecting ranges of probable outcomes rather than single "point estimates" such as averages. Silver would like the media to move away from vague terminology like "Obama has an edge in Ohio" or "Florida still a toss-up state" toward probability statements such as "the probability of Obama winning the electoral college is 83%, while his expected share of the popular vote is now 50.1% with an error range of ±2%". Such statements give odds on outcomes, including a 17% chance of Romney winning the electoral college. The popular-vote shares are likewise ranges that include outcomes in which Romney gets the most votes; what is highly probable is that the shares fall within these ranges, while whose share is highest is another probability question with closer odds. It is then up to the consumer of such statements to use the information as best they can in dealing with an uncertain future in an age of information overload. That last idea frames Silver's entire narrative and motivates his pedagogical mission.
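Calibration in this sense is easy to check mechanically. The short Python sketch below, using made-up forecast data rather than anything from the book, groups probabilistic forecasts by their stated probability and compares each group's stated chance with how often the event actually occurred; a well-calibrated forecaster's 70% calls come true roughly 70% of the time.

    # A minimal sketch of forecast calibration: bin probabilistic
    # forecasts and compare each bin's stated probability with the
    # observed frequency of the event. All numbers are illustrative.

    forecasts = [0.1, 0.2, 0.2, 0.3, 0.7, 0.7, 0.8, 0.9, 0.9, 0.9]
    outcomes  = [0,   0,   1,   0,   1,   0,   1,   1,   1,   1]  # 1 = event occurred

    bins = {}
    for p, o in zip(forecasts, outcomes):
        b = round(p, 1)                  # group forecasts by stated probability
        bins.setdefault(b, []).append(o)

    for b in sorted(bins):
        observed = sum(bins[b]) / len(bins[b])
        print(f"forecast {b:.0%}: event occurred {observed:.0%} of the time "
              f"(n={len(bins[b])})")

With enough forecasts in each bin, large gaps between the stated and observed columns are the "miscalibration" Silver faults pundits for.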
Silver rejects much of the ideology taught alongside statistical method in colleges and universities today, specifically the "frequentist" approach of Ronald Fisher, originator of many classical statistical tests and methods. The problem Silver finds is a belief in perfect experimental, survey, or other designs, when data often come from a variety of sources and idealized modeling assumptions rarely hold true. Such models often reduce complex questions to overly simple "hypothesis tests" using arbitrary "significance levels" to "accept or reject" a single parameter value. In contrast, the practical statistician first needs a sound understanding of how baseball, poker, elections, or other uncertain processes work, which measures are reliable and which are not, and what scales of aggregation are useful, and only then applies the statistical tool kit as well as possible.

Silver believes in the need for extensive data sets, preferably collected over long periods of time, from which statistical techniques can incrementally shift probabilities up or down relative to prior data. This "Bayesian" approach is named for the 18th-century minister Thomas Bayes, who discovered a simple formula for updating probabilities using new data. For Silver, the well-known method needs revitalizing as a broader paradigm for thinking about uncertainty, founded on learning and understanding gained incrementally rather than through any single set of observations or an ideal model summarized by a few key parameters. Part of that learning is the informal process of changing assumptions or the modeling approach, in the spirit of a craft whose goal is to devise the best betting odds on well-defined future events and their outcomes.
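Bayes' formula itself is compact enough to show directly. The Python sketch below updates a prior probability as each new observation arrives, in the incremental spirit Silver describes; the poker scenario and its likelihood numbers are assumptions made up for this example, not figures from the book.

    # A minimal sketch of Bayesian updating: start from a prior
    # probability and revise it with each new observation via
    # P(H | D) = P(D | H) * P(H) / [P(D | H) * P(H) + P(D | not H) * P(not H)].

    def bayes_update(prior, p_data_if_true, p_data_if_false):
        """Return P(hypothesis | data) by Bayes' formula."""
        numerator = p_data_if_true * prior
        evidence = numerator + p_data_if_false * (1 - prior)
        return numerator / evidence

    # Illustrative question: is a poker opponent bluffing? Prior: 20%.
    prob = 0.20
    for raise_number in range(1, 4):     # three large raises in a row
        # assume a bluffer raises big 70% of the time, an honest hand 30%
        prob = bayes_update(prob, 0.7, 0.3)
        print(f"after raise {raise_number}: P(bluff) = {prob:.1%}")

Each pass through the loop treats yesterday's posterior as today's prior, which is exactly the incremental accumulation of evidence Silver wants to see replace one-shot hypothesis tests.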

Criticism and commentary

Climate scientist Michael E. Mann criticized the book for analyzing the "hard science" physical phenomena of climate trends with the same approach as used to analyze the social phenomena of voter preferences, which he characterized as "laden with subjective and untestable assumptions".[20]
Murray Cantor, IBM Distinguished Engineer, wrote: "Nate Silver's The Signal and the Noise is an excellent description of how prediction works. However, he purposefully leaves out the mathematics... In 2012, after his triumph of predicting the outcome of the last two presidential elections and selling his 'fivethirtyeight' blog to the New York Times, Nate Silver accomplished what is almost impossible. In his recent book The Signal and the Noise, he correctly describes the discipline of making predictions without explicitly invoking the math. He accomplishes this feat even though the prediction methods he describes require more than one kind of mathematics. By leaving out the math, he has reached a broad audience with a compelling book with lots of examples."
