Friday, November 24, 2017
Galaxy Song - Monty Python Live at the O2 Arena.
It amazes me that we live on a big rock, rotating at 1,037 miles an hour (less
as you move away from the equator), going around the Sun at 67,000
MPH, and moving around the galaxy at unfathomable speed (roughly 500,000 MPH), and all of
this is like a grain of sand on the beach of an immense cosmic ocean
that is forever expanding at an ever-accelerating rate.
However, this video expresses it much better:
https://www.youtube.com/watch?v=fqBThWK8rqE
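As a sanity check on those numbers, here is a little back-of-envelope Python calculation; the equatorial circumference and Earth-Sun distance are round figures I'm assuming here, not values from the post:

```python
import math

equatorial_circumference = 24_901                        # miles, assumed round figure
rotation_speed = equatorial_circumference / 24           # Earth turns once per ~24 hours
orbit_speed = 2 * math.pi * 93_000_000 / (365.25 * 24)   # one ~93-million-mile-radius orbit per year

print(f"rotation: ~{rotation_speed:,.0f} mph, orbit: ~{orbit_speed:,.0f} mph")
# about 1,038 mph and 66,660 mph - close to the 1,037 and 67,000 MPH figures above
```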
Sunday, November 19, 2017
Universe (Copy of Facebook Post.)
Thursday, October 26, 2017
Thursday, September 28, 2017
Quantum Fields: The Real Building Blocks of the Universe - with David Tong
Wednesday, September 13, 2017
The GOLDEN AGE of SCIENCE FICTION Television
Sunday, September 10, 2017
Fwd: Google Alert - telomeres
This is the secret to a longer life - The researchers focused on telomeres, which are proteins found in the cell's nucleus that stabilize the ends of chromosomes. Confused? Let me ...
Saturday, September 9, 2017
What if We Never Went to the Moon?
Tuesday, September 5, 2017
Fwd: Reducing many age related diseases
Researchers at Mayo Clinic's Robert and Arlene Kogod Center on Aging developed the first senolytic drugs to target these harmful cells. In a recent study led by The Scripps Research Institute, Mayo Clinic researchers and others confirmed that the senolytic drugs discovered at Mayo effectively clear senescent cells while leaving normal cells unaffected. The study, which was published in Nature Communications, also describes a new screening platform for finding additional senolytic drugs that will more optimally target senescent cells. The platform, together with additional human cell assays, identified and confirmed a new category of senolytic drugs, which are called HSP90 inhibitors.
https://www.sciencedaily.com/
Friday, August 25, 2017
Tuesday, August 22, 2017
Friday, August 18, 2017
Fields
We are made of stuff. If you break that stuff down to the smallest possible level, you get elementary particles. But according to Quantum Field Theory, what we experience as elementary particles are really just fluctuations in the fields for those particles: a photon is a fluctuation in the electromagnetic field, and an electron is just a fluctuation in the electron field. These fields overlap each other and exist throughout all of space. They also interact with one another, as when the Higgs field gives particles their mass. Reality for us is what we perceive, but the true reality may be just the fields. You could think of them as God's computer program for the Universe.
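As a toy picture only (a classical wave on a grid, not real quantum field theory), here is a small Python sketch of a field defined at every point of a 1-D space, with a localized bump that travels through it - the rough idea of a "particle" as a fluctuation of a field that fills space:

```python
# Toy illustration: a 1-D classical scalar field phi(x, t) with a localized
# Gaussian "bump". The point is only conceptual: what we call a particle is a
# localized excitation of a field that exists everywhere, not a little ball.
import numpy as np

N, dx, dt = 400, 0.1, 0.05           # grid points, spacing, time step (dt < dx keeps it stable)
x = np.arange(N) * dx
phi = np.exp(-((x - 20.0) ** 2))     # initial localized excitation ("particle")
phi_prev = phi.copy()                # field starts at rest

for _ in range(300):                 # leapfrog update of the simple wave equation
    lap = np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)
    phi_next = 2 * phi - phi_prev + (dt / dx) ** 2 * lap
    phi_prev, phi = phi, phi_next

print("field excitation measure:", np.sum(phi ** 2))   # the excitation persists and propagates
```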
Monday, August 14, 2017
The Quantum Experiment that Broke Reality | Space Time | PBS Digital Studios
Friday, August 11, 2017
Monday, August 7, 2017
Re: Top 10 Climate Change Lies Exposed
Another quick rundown of global warming lies. You've mentioned most of these issues in your debates with friends online.
https://youtu.be/ICGal_8qI8c
Wednesday, August 2, 2017
Re: Global Climate
Wednesday, June 21, 2017
Climate Change
Thursday, June 8, 2017
Wednesday, May 3, 2017
Monckton's Mathematical Proof - Climate Sensitivity is Low
In other words, if you argue that the Earth has a low climate sensitivity to CO2, you are also arguing for a low climate sensitivity to other influences such as solar irradiance, orbital changes, and volcanic emissions. In fact, as shown in Figure 1, the climate is less sensitive to changes in solar activity than to greenhouse gases. Thus, when arguing for low climate sensitivity, it becomes difficult to explain past climate changes. For example, between glacial and interglacial periods, the planet's average temperature changes on the order of 6°C (more like 8-10°C in the Antarctic). If the climate sensitivity is low, for example due to increasing low-lying cloud cover reflecting more sunlight as a response to global warming, then how can these large past climate changes be explained?
https://www.skepticalscience.com/climate-sensitivity-advanced.htm
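To make that argument concrete, here is a rough back-of-envelope calculation. The 6°C change comes from the quoted text; the forcing values are commonly cited round numbers I'm assuming here, not figures from the article:

```python
# Back-of-envelope glacial-interglacial sensitivity check.
dT_glacial      = 6.0    # deg C change between glacial and interglacial (from the text above)
dF_glacial      = 7.0    # W/m^2 total forcing change (ice albedo + greenhouse gases), assumed
dF_CO2_doubling = 3.7    # W/m^2 forcing from doubling CO2 (standard value, 5.35 * ln 2)

sensitivity  = dT_glacial / dF_glacial           # deg C per W/m^2
per_doubling = sensitivity * dF_CO2_doubling     # deg C per CO2 doubling

print(f"{sensitivity:.2f} C per W/m^2  ->  {per_doubling:.1f} C per CO2 doubling")
# ~0.86 C per W/m^2 -> ~3.2 C per doubling: hard to reconcile with a "low" sensitivity.
```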
Sunday, April 30, 2017
Climate Sensitivity Reconsidered
Final climate sensitivity
Substituting in Eqn. (1) the revised values derived for the three factors in ΔT_λ, our re-evaluated central estimate of climate sensitivity is their product –
ΔT_λ = ΔF_2x × κ × f ≈ 1.135 × 0.242 × 2.095 ≈ 0.58 K   (30)
Theoretically, empirically, and in the literature that we have extensively cited, each of the values we have chosen as our central estimate is arguably more justifiable – and is certainly no less justifiable – than the substantially higher value selected by the IPCC. Accordingly, it is very likely that in response to a doubling of pre-industrial carbon dioxide concentration T_S will rise not by the 3.26 K suggested by the IPCC, but by <1 K.
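As a quick check, here is the arithmetic of Eqn. (30) exactly as quoted above, with the IPCC's 3.26 K figure taken only as stated:

```python
# Arithmetic check of Eqn. (30) using the paper's three revised factors.
dF_2x = 1.135   # revised CO2 forcing factor (paper's value)
kappa = 0.242   # revised base climate sensitivity parameter (paper's value)
f     = 2.095   # revised feedback multiplier (paper's value)

dT_lambda = dF_2x * kappa * f
print(f"Revised climate sensitivity: {dT_lambda:.2f} K")   # ~0.58 K, vs the 3.26 K the paper attributes to the IPCC
```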
Discussion
We have set out and then critically examined a detailed account of the IPCC's method of evaluating climate sensitivity. We have made explicit the identities, interrelations, and values of the key variables, many of which the IPCC does not explicitly describe or quantify. The IPCC's method does not provide a secure basis for policy-relevant conclusions. We now summarize some of its defects.
The IPCC's methodology relies unduly – indeed, almost exclusively – upon numerical analysis, even where the outputs of the models upon which it so heavily relies are manifestly and significantly at variance with theory or observation or both. Modeled projections such as those upon which the IPCC's entire case rests have long been proven impossible when applied to mathematically-chaotic objects, such as the climate, whose initial state can never be determined to a sufficient precision. For a similar reason, those of the IPCC's conclusions that are founded on probability distributions in the chaotic climate object are unsafe.
Not one of the key variables necessary to any reliable evaluation of climate sensitivity can be measured empirically. The IPCC's presentation of its principal conclusions as though they were near-certain is accordingly unjustifiable. We cannot even measure mean global surface temperature anomalies to within a factor of 2; and the IPCC's reliance upon mean global temperatures, even if they could be correctly evaluated, itself introduces substantial errors in its evaluation of climate sensitivity.
The IPCC overstates the radiative forcing caused by increased CO2 concentration at least threefold because the models upon which it relies have been programmed fundamentally to misunderstand the difference between tropical and extra-tropical climates, and to apply global averages that lead to error.
The IPCC overstates the value of the base climate sensitivity parameter for a similar reason. Indeed, its methodology would in effect repeal the fundamental equation of radiative transfer (Eqn. 18), yielding the impossible result that at every level of the atmosphere ever-smaller forcings would induce ever-greater temperature increases, even in the absence of any temperature feedbacks.
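For reference, the base climate sensitivity parameter κ in the simplest zero-feedback picture comes from differentiating the Stefan-Boltzmann law; the sketch below is that standard textbook relation, not the paper's Eqn. 18 itself:

```python
# Zero-feedback sensitivity parameter: differentiate F = sigma * T^4
# to get dT/dF = 1 / (4 * sigma * T^3).
SIGMA = 5.670374419e-8            # W m^-2 K^-4, Stefan-Boltzmann constant

def kappa(T: float) -> float:
    """Zero-feedback sensitivity parameter, in K per W/m^2, at emission temperature T."""
    return 1.0 / (4.0 * SIGMA * T ** 3)

print(f"kappa at 255 K (effective emission temperature): {kappa(255):.2f} K per W/m^2")
# ~0.27 K per W/m^2, the same order as the revised 0.242 used in Eqn. (30) above.
```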
The IPCC overstates temperature feedbacks to such an extent that the sum of the high-end values that it has now, for the first time, quantified would cross the instability threshold in the Bode feedback equation and induce a runaway greenhouse effect that has not occurred even in geological times despite CO2 concentrations almost 20 times today's, and temperatures up to 7 °C higher than today's.
The Bode equation, furthermore, is of questionable utility because it was not designed to model feedbacks in non-linear objects such as the climate. The IPCC's quantification of temperature feedbacks is, accordingly, inherently unreliable. It may even be that, as Lindzen (2001) and Spencer (2007) have argued, feedbacks are net-negative, though a more cautious assumption has been made in this paper.
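For reference, the "Bode feedback equation" referred to above, in its simplest general form (a minimal sketch of the standard gain formula, not the paper's or the IPCC's exact parameterization):

```python
# Closed-loop response = open-loop response * f, with f = 1 / (1 - g),
# where g is the loop gain (the sum of the feedbacks times kappa).
# As g approaches 1, f blows up - the "instability threshold" mentioned above.
def feedback_multiplier(loop_gain: float) -> float:
    if loop_gain >= 1.0:
        raise ValueError("loop gain >= 1: runaway (unstable) feedback")
    return 1.0 / (1.0 - loop_gain)

for g in (0.0, 0.3, 0.6, 0.9, 0.99):
    print(f"g = {g:4.2f}  ->  f = {feedback_multiplier(g):6.2f}")
```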
It is of no little significance that the IPCC's value for the coefficient in the CO2 forcing equation depends on only one paper in the literature; that its values for the feedbacks that it believes account for two-thirds of humankind's effect on global temperatures are likewise taken from only one paper; and that its implicit value of the crucial parameter κ depends upon only two papers, one of which had been written by a lead author of the chapter in question, and neither of which provides any theoretical or empirical justification for a value as high as that which the IPCC adopted.
The IPCC has not drawn on thousands of published, peer-reviewed papers to support its central estimates for the variables from which climate sensitivity is calculated, but on a handful.
On this brief analysis, it seems that no great reliance can be placed upon the IPCC's central estimates of climate sensitivity, still less on its high-end estimates. The IPCC's assessments, in their current state, cannot be said to be "policy-relevant". They provide no justification for taking the very costly and drastic actions advocated in some circles to mitigate "global warming", which Eqn. (30) suggests will be small (<1 °C at CO2 doubling), harmless, and beneficial.
Conclusion
Even if temperature had risen above natural variability, the recent solar Grand Maximum may have been chiefly responsible. Even if the sun were not chiefly to blame for the past half-century's warming, the IPCC has not demonstrated that, since CO2 occupies only one-ten-thousandth part more of the atmosphere than it did in 1750, it has contributed more than a small fraction of the warming. Even if carbon dioxide were chiefly responsible for the warming that ceased in 1998 and may not resume until 2015, the distinctive, projected fingerprint of anthropogenic "greenhouse-gas" warming is entirely absent from the observed record. Even if the fingerprint were present, computer models have long been proven inherently incapable of providing projections of the future state of the climate that are sound enough for policymaking. Even if per impossibile the models could ever become reliable, the present paper demonstrates that it is not at all likely that the world will warm as much as the IPCC imagines. Even if the world were to warm that much, the overwhelming majority of the scientific, peer-reviewed literature does not predict that catastrophe would ensue. Even if catastrophe might ensue, even the most drastic proposals to mitigate future climate change by reducing emissions of carbon dioxide would make very little difference to the climate. Even if mitigation were likely to be effective, it would do more harm than good: already millions face starvation as the dash for biofuels takes agricultural land out of essential food production: a warning that taking precautions, "just in case", can do untold harm unless there is a sound, scientific basis for them. Finally, even if mitigation might do more good than harm, adaptation as (and if) necessary would be far more cost-effective and less likely to be harmful.
In short, we must get the science right, or we shall get the policy wrong.
https://www.aps.org/units/fps/newsletters/200807/monckton.cfm
Friday, January 27, 2017
Metallic hydrogen
To create it, Silvera and Dias squeezed a tiny hydrogen sample at 495 gigapascals, or more than 71.7 million pounds per square inch - greater than the pressure at the center of the Earth. At those extreme pressures, Silvera explained, solid molecular hydrogen - which consists of molecules on the lattice sites of the solid - breaks down, and the tightly bound molecules dissociate into atomic, metallic hydrogen. Theoretical predictions suggest metallic hydrogen could act as a superconductor at room temperature.
Among the holy grails of physics, a room temperature superconductor, Dias said, could radically change our transportation system, making magnetic levitation of high-speed trains possible, as well as making electric cars more efficient and improving the performance of many electronic devices.
The material could also provide major improvements in energy production and storage - because superconductors have zero resistance, energy could be stored by maintaining currents in superconducting coils, and then be used when needed.
"It takes a tremendous amount of energy to make metallic hydrogen," Silvera explained. "And if you convert it back to molecular hydrogen, all that energy is released, so it would make it the most powerful rocket propellant known to man, and could revolutionize rocketry."
The most powerful fuels in use today are characterized by a "specific impulse" - a measure, in seconds, of how fast a propellant is fired from the back of a rocket - of 450 seconds. The specific impulse for metallic hydrogen, by comparison, is theorized to be 1,700 seconds.
"That would easily allow you to explore the outer planets," Silvera said. "We would be able to put rockets into orbit with only one stage, versus two, and could send up larger payloads, so it could be very important."
https://www.google.com/amp/s/p
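To see why that specific-impulse difference is such a big deal, here is a rough calculation using the standard Tsiolkovsky rocket equation; the mass ratio of 10 is just an assumed example, not a figure from the article:

```python
# Ideal delta-v from specific impulse and mass ratio (Tsiolkovsky rocket equation).
import math

g0 = 9.80665                      # m/s^2, standard gravity

def delta_v(isp_seconds: float, mass_ratio: float) -> float:
    """Ideal delta-v (m/s) for a given specific impulse and initial/final mass ratio."""
    return isp_seconds * g0 * math.log(mass_ratio)

for isp in (450.0, 1700.0):       # today's best chemical fuels vs theorized metallic hydrogen (values quoted above)
    print(f"Isp {isp:6.0f} s  ->  delta-v {delta_v(isp, 10.0)/1000:5.1f} km/s (mass ratio 10)")
# ~10.2 km/s vs ~38.4 km/s: the same rocket could reach far higher speeds or lift much larger payloads.
```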