Saturday 19 August 2006

Myths and misconceptions about the Big Bang

The Angry Astronomer has a post listing some common misconceptions about the Big Bang.

I'll give you the list; go see the Astronomer for the reason.

1) The Big Bang was an explosion
2) The Big Bang theory doesn’t explain what caused it
3) There’s no evidence for the Big Bang
4) The Big Bang doesn’t leave room for God

On the last, I'd say the Big Bang is utterly irrelevant to questions of theism or atheism, which is essentially the point AA is making.

LINK: The Big Bang - common misconceptions - The Angry Astronomer [Hat tip Stephen Hicks]

RELATED: Science, Religion, History

14 comments:

Berend de Boer said...

I'll give a few more.

e. The Big Bang theory hasn't made a single correct prediction.
f. There is no observational evidence for the Big Bang; quite the opposite, actually.
g. The Big Bang won't last the next 15 years.

Anonymous said...

Bernard Darnton said...
[Cosmic microwave background. Predicted in 1948 by George Gamow, observed in 1965 by Penzias and Wilson, observed in ludicrously precise detail by COBE project in 1992.]

I completely agree with Bernard here. Penzias & Wilson won a Nobel Prize for this discovery. It was an accident: they were not looking for the cosmic microwave background, but stumbled across some unknown noise in their experiment. They varied the setup, but the noise was everywhere, regardless of which direction the antenna was pointed in the sky. They varied the time of measurement (day & night, season of the year), and still the unexplained noise appeared. When they consulted some theoretical physicists (Penzias & Wilson were experimentalists, not theorists) about the unexplained noise, the advice they got was to look in the papers of George Gamow for the answer. When Penzias & Wilson dug those papers out of the literature and applied Gamow's model to their data: BANG. The predicted microwave radiation at about 2.7 kelvin matched within statistical error. This cosmic microwave radiation, left over from the BIG BANG, is even useful as a reference: since everything moves relative to everything else in the expanding universe, and the radiation is everywhere in space, it provides a backdrop against which the motions of stars and planets can be measured.
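Incidentally, the reason radiation at 2.7 kelvin shows up as microwaves falls straight out of Wien's displacement law. A minimal sketch, using the standard published values for the constant and the measured CMB temperature:

```python
# Wien's displacement law: a blackbody at temperature T has its
# emission peak at wavelength lambda_max = b / T.
b = 2.898e-3   # Wien's displacement constant, metre-kelvin
T = 2.725      # measured CMB temperature, kelvin

lambda_max = b / T
print(f"peak wavelength: {lambda_max * 1000:.2f} mm")  # about 1.06 mm, i.e. microwaves
```

A peak around a millimetre sits squarely in the microwave band, which is why a radio antenna picked it up as "noise".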

Anonymous said...

After COBE, the Wilkinson Microwave Anisotropy Probe was launched and gave even more precise measurements of the background radiation.

In 1998 it was discovered that the expansion of the universe is accelerating, by measuring the distances of galaxies via the relative luminosity of Type Ia supernovae in those galaxies. These are referred to as "standard candles" because they all have the same luminosity: a Type Ia supernova is caused by a binary star system in which a neutron star draws gas from its nearby red giant companion. Because a neutron star is the remnant of a previous supernova, it has no fuel remaining, so it is a solid mass of iron about 20 km in diameter. As soon as the mass of gas drawn off the red giant reaches about 8 times the mass of the sun around the iron core, the gravitational collapse is so violent that a supernova occurs (remember, there is no nuclear fusion happening, because the neutron-star core's fuel is spent iron). This means the energy released is always the same and is predictable using computer modelling, hence the term standard candle.

From these observations it was deduced that 73 percent of the universe is made of a gravitationally repulsive component referred to as dark energy, 22 percent of dark, unobserved matter, and 5 percent of visible stars and galaxies. From this, a very precise prediction was made of what the background radiation should be, which was confirmed by the much more accurate Wilkinson probe.
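The standard-candle reasoning above boils down to the inverse-square law, usually written as the distance modulus m - M = 5·log10(d / 10 pc). A minimal sketch; the apparent magnitude used is illustrative, while -19.3 is the commonly quoted peak absolute magnitude for a Type Ia:

```python
import math

def standard_candle_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus
    m - M = 5 * log10(d / 10 pc), solved for d."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A Type Ia supernova peaks near absolute magnitude -19.3, so observing
# one at apparent magnitude 24 (illustrative) places its host galaxy
# billions of parsecs away.
d = standard_candle_distance_pc(apparent_mag=24.0, absolute_mag=-19.3)
print(f"{d:.2e} parsecs")  # a few gigaparsecs: a cosmological distance
```

Comparing distances obtained this way against the redshifts of the host galaxies is what revealed the acceleration.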

Anonymous said...

Actually, I made a mistake: a Type Ia supernova is a red giant and white dwarf binary, both in the process of running out of hydrogen fuel but at different stages, and both under the lower mass limit for becoming a supernova in their own right, which is eight solar masses. When a binary star first forms from an accretion disk, the mass at the centre collapses into two stars; this is very common, as 83 percent of stars are binary stars.

When stars of less than eight solar masses run out of fuel, they expand into red giants as the fusion reaction forming helium from hydrogen moves out from the centre, then collapse much less violently than supernovae, because there is not enough heat and pressure to fuse much heavier elements.

The red giant and white dwarf are just at different stages in this process, because differences in mass at the initial formation of the two stars cause the more massive one to expend its fuel quicker and become a white dwarf sooner.

Berend de Boer said...

Ha, the cosmic background radiation prediction. A classic case for whoever studies the history of science.

Berend de Boer said...

And by the way, there is no uniform cosmic background radiation. It's a fairly local phenomenon.

But that doesn't matter, because the Big Bang is a religion. Never try to argue facts with adherents of a "scientific" religion.

Anonymous said...

Berend de Boer said...
[Ha, the cosmic background radiation prediction.]

Are you saying that the 'cosmic background radiation prediction' specifically is bullshit, or are you implying that all the predictions & discoveries in physics are bullshit as well? If the latter, then I would like to run you through the many physics predictions that were made far in advance of the experimental observations that confirmed them.

The longest to date was BEC, or Bose-Einstein condensation, which was predicted in papers Einstein published in 1924-25; the first confirmation of it was observed in 1995, by the 2001 Physics Nobel Prize winner, Professor Carl E. Wieman. I attended 2 of his 3 public lectures at Auckland Uni last year (2005). We (the Otago University Physics Department) were the first in the southern hemisphere to reproduce BEC, in 1998. Well done, Otago.

BEC is expected to revolutionise the manufacturing of ICs (integrated circuits) and also nano-technology. That means semiconductor devices could be reduced in size, consume less power, conduct better, and so on. This implies, Berend, that a computer chip produced with BEC-based semiconductor devices could be much faster than what is currently available today. Physicists are still working on it, but when it materialises, consumers will have better electronic products for home or general use. Peter Cresswell has blogged about Professor Carl E. Wieman's lectures, and maybe he can point you to those links.

My point here: if you bash 'physics predictions', then explain why you do so, or else point to the literature in which a published paper has refuted such predictions.

"Nobel Prize Winner to Visit Otago Physics Colleagues"
http://www.otago.ac.nz/news/news/2005/17-10-05_press_release.html

"Otago Ultra Cold Atom Experiments"
http://www.physics.otago.ac.nz/research/uca/Experiments.html

Berend de Boer said...
[A classic case for whoever studies the history of science.]

By that I mean: does your information rely on historians & journalists rather than physicists? My God, you might be one of those who read the Herald business section and endorse the views of those anti-business journalists & writers. Remember Peter Nowak? His anti-Telecom writing polarised the public into believing his misinformation about unbundling Telecom's lines, which in his words "will solve the problem of high-volume & high-speed data transmission". Well, figure it out. This misunderstanding of the nature of transmission lines came from WHO? Of course, a journalist.

Just be careful with your physics-prediction bashing, as your computer is using a laser to read & write data, which is what enables you to email this blog site. Stimulated emission, the process behind the laser, was predicted by Einstein in a paper he published in 1916, and the first working demonstration that such a process (population inversion) does indeed exist came with the maser in the 1950s from Charles H. Townes, who later won a Nobel Prize for it. The laser itself was born around 1960, and most technologies around us today depend on lasers, from medical applications to telecommunications (fibre optics, etc.), industrial manufacturing, and consumer goods. I could quote more physics predictions that have been confirmed by experiment, but perhaps the 2 I have mentioned are enough.

"Laser - Population Inversion"
http://www.colorado.edu/physics/2000/lasers/lasers3.html

Berend de Boer said...

Falafulu, I was specific, wasn't I? So "specifically" is the answer.

Anonymous said...

CMB localised??? The WMAP mission has mapped a fairly massive chunk of sky and has never found any piece of sky that doesn't have CMB radiation. In fact I think it has mapped the whole sky in order to find repeating patterns (these would indicate a house-of-mirrors effect if the universe were closed, kinda like Pac-Man: go out one end and in the other). The WMAP probe sits at Lagrange point two, a stable orbital point out beyond the Moon where centrifugal force balances the gravity of the Sun, Earth, and Moon; a much better vantage point from which to observe the entire sky. But maybe some people prefer the incoherent writings of Bronze Age goat herders.

Anonymous said...

Berend de Boer said...
[So "specifically" is the answer.]

So, in that case I would suggest that you lobby the Nobel Prize committee to withdraw the award given to Prof. Penzias & Prof. Wilson for their discovery of the cosmic background radiation. Yes, a Nobel Prize can be withdrawn if the committee has a good reason.

There is a similar call now for Günter Grass, a Nobel Prize winner for Literature, to give his up, because it was revealed that he had served in the SS.

"Grass should give up Nobel Prize over SS past"
http://enjoyment.independent.co.uk/books/news/article1219250.ece

Now, before you submit your application to the Nobel committee, you must collect all the evidence from peer-reviewed physics publications showing that the 'cosmic background radiation' is bullshit. You will also have to collect the signatures of those journalists & historians, oops, sorry, the signatures of physicists who have published refutations of the existence of the 'cosmic background radiation'.

Berend, please keep me informed about the progress of your application to withdraw the Nobel Prize from Prof. Penzias & Prof. Wilson.

Cheers.

Anonymous said...

Berend de Boer said...
[But that doesn't matter because Big Bang is a religion. Never try to argue facts with adherents of a "scientific" religion.]

The evidence is overwhelming, but I will not argue with you on that, because the Big Bang is not a religion but fact. So, if you disagree, then show your evidence to counter the Big Bang claim.

Berend de Boer said...
[The Big Bang won't last the next 15 years.]

I am sorry, but you are going to be wrong again here. There are current projects in Europe (CERN) and the US to improve particle accelerator technology so that precise measurements can be made of the particle (nuclear) collisions typical of the early universe, i.e. a few (milli)seconds just after the Big Bang. So in 15 years' time the answers to the Big Bang's remaining unsolved puzzles (apart from the ones that have already been observed) will be more precise, which will be bad news for you. I say the Big Bang will last more than 15 years. In short, it will last until the day Jesus Christ comes back for 'Judgement Day'. WHY? Because the Big Bang is a law of nature (physics), and as long as the material universe exists, the Big Bang exists too. The material universe will not exist after the second coming of Jesus Christ (Judgement Day), and that means all the laws of physics cease to exist, including the Big Bang.

There are two pillars of physics: the 'General Theory of Relativity' (GTR), developed by Einstein, from which the Big Bang derives, and 'Quantum Mechanics' (QM). Modern electronic gadgets and semiconductor-based devices are built using QM principles; almost every electronic gadget on the planet contains a semiconductor device. Current research is trying to combine the two in an emerging field called 'Quantum Gravity' (QG). GTR & QM are inconsistent at certain scales, but QG unites them into a consistent framework. Where does Stephen Hawking fit in? He is a QG theoretical physicist, with one foot in QM and one foot in GTR, and of course a reputable leading researcher in the field because of his published work.

But then, if you wait long enough for Jesus's second coming, your wish might be granted, because Jesus is the boss of the 'laws of physics': he can break any of them, since he wrote them into the 'constitutional universal laws'. At your request, he (Jesus) might cross out the Big Bang according to your wish, and then 'ACT 1', 'SECTION g', 'CLAUSE 2' of the 'constitutional universal laws' is hereby deleted.

Berend de Boer said...

Fantastic guys, all good comments on the big bang. I'm gonna save that.

And I suggest you start reading actual physics journals instead of your high school text books.

Anonymous said...

Berend de Boer said...
[And I suggest you start reading actual physics journals instead of your high school text books.]

Yep, I do occasionally read papers from various physics journals, usually to find out how to solve a particular algorithm. I read just about every journal containing an equation (algorithm) available from the university library: various physics review letters, numerical analysis, matrix algebra, psychophysics (psychology), image processing, signal processing, machine learning, data mining, computational economics, computational finance, pattern recognition, computer vision, statistics, bioinformatics, computer graphics, operations research & optimisation, wavelets, neural information processing, artificial intelligence, multimedia recognition, soft computing, IEEE control systems, intelligent information retrieval, and many, many more. I am not kidding, Berend. I don't read page to page, only the articles of interest in a specific volume; there might be one article in a whole volume of a given journal that I read, the rest being uninteresting to me. I don't read for the sake of reading, but once you read one paper for an algorithm of interest, it refers to other papers from other journals, listed at the end. Sometimes you have to jump from the first article to a second one from a different journal, because you can't get past the first few paragraphs without understanding a term (or algorithm) you've come across; the first article is suspended while you dig into the second for that term. If the concepts are new, you end up reading about 7 to 10 different articles (a chain reaction) in order to understand the algorithms (concepts) in the first article.

In the field of technical computing I have to do this amount of reading because of the type of contract work I do: the algorithms involved are non-conventional (i.e. the algorithms I develop are slow to appear in bound books; it may take 5 or more years from their publication in peer review before they show up in a bound book). If a software developer is using algorithms from a bound book, they are old ones. Nothing is wrong with old algorithms, but the latest improvements to those same algorithms will have been available at the same time in peer-reviewed papers. Why develop a search algorithm published 10 years ago in peer review, which has only just come out in a bound book, instead of the much-improved (in speed & accuracy) version of the same algorithm available in peer review?

The material in advanced textbooks comes from physics journals, but there is a delay. For example, peer-reviewed papers may appear this year (2005) and you won't see them in any textbook until perhaps 5 years or more after the date of publication. This sort of delay happens in other disciplines as well, not just physics.

There are a lot of advanced computer algorithms which require a deep understanding of physics, and that is why I read papers in certain physics publications. I go to Auckland Uni once a week (2 hours at most) to browse through the 'display section' (for new books & journals) at the Engineering library or the science section of the main library, and take note of anything that might be of interest to my software development in the future. If I develop something, get stuck, and Google doesn't help, but I remember having seen such an algorithm in the library, I can look it up (journal title, year of publication, author/s, call number) in my compiled list of 'interesting papers'. I know exactly where to get a given periodical or a specific volume of a journal, since I already have the call number.

For example, I was scoping a solution to a problem in automated product classification (millions of products) for a project I was involved in, where current & traditional methods would have had high rates of misclassification error. Such a huge number of products can't be manipulated manually: it would be error-prone, require enormous manpower, and be too costly. I fired off some emails to a search-engine research community mailing list describing the problem, and was alerted to the paper shown below. I remembered having seen the title of the article at the Engineering School library, but I had never taken note of it, as the wording of the title looked uninteresting. When I downloaded the paper, I realised I needed to dig into my 'Thermodynamics' & 'Statistical Mechanics' textbooks, which I still keep, for hints on how to code the complex model described in the paper. The paper uses models (kernels) from thermodynamics (heat flow, or heat diffusion) for a text search engine. It is impossible to read this paper without physics knowledge.

Now I have working prototype code for this model, and I need to test its performance & accuracy on a large corpus of text data, which can be downloaded for free from the National Institute of Standards & Technology (http://trec.nist.gov/data.html).

"Diffusion Kernels on Statistical Manifolds"
http://jmlr.csail.mit.edu/papers/volume6/lafferty05a/lafferty05a.pdf
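For what it's worth, the closed-form approximation at the heart of that paper is simple enough to sketch. Treating two documents as word-frequency distributions p and q on the multinomial simplex, the geodesic (Fisher information) distance is d(p, q) = 2·arccos(Σ √(p_i·q_i)), and the heat kernel is approximated by exp(-d²/4t). A minimal sketch; the toy distributions and the diffusion time t are illustrative:

```python
import math

def diffusion_kernel(p, q, t=0.5):
    """Approximate heat (diffusion) kernel between two word distributions
    p and q on the multinomial simplex:
        K(p, q) ~ exp(-d(p, q)**2 / (4 * t)),
    where d(p, q) = 2 * arccos(sum_i sqrt(p_i * q_i)) is the geodesic
    distance under the Fisher information metric."""
    # Bhattacharyya coefficient between the two distributions
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    bc = min(bc, 1.0)  # guard against floating-point overshoot past 1
    d = 2 * math.acos(bc)
    return math.exp(-d * d / (4 * t))

doc_a = [0.5, 0.3, 0.2]  # toy word-frequency vectors
doc_b = [0.4, 0.4, 0.2]
print(diffusion_kernel(doc_a, doc_a))  # identical documents: similarity 1.0
print(diffusion_kernel(doc_a, doc_b))  # similar documents: a bit below 1.0
```

Plugging a kernel like this into ranking or classification in place of the usual cosine similarity is essentially what the paper evaluates.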

Since I did my training in physics but turned software developer, my background probably helps me a lot in doing what I do: I am able to decipher a complex model (or algorithm) and then turn it into workable software code. I have developed my own Java library for computational finance, and I regularly add new algorithm code from econophysics. I do this in my free time, because I am eyeing a software start-up to develop high-end applications for analysis & simulation of the financial markets using my own finance library. Econophysics is the application of physics to economics. It is a new discipline (perhaps 8 years old), and it appears to have solved some problems in traditional (standard) economic theory. Applying physics to economics is not new; however, it is only recently that physicists have turned their knowledge toward economics. Professor Robert Merton and Professor Myron Scholes shared the Nobel Prize for economics in 1997 for their (& Fischer Black's) effort in developing the 'Black-Scholes' model of the financial markets; their paper was published in 1973. Black-Scholes models stock price movements as Brownian motion (whose theory Einstein developed), and the solutions are obtained via the heat (diffusion) equation of thermodynamics. Black-Scholes has kernels similar to those of the 'search engine' algorithm shown in the previous link.

“Merton-Scholes Nobel Prize in Economics”
http://nobelprize.org/economics/laureates/1997/press.html

“Black-Scholes : Using (Heat) Diffusion Equation”
http://www.physics.uci.edu/~silverma/bseqn/bs/
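As a concrete taste of the model, the closed-form Black-Scholes price of a European call option, which is the solution of that heat-type PDE, fits in a few lines. A minimal sketch; the inputs in the example are illustrative:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function, via erf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualised volatility."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call, one year out, 5% rate, 20% volatility
price = black_scholes_call(S=100, K=100, T=1, r=0.05, sigma=0.2)
print(f"{price:.2f}")  # about 10.45
```

The exp(-d²/...) structure hiding inside the normal CDF is the same Gaussian heat kernel that turns up in the diffusion-kernel search paper above.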

Econophysics is an exciting field that I am following with interest. The Oxford Centre for Computational Finance has some interesting algorithms & models in econophysics, and they have developed working software that seems to defy established economic theory. I wouldn't be surprised if they win a Nobel Prize for their model in the future; what they are doing is revolutionary and non-conventional in economics. The well-known 'sum-over-histories' formulation of quantum mechanics by Richard Feynman, built on path integrals and statistical mechanics, has been successfully applied to the analysis of financial markets. The following are interesting articles on physics & finance.

"Quantum physics meets classical finance"
http://www.physics.princeton.edu/www/jh/finance_and_physics_article_1.pdf

"Is Economics the Next Physical Science?"
http://www.physics.princeton.edu/www/jh/finance_and_physics_article_2.pdf

"Econophysics Hub"
http://www.moneyscience.org/tiki/tiki-index.php?page=Econophysics+Hub

"Econophysics Research: Publications"
http://polymer.bu.edu/~hes/econophysics/

"Econophysics Forum"
http://www.unifr.ch/econophysics/

"econophysics.org"
http://www.ge.infm.it/~ecph/index.php

Hine Te Po said...

"Falafulu Fisi said..."

Yes, but do you know how to boil an egg without a big bang?