General: This forum is for non-technical general discussion, which can include both Linux and non-Linux topics. Have fun!
The bottom line is that the so-called Hubble Tension between what happens in the nearby Universe compared to the early Universe’s expansion remains a nagging puzzle for cosmologists. There may be something woven into the fabric of space that we don’t yet understand.
Does resolving this discrepancy require new physics? Or is it a result of measurement errors between the two different methods used to determine the rate of expansion of space?
Hubble and Webb have now tag-teamed to produce definitive measurements, furthering the case that something else — not measurement errors — is influencing the expansion rate.
“With measurement errors negated, what remains is the real and exciting possibility that we have misunderstood the Universe,” said Adam Riess, a physicist at Johns Hopkins University in Baltimore. Riess holds a Nobel Prize for co-discovering the fact that the Universe’s expansion is accelerating, owing to a mysterious phenomenon now called ‘dark energy’.
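To put a number on the tension: the two headline measurements of the Hubble constant give different "Hubble times" (the rough age estimate 1/H0). A minimal sketch, using the commonly quoted figures of ~67.4 km/s/Mpc from the early-Universe CMB fit (Planck) and ~73.0 km/s/Mpc from the local Cepheid/supernova distance ladder (SH0ES):

```python
# Illustrative only: compare the Hubble time 1/H0 implied by the two
# headline measurements of the Hubble constant. The H0 values are the
# commonly quoted approximate figures, not definitive numbers.

KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one gigayear

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return 1/H0 in gigayears for H0 given in km/s/Mpc."""
    seconds = KM_PER_MPC / h0_km_s_mpc
    return seconds / SEC_PER_GYR

early = hubble_time_gyr(67.4)  # early-Universe value: ~14.5 Gyr
local = hubble_time_gyr(73.0)  # local distance-ladder value: ~13.4 Gyr
print(f"early-Universe H0: {early:.1f} Gyr, local H0: {local:.1f} Gyr")
```

A gap of roughly a billion years in this crude age estimate is far larger than the stated error bars of either measurement, which is why the discrepancy is treated as a genuine puzzle rather than noise.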
We've known for a long time that different branches of science give different values for the rate of expansion of the universe. That's a clear sign that something is wrong with our theories. Another problem for me is the invention of things like dark matter and dark energy to plug gaps in the theory. Whatever happened to Occam's razor?
Experiments are done to verify theories, not the other way around (inventing theories to justify the results).
Theoretical physicists have gotten away with bullsh... successfully so far because of the limits of our instruments... until JWST.
Another problem for me is the invention of things like dark matter and dark energy to plug gaps in the theory. Whatever happened to Occam's razor?
"the problem-solving principle that recommends searching for explanations constructed with the smallest possible set of elements." - Nothing "happened" to Occam's razor, we just haven't found an explanation with fewer elements than dark matter & energy yet.
Cosmic distances are calculated by more than one means. The effort to measure the distance to celestial objects began in Ancient Greece with the parallax method, which uses two widely separated observing points as the base of a triangle so that the perpendicular distance can be calculated from the measured angles. The first target, of course, was our Moon, observed from distant points on the Earth's surface. Modern stellar parallax instead uses the Earth's orbit as the baseline, comparing observations taken six months apart; this works well out to a few hundred parsecs with modern instruments. For greater distances there are several methods, most depending on some "Standard Candle".
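The parallax geometry above can be sketched in a few lines. By construction of the parsec, a parallax angle of p arcseconds corresponds to a distance of 1/p parsecs; the small-angle triangle with a 1 AU baseline gives the same answer. The Proxima Centauri parallax (~0.768 arcsec) is the standard textbook example:

```python
import math

# Sketch of the parallax geometry: the Earth-Sun distance (1 AU) is the
# baseline, and the measured parallax angle p gives the distance.
# By definition of the parsec, d[pc] = 1 / p[arcsec].

AU_KM = 1.496e8  # kilometres in one astronomical unit

def parallax_distance_pc(p_arcsec: float) -> float:
    """Distance in parsecs from a parallax angle in arcseconds."""
    return 1.0 / p_arcsec

def parallax_distance_km(p_arcsec: float) -> float:
    """Same distance via the small-angle triangle, in kilometres."""
    p_rad = math.radians(p_arcsec / 3600.0)
    return AU_KM / math.tan(p_rad)

print(parallax_distance_pc(0.768))   # ~1.30 pc (Proxima Centauri)
```

Both functions agree to within rounding, which is exactly why the parsec was defined this way: it makes the triangle arithmetic trivial.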
One such "candle" is the Type Ia supernova. Type Ia supernovae that are relatively close to us appear to have a consistent intrinsic brightness, so their apparent brightness tracks distance very well for nearby objects; just how "standard" they remain throughout the Universe is an educated assumption, since we don't know whether mass has always been the sole contributor to their brightness. We try to compare several accepted "candles" to arrive at some level of average accuracy. These methods will improve over time as new technology develops, and they have already improved accuracy many fold over 19th- and 20th-Century measurements. It is currently still a bit like constructing the Pyramids with bits of string and a wheel.
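A minimal sketch of how a standard candle yields a distance: if the absolute magnitude M is known (Type Ia supernovae peak near M ≈ -19.3), the measured apparent magnitude m gives the distance through the distance modulus m - M = 5 log10(d_pc) - 5. The M value and the example m below are illustrative round numbers, not a real measurement:

```python
# Standard-candle distance from the distance modulus:
#   m - M = 5 * log10(d_pc) - 5   =>   d_pc = 10 ** ((m - M + 5) / 5)
# M here is the approximate Type Ia peak absolute magnitude; the example
# apparent magnitude is an illustrative round number.

M_TYPE_IA = -19.3  # approximate peak absolute magnitude of a Type Ia SN

def candle_distance_pc(m_apparent: float, m_absolute: float = M_TYPE_IA) -> float:
    """Distance in parsecs from apparent and absolute magnitudes."""
    return 10 ** ((m_apparent - m_absolute + 5.0) / 5.0)

# A Type Ia that peaks at apparent magnitude 19.3 sits ~525 Mpc away:
d_mpc = candle_distance_pc(19.3) / 1e6
print(f"{d_mpc:.0f} Mpc")
```

Note that the whole chain stands or falls on M really being constant; if the candles drift with cosmic time, every distance computed this way drifts with them, which is the worry raised above.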
The reason I mention distance is that those data limitations, plus the gaps in our understanding of gravity, lead to substantial conflicts in things like the rate of expansion and the rotation speeds of galaxies, which in turn led to the theories of Dark Matter and Dark Energy. Dark Matter was first hypothesized by Lord Kelvin in the late 19th Century.
By ~1930 a number of astronomers began to publish observations that simply could not be reconciled with the then-current understanding. This took a leap into theory beginning with Fritz Zwicky in 1933, was furthered by numerous astronomers' data throughout the early 20th Century, and reached a tipping point with a few astronomers mid-Century, perhaps most notably Vera Rubin in the late 60s and 70s.
I've left out a great deal (obviously), but the point is that these theories are each named "Dark" for a reason. Something is causing the ever-mounting number of observed phenomena, much as, before molecules of gas were understood, few knew what "wind" was, only that it obviously existed. Similar progress occurred with simple fire. Occam's Razor is in full effect; we are just in the very primitive stages of figuring out precisely what is responsible. For example, one competing theory was (and I suppose still is, among a few) MOND, which essentially proposes that gravity behaves differently at very low accelerations. It, however, fails to explain numerous observations that Dark Matter and Dark Energy handle with some aplomb.
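The rotation-curve discrepancy behind all this is easy to illustrate. For a star orbiting outside most of a galaxy's visible mass, Newtonian gravity predicts a Keplerian circular speed v = sqrt(GM/r), falling off as 1/sqrt(r), whereas observed rotation curves stay roughly flat at large radii. A sketch with round illustrative numbers (the visible-mass figure below is a made-up but plausible ~5e10 solar masses, not data for any real galaxy):

```python
import math

# Keplerian circular speed around a point mass: v = sqrt(G*M/r).
# Newtonian gravity predicts this falls as 1/sqrt(r); observed galactic
# rotation curves instead stay roughly flat at large radii. That gap is
# what Dark Matter (extra unseen mass) and MOND (modified dynamics)
# each try to explain. Mass and radii are illustrative round numbers.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 1.0e41     # ~5e10 solar masses of visible matter, in kg
KPC = 3.0857e19        # metres in one kiloparsec

def keplerian_speed(r_m: float, m_kg: float = M_VISIBLE) -> float:
    """Circular orbital speed (m/s) around a point mass m at radius r."""
    return math.sqrt(G * m_kg / r_m)

v_inner = keplerian_speed(5 * KPC)   # ~208 km/s
v_outer = keplerian_speed(20 * KPC)  # ~104 km/s: quadrupling r halves v
print(f"{v_inner/1000:.0f} km/s at 5 kpc, {v_outer/1000:.0f} km/s at 20 kpc")
```

Real galaxies show nearly the same speed at both radii, so either there is far more mass than we can see, or the v = sqrt(GM/r) law itself breaks down at those scales.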
With so many major advances taking place once we started building high-altitude balloons, and especially rockets and incredible telescopes, it's easy to forget that many textbooks as late as the mid-to-late 1950s still taught that the entirety of the Universe consisted of but one galaxy, ours.
"For the first time since November, NASA’s Voyager 1 spacecraft is returning usable data about the health and status of its onboard engineering systems. The next step is to enable the spacecraft to begin returning science data again." https://www.jpl.nasa.gov/news/nasas-...dates-to-earth
I haven't looked here in a while. When I posted this thread (Post #1), everyone took turns flaming me for doing so. But it now turns out you're actually having the sort of good-natured discussion I intended, as there are consequences (and casualties) because the JWST works.
I'm pleased that this thread has somewhat turned into a fact-based discussion, but there are still quite a few posts revealing the authors don't understand how the Scientific Method works or what actually motivates scientists.
JWST was designed on purpose NOT to confirm what we already theorize (that would actually be next to useless for scientists) but precisely to find flaws. THAT'S the revelation of future paths that garner grant money not to mention describes the cutting edge of what will refine those theories and later become the new standards... for a time.
In short, scientists are not at all upset that some initial data seems to fly in the face of some aspects and details of current theory. Most are overjoyed! After all that's why it was made.