3 Starlight, distance & time


A No one has been to any distant star to record when its light left, so the time of departure cannot be known directly.

B Stars have remained visible in the same general positions since records began.


Star distances.

These are generally inferred from red shifts, a change in the light spectrum explained by the Doppler effect. This method is accepted by most scientists, but not universally. It is not proven & cannot be.

It certainly seems that stars are millions of light years away & that the universe is expanding.
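As a rough sketch of the red-shift calculation behind such distance estimates (the observed wavelength below is a made-up example value; converting velocity to distance further requires Hubble's law, which is not discussed here):

```python
# Sketch of a red-shift measurement: compare an observed spectral line
# with its known rest wavelength. OBSERVED is a hypothetical value.
REST = 656.3        # rest wavelength of the hydrogen-alpha line, nm
OBSERVED = 662.9    # hypothetical observed wavelength, nm
C = 299_792.458     # speed of light, km/s

z = (OBSERVED - REST) / REST   # the red shift
v = C * z                      # recession velocity (valid for small z), km/s

print(f"z = {z:.4f}, v = {v:.0f} km/s")
```

A positive z means the line has shifted toward the red end of the spectrum, indicating the source is receding.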

Did you know that the bible says that God ‘stretched out the heavens’ (the universe around us)?


Light travel times.


1 Light has been travelling at its current speed (3×10⁸ m/sec, or 186,000 miles/sec) throughout the past. This would mean that if stars are billions of light years away, the universe is billions of years old.
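The scale involved is easier to grasp with a quick calculation converting the speed of light into distance per year (a minimal sketch; the 1,000 light year figure is an arbitrary example):

```python
# How far is one light year, given the current speed of light?
C_KM_S = 299_792.458                   # speed of light, km/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, in seconds

light_year_km = C_KM_S * SECONDS_PER_YEAR
print(f"1 light year = {light_year_km:.3e} km")

# By definition, light from a star 1,000 light years away has been
# travelling for 1,000 years at the current speed of light.
```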


The bible indicates that the Earth/universe is about 6 thousand years old. Here are some explanations for this paradox:


2 God created the cosmos with light already ‘in transit’, so starlight has been arriving at the Earth since the beginning of time: a mature universe. Likewise God made Adam & Eve as adults, though their actual age was zero.


3 The speed of light was much higher in the past. Historical records of measurements of the speed of light show a decline to its current value. If this were so, light from distant stars would have reached Earth much faster than we would expect today.


Theoretical physics.


4 Euclidean & 4D space. This is a secular scientific concept in which light from nearby sources travels through flat (Euclidean) space, while light from distant sources follows paths through curved space & so arrives on a much shorter timescale (see below). On this view, light from anywhere in the universe would arrive within 15 years.


5 White Hole relativity.

Relativity has shown that time varies according to the gravity produced by large masses.

A Black Hole is an object whose gravity is so great that it sucks in matter & even light cannot escape.

White Hole cosmology is the reverse of a Black Hole: it expels all matter, in accordance with Einstein’s relativity theory. All matter passing through the event horizon surrounding the hole would undergo time dilation, with the result that time outside passes much faster than inside. The Earth would be late in exiting & is now at or near the centre of the universe. While days pass on Earth, distant objects experience millions of years in local time, so light from the stars would have vast amounts of time to travel to Earth.
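The time dilation invoked here is the standard gravitational effect of general relativity. As an illustration only (the radius chosen below is arbitrary, not a parameter of the white-hole model), the Schwarzschild dilation factor can be computed:

```python
import math

def dilation_factor(r_s, r):
    """Ratio of the clock rate at radius r, outside a mass with
    Schwarzschild radius r_s, to the rate of a clock far away."""
    return math.sqrt(1.0 - r_s / r)

# Illustrative numbers: a clock at 1.01 Schwarzschild radii ticks at
# roughly a tenth of the rate of a distant clock.
f = dilation_factor(1.0, 1.01)
print(f"local clock rate = {f:.4f} of the distant rate")
```

The closer the clock sits to the event horizon (r approaching r_s), the more slowly it runs relative to the outside, which is the asymmetry the cosmology relies on.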


6 Unidirectional travel time.

There is a hypothesis that the speed of light is not the same in every direction. When light travels towards an observer its speed is infinite; when travelling away it is half the accepted value (c/2).
Because only the round-trip (two-way) speed of light can ever be measured, this cannot be verified or disproven.
However it would remove any problem of seeing stars, billions of light years away, in a 6 thousand year old universe.
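A quick check shows why this convention is unmeasurable: the round-trip speed, which is all any experiment can determine, still comes out to the accepted value c. The mirror distance below is an arbitrary example:

```python
C = 299_792.458    # accepted two-way speed of light, km/s
d = 1_000_000.0    # hypothetical one-way distance to a mirror, km

# Anisotropic convention: c/2 on the outbound leg, infinite on return.
t_out = d / (C / 2.0)   # outbound travel time, s
t_back = 0.0            # instantaneous return leg

round_trip_speed = 2.0 * d / (t_out + t_back)
print(round_trip_speed)  # identical to C
```

Whatever distance is chosen, the outbound leg at c/2 takes exactly as long as both legs would at c, so no two-way experiment can distinguish the two conventions.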


Like most things in science, these are explanations for observable effects. Unless they can be directly observed & repeatably tested they cannot be proven, & must simply be believed or not believed.


Steve Martin


Additional information.

Does Light Travel in a Straight Line?

It is commonly believed that light travels in a straight line, and this is essentially correct. In effect, light explores all paths between two locations and takes the one that yields the shortest travel time, in keeping with Fermat’s Principle of Least Time. In three-dimensional, Euclidean space that least-time path is always a straight line, but the problem is that this is not the only way of understanding the universe.

Einstein introduced the idea of four-dimensional space, a universe that is not fixed but constantly changing. If Fermat’s Principle is applied with this in mind, light instead travels along a geodesic, a ‘straight line’ on a curved surface. Light itself is a wave, so the idea that it literally travels in a straight line is more of an explanation for human understanding than the exact reality of the situation. In short, in traditional understandings of the universe that incorporate Euclidean ideas as to the nature of space, light does, without a doubt, travel in a straight line, unless the least-time path is altered by the medium in its way (this explains refraction in water, etc.). However, adopting Einstein’s now largely accepted theory of four-dimensional space, the shortest-time rule still applies, but the least-time path is no longer necessarily a straight line. This ties into the wave theory of light.
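Fermat's least-time principle mentioned above can be checked numerically for refraction at a boundary between two media. A brute-force sketch (the geometry and refractive indices are made-up example values):

```python
import math

# Light travels from A = (0, 1) in medium 1 to B = (1, -1) in medium 2,
# crossing the boundary y = 0 at some point (x, 0).
n1, n2 = 1.0, 1.33   # example refractive indices, e.g. air and water

def travel_time(x):
    # travel time = path length x refractive index, in units where c = 1
    return n1 * math.hypot(x, 1.0) + n2 * math.hypot(1.0 - x, 1.0)

# Brute-force scan for the crossing point that minimises travel time.
best_x = min((i / 100000.0 for i in range(100001)), key=travel_time)

# At the least-time path, Snell's law emerges: n1*sin(t1) = n2*sin(t2).
sin1 = best_x / math.hypot(best_x, 1.0)
sin2 = (1.0 - best_x) / math.hypot(1.0 - best_x, 1.0)
print(n1 * sin1, n2 * sin2)   # the two sides agree closely
```

The bent path that minimises travel time is exactly the refracted path predicted by Snell's law, which is the sense in which light "chooses" the quickest route rather than the geometrically straight one.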

Still, a straight line is a valid way to understand how light travels, in either three- or four-dimensional space, unless it is deflected by an outside influence such as gravity or refraction, as may be seen when light passes close by a star or black hole. One simplified way of seeing how directly light travels comes from shadows on a sunny day: when the day is partially cloudy and the light is scattered, shadows are blurry and indistinct, but in direct sunlight they are sharply defined.

Light may be seen and understood as traveling in a straight line, but modern theories suggest that, technically, matters work otherwise.


Reactions and Arguments:

Three reactions to the decrease in the measured value of c were summarized by Dorsey, after admitting that the idea of c decay had ‘called forth many papers.’ He stated that ‘Not a few of their authors seem to be very favorably impressed by the idea of a secular variation, some seem to be favorable to it but unwilling to commit themselves, and some are strongly critical.’ Dorsey himself was in the last category as eventually was R.T. Birge. Nevertheless, in 1941 even Birge acknowledged that ‘these older results are entirely consistent among themselves, but their average is nearly 100 km/s greater than that given by the eight more recent results’. In this, history repeated itself. In 1886, Newcomb, who had obtained some of those ‘older results’ mentioned by Birge, stated that the still older results around 1740 were also consistent but placed c about 1% higher than in his own time.

This persistent trend was countered by three arguments. Initially, it was deemed contrary to Einsteinian theory, but, as indicated above, the truth appears to be otherwise. The second argument recognized, as Newcomb’s and Birge’s statements do, that the measured values of c were differing with time. Dorsey proposed in 1944 that perhaps the measuring equipment was at fault or that the trend was an artifact of more sophisticated procedures. However, his lengthy analysis still left the early c values above the current value. He concluded that all measurements prior to 1928 were unreliable, extended their error limits, and claimed that c decay could be rejected on these grounds.

However, Dorsey did not address the main problem: he failed to demonstrate why supposedly unreliable equipment should produce a systematic trend in the measured values of c. Indeed, if c were constant, error theory indicates that there should be a random scatter about a fixed value. This is not observed. Instead, the analysis below shows a statistical decay trend for c measured by 16 different methods, individually as well as collectively. This tends to negate Dorsey’s contention, since it represents one chance in 43 million of being the coincidence he implied (each trend could be increasing, decreasing or static). Furthermore, in the seven instances where the same equipment was used in a later series of experiments, a lower c value has always resulted at the later date. Dorsey had no satisfactory explanation for this phenomenon.
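The 'one chance in 43 million' figure appears to come from treating each of the 16 measurement methods as independently capable of showing any of the three trends; checking that arithmetic:

```python
# Each of the 16 measurement methods could, in principle, show an
# increasing, a decreasing, or a static trend. The number of equally
# likely combinations, of which "all 16 decreasing" is just one:
methods = 16
trends = 3
combinations = trends ** methods
print(combinations)   # 43046721, i.e. about 1 chance in 43 million
```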

Birge gave a third reason for rejecting c decay. After noting that wavelengths and length standards were experimentally invariant over time, he stated that ‘if the value of c…is actually changing with time, but the value of (wavelength) in terms of the standard meter shows no corresponding change, then it necessarily follows that the value of every atomic frequency…must be changing. Such a variation is obviously most improbable….’ Ironically, this is the very effect that Van Flandern observed experimentally. Indeed, the analysis below shows that when the basic equations are worked through with energy conservation in mind, the conclusion emerges that the emitted frequency of light from atoms is the quantity varying with c and wavelengths do remain unchanged. The constraint of energy conservation based on constant length standards (including dynamical and atomic distances) alone appears to give predicted trends in the values of other atomic constants that are consistent with measurement and observation. As Birge pointed out in his article, invariant length and wavelength standards are upheld experimentally.

More recently, it has been suggested that measured values became ‘locked’ around some canonical value, an effect called ‘intellectual phase locking’. This hardly accounts for the confirmatory trends in other atomic constants, nor the lower values obtained when the same c-measuring equipment was used for a later experiment. Dorsey’s reworked results also deny it. Furthermore, when many of the measurements were being made, c behavior was still a matter for debate and appropriate descriptive curves were discussed.

However, since the 1940s, a different attitude to the value of c has prevailed, which may itself be a form of intellectual phase-locking. As one reviewer pointed out, Aslakson’s measurements with the ‘SHORAN’ navigation system in 1949 required a higher value for c than was then accepted in order to agree with geodetic distances. He delayed publication for several years while he searched for supposed errors in his system. As it turned out, his experimental value was correct within its error limits, and the accepted c value was too low, for reasons discussed later. The importance of experimental results over accepted norms is thereby well illustrated.

Accordingly, it seems appropriate to re-examine all experimental determinations of c and related atomic quantities to establish what these results actually reveal. The initial results of the investigation are hereby presented.


Roemer-type Experiments

   Observer      Date        Delay (s)   Value of c (km/s)   Notes
 * Roemer        1673 ± 5    ?           317,700             Corrected method
 * Newton        1706        480         311,660             Approximate value
   Delambre      1738 ± 71   493.2       303,320 ± 310       Mean of 1000 observations
   Martin        1759        493.0       303,440
   Price         1770        492.0       304,060
   Encyc. Brit.  1771        495.0       302,220             Accepted value
 # Bode          1778        487.5       306,870
 # Boscovich     1785        486.0       307,810
   Glasenapp     1861 ± 13   498.57      300,050 ± 60        Mean of 320 observations
   Sampson       1877 ± 32   498.64      300,011             Private reduction
   Harvard       1877 ± 32   498.79      299,921 ± 13        Harvard reduction