Alright, I thought you were going to hit me with some heavy physics stuff.
I think if light speed increased, then time would remain constant. Light speed has remained stable for many decades now since we're at the linear edge of the curve calculated by Setterfield. To observe what your analogy is referring to, we would have to go back in time to when light speed was dropping exponentially, given the assumption that his hypothesis can be extrapolated back in time.
What time are you saying would remain constant? Remember, the whole point of the c-decay idea is to come up with a way for light from distant stars emitted 6000 years ago to cross 12 billion light years of distance. The idea requires that the light we are seeing from the most distant stars was first sent on its way back when light was travelling much faster, not when it was on the flatline part of Setterfield's curve. If the light from distant stars was emitted when light speed was stable, then we'd be unable to see it, since according to creationists 12 billion years hasn't passed. (Don't you think it's odd that light speed flatlines the moment we develop accurate ways to measure it?)
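To put a rough number on that, here's a quick back-of-the-envelope sketch in Python; the 12-billion-light-year figure is just the round number used above, not a precise measurement:

```python
# Rough arithmetic for the distant-starlight problem, using the round
# numbers from above (12 billion light years crossed in 6000 years).
distance_ly = 12e9      # distance to the most distant stars, in light years
age_yr = 6000           # young-earth timescale, in years

# Average speed (in multiples of today's c) the light would need
# in order to cover that distance within that time.
required_ratio = distance_ly / age_yr
print(f"Light would need to average {required_ratio:,.0f}x today's speed")
# -> Light would need to average 2,000,000x today's speed
```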
We don't need to go back in time at all. I'll step through the example; please identify where you think time travel is necessary or where I'm making an unjustified assumption.
step 1. Assume that at some point in the past light moved faster than it does now
step 2. A bit of light is emitted by the star in the past
step 3. This bit of light travels at the faster rate for 1 second.
step 4. A second bit of light is emitted by the star in the past
step 5. This second bit of light is some distance D behind the first bit of light where D = (fast speed of light per second) x (1 second).
step 6. Gradually the speed of light decreases, both bits of light get slower and slower.
step 7. The distance between the bits of light never changes because the speed of light is the same for both bits of light.
step 8. When the bits of light reach earth in the present, c has decayed to its present-day value. The first bit of light hits your eye.
step 9. The second bit of light hits your eye some time T later where T = D/(current speed of light).
step 10. T is greater than 1 second, since T is simply the ratio of the starting speed to the current speed: T = (fast speed of light)/(current speed of light) x 1 second
For distant stars, this ratio needs to be exceptionally large for the light to reach us within 6000 years, which means the slow-down effect is correspondingly large.
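Here's a minimal numerical sketch of those steps. The 2,000,000x ratio comes from the rough arithmetic earlier and is only illustrative; the point is that the arrival interval T scales directly with whatever the ratio actually is:

```python
# Two bits of light emitted 1 second apart when light was 'ratio' times faster.
ratio = 2_000_000            # illustrative starting-speed / current-speed
c_now = 299_792_458          # current speed of light, m/s
c_fast = ratio * c_now       # assumed faster speed at emission

# Step 5: separation between the two bits of light, fixed at emission.
D = c_fast * 1.0             # metres (1 second of travel at the fast speed)

# Steps 6-7: both bits slow down together, so D never changes.
# Step 9: arrival interval once both are moving at today's speed.
T = D / c_now                # seconds

print(f"Emitted 1 s apart, they arrive {T:,.0f} s apart")
print(f"i.e. roughly {T / 86400:.0f} days between them")
# A 1-second interval at the star would be seen here stretched by the
# same factor the speed of light dropped by.
```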
The problem you're having here is that you're going to assume the untenable "uniformitarian" conclusion that the speed of light is as it has always been, AND that there is absolutely nothing between those stars and us that would affect the speed of light. Therefore your attempt at an irrelevance argument is itself irrelevant.
Could you please show where I've assumed that the speed of light is constant? If you look at my examples I've actually assumed the opposite. In the examples I start by assuming that light has not been a constant speed and then show how the consequences of that assumption are inconsistent with direct observations of both rotating stars and radioactive decay.
If there's something slowing light down between stars and us, that just makes the problem of distant starlight worse for creationists. We can ignore the idea of some general thing that makes light faster in space but not on earth for several reasons. First, because the burden of proof would be on the claimant, and since nobody has produced any reason to think such a thing exists, that burden has not been met. Second, because light that is reflected toward earth crosses distances from source to reflector at rates consistent with the current earth-based speed of c. Such a concept is unrelated to c-decay, so I'm not going to spend too much time on it. If you want to propose something that makes c faster between stars but not on earth, I'd be interested in seeing your evidence.
Having said that, yes, we are observing historical light from those stars; BUT we don't know that history. Therefore, whatever numbers you attempt to promulgate are assumptive.
The light is the history; that's the whole point. If light slowed down, the rate at which light from stellar events arrives at earth would differ in identifiable ways from the rate at which it would arrive if light speed were constant. To summarize the examples I've presented into a single testable statement: if light speed was fast enough in the past to allow light to cross the universe in 6000 years, it would be impossible for us to observe millisecond pulsars at far distances, because it would mean they were actually spinning too fast to exist. We do observe millisecond pulsars at far distances.
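To make that concrete, here's the same arithmetic applied to a hypothetical pulsar. The 5-millisecond observed period and the 2,000,000x ratio are illustrative numbers, not measurements of any particular pulsar:

```python
# If the light from a distant pulsar was emitted when c was 'ratio' times
# faster, the interval we observe between pulses is stretched by that ratio
# (same reasoning as the two-photon example above). So the pulsar's actual
# rotation period at emission would have to be the observed period / ratio.
ratio = 2_000_000                 # illustrative emission-speed / current-speed
observed_period_s = 0.005         # a typical millisecond-pulsar period, 5 ms

actual_period_s = observed_period_s / ratio
print(f"Actual rotation period at emission: {actual_period_s:.2e} s")
# -> 2.50e-09 s, i.e. the star would have to spin ~400 million times per
#    second, far beyond what any neutron star could survive.
```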
However, considering our small amount of time to study stars and such, how would we be able to determine decay rates from supernovas without first making an assumption about what the rate "used" to be?
The isotopes studied have half-lives of days or weeks here on earth; we can watch their production and decay in stars over the course of a few months and verify that the rates look the same in space as they do on earth.
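As a rough illustration of how that comparison works, here's an exponential-decay sketch. The 77-day half-life is approximately that of cobalt-56, one isotope commonly tracked in supernova light curves, and is used here only as an example:

```python
# Exponential decay: fraction remaining = 0.5 ** (t / half_life).
# Fit the observed brightness decline of a supernova to a curve like this,
# and the half-life you recover can be compared with the lab-measured value.
half_life_days = 77.0             # ~ cobalt-56 half-life, for illustration

def remaining_fraction(t_days):
    return 0.5 ** (t_days / half_life_days)

for t in (0, 30, 60, 90, 120):
    print(f"day {t:3d}: {remaining_fraction(t):.2f} of the isotope remains")
# If light's history distorted the observed timeline, the decline we see
# would not match the earth-measured half-life.
```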
Yes, something can be speeding up whilst at the same time being slowed down. However, if the speed of light is slowING (notice the present tense here), then it is a contradiction to say that it is speeding up at the same time that it is slowing down. Your analogy only applies if light was slowed down but then started to speed up again. This would only apply if the speed of light was a random variable; do you claim that the speed of light is a random variable?
It is not the speed of light that is speeding up; it is the events that light shows which are speeding up. We'd see a star spinning slightly faster every year, because there would be a shorter interval between the arrival of each batch of photons that show the star rotating. Each batch of photons would be moving at a slightly slower speed than last year's batch (assuming c is continuing to decay); they'd just be a smaller distance apart than last year's batch, which means they'd hit your eyes at a faster rate.
Look at the example at the start of this post. D = (fast speed of light per second) x (1 second) and T = D/(current speed of light). In the past, as the speed of light decayed, the 'fast speed of light per second' would get smaller. This means D would get smaller, which means that T is smaller. Things would look like they are gradually happening faster, because each photon we see started out slower than the previous one.
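Here's a small sketch of that year-over-year effect, using a made-up decay schedule for c purely for illustration:

```python
# Each year's batch of photons was emitted when c was a bit slower than the
# previous year's batch (c is still decaying in this scenario), so D and
# therefore the arrival interval T shrink year over year: events appear to
# run slightly faster each year. The 1% annual drop is purely illustrative.
c_now = 1.0                            # today's speed of light, arbitrary units
emission_speeds = [2.00, 1.98, 1.96]   # c at emission for three successive batches

for year, c_fast in enumerate(emission_speeds, start=1):
    D = c_fast * 1.0                   # separation of photons emitted 1 s apart
    T = D / c_now                      # interval between their arrivals today
    print(f"batch {year}: emitted 1 s apart, arrives {T:.2f} s apart")
# batch 1: 2.00 s, batch 2: 1.98 s, batch 3: 1.96 s -> the star appears to
# spin up, even though each batch started out slower than the last.
```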