
Radiometric Dating


ARGOSY




So, we are mistaken about the number of daughter nuclei in each sample, for each isotope, by the *exact amount* needed to make us converge on a coherent picture that just happens to date the earth to billions of years old? That doesn't really answer my challenge. There are thousands of these tests done now, using multiple different radioactive isotopes. What you suggest is a massive coincidence. Maybe God put just the right amount of daughter nuclei in with every parent so that it would look like some stratum is 60 million years old, and that it will look that way for every test you do on a sample in that stratum, when really it's only 10,000 years old after all? And then that is true for all the various strata that are tested. Hopefully you'll see why I don't take such a suggestion as feasible.

 

When you wrote this, I assumed you were responding to Enoch and so I didn't reply.

 

There's no coincidence; I believe radiometric dating is an accurate reflection of relative dates. When there is a variance in radiation (e.g. solar flares, July, midnight, the 33-day cycle), there is an observed and scientifically tested variance in decay across all unstable isotopes, both alpha and beta decay. That is what the studies indicate. So of course there is consilience: decay rates vary in sync with each other because the radiation effect is universal among all unstable isotopes.

 

Okay, and you think that will make up for the 4-billion-year shortfall just exactly so? Have you tried looking at the number and intensity of events you would need in 6,000 years to make it seem that way, given the rather small variance?

 

 

I'm not a YEC; I think this effect could very well make up for the 600-million-year shortfall since life was first detected. The focus of studies has been short-lived isotopes; the detected effect would logically be higher for long-lived isotopes.

 

The idea is that for slowly decaying isotopes, the slight energising effect of background radiation could maintain nearly every atom of the parent isotope in an energised/unstable state. The loss of energy nearly equals the gain of energy, and this equilibrium makes decay very slow. Remove the energising effect of background radiation, and the parent isotope will suddenly decay at much higher rates. A factor of 100,000 is not impossible; we will have to start counting atoms in a sample to get to the bottom of the possibilities.

 

Okay, so you aren't YEC, but you think that life was created 6k years ago? I just want to be sure I understand your position correctly.

 

As to the rest, have you looked at the rates of variation that have been published? It wouldn't be hard to look at them and calculate a maximum. Also, you'd need to be sure all rates are affected the same way. While you save yourself an order of magnitude going from 4.5 billion years to 600 million years, you are still looking at five orders of magnitude of difference between 6,000 years and 600,000,000 years. From what I have seen of the rates at which radioactive decay is altered by non-destructive radiation, there is no way you are going to come close to this.
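To put numbers on that gap, here is a trivial check (plain Python, using only figures already mentioned in the thread):

print(600_000_000 / 6_000)          # 100000.0 -> the five-orders-of-magnitude gap
print(4_500_000_000 / 600_000_000)  # 7.5 -> roughly the single order of magnitude saved by dating life rather than the earth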




Yes, that is correct: I'm a YBC, a young-biology creationist.

 

You cannot be sure of your guesswork unless you produce figures. Until then, the assumption that small cause = small effect, therefore large cause = large effect, is more logical than the assumption that small cause = small effect, therefore large cause = small effect.

 

Let me illustrate by an approximate analogy:

Imagine we have two bottles, each containing a billion marbles, with a small hole at the bottom draining the marbles. The first bottle is draining at 100 million marbles a day; the second bottle is draining at 1 million marbles a day. The drain times are therefore 10 days and ~1,000 days respectively. In the analogy this represents the natural rate of parent isotope decay without radiation.

 

Then imagine that we put 999,999 marbles back into both bottles every day. This produces about a 1% change for the short-life bottle of marbles: it will drain in ten days and a bit, hardly any change at all. But in the long-life bottle of marbles this causes a change from 1 million marbles a day to 1 marble a day. The drain rate appears to be extremely slow, just like modern decay appears extremely slow in long-life isotopes. Remove the daily added marbles and suddenly the drain rate changes by a factor of a million.
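A minimal sketch of the bottle arithmetic in Python, using exactly the numbers above (the variable names are just for illustration):

# Two bottles of marbles, numbers taken from the analogy above.
start = 1_000_000_000        # a billion marbles in each bottle
drain_fast = 100_000_000     # gross outflow, first bottle (marbles/day)
drain_slow = 1_000_000       # gross outflow, second bottle (marbles/day)
refill = 999_999             # marbles put back into each bottle every day

net_fast = drain_fast - refill   # 99,000,001 per day -> empties in ~10.1 days
net_slow = drain_slow - refill   # 1 per day -> would take ~1,000,000,000 days
print(start / net_fast, start / net_slow)

Stopping the daily refill returns the second bottle to its gross outflow of a million marbles a day, which is the million-fold jump the analogy points at.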

 

This is what is happening to isotopes at the moment: the radiation is energising the parent isotopes, keeping them in an unstable state. They are unable to decay because the loss of energy over time is approximately equal to the gain of radiation every day, so very few atoms actually experience decay. Block out that radiation and we could have a 100,000-fold increase in decay rates. This is the possibility/likelihood I'm referring to.




Well, let's say that we increase the rate of radioactive decay by 1%. Imagine we have a sample whose half-life we understand to be a billion years. The associated decay constant to put into the exponential decay formula is ln(2)/1e9 ≈ 6.93e-10 per year; a 1% increase takes it to roughly 7.0e-10. Now I put this new rate in and see how much time it takes to get to 50% of the original population: about 990 million years. Hopefully you can see my problem here. A 1% increase in rate is pretty big and still yielded a half-life close to a billion years, so how much of a change in decay rate do we need to bridge the gap between 1 billion years and 10,000 years? The decay constant for an isotope with a 10,000-year half-life is about 6.93e-05. That is a decay rate five orders of magnitude larger. Do you have *anything* which suggests such an enormous change in decay rate?

 

Now, about your illustration: this is an enormous change in decay rate, one that we'd surely notice by altering the environment, which is exactly what experimentalists have done. But to address your example: suppose we start with 6e23 nuclei. With the billion-year half-life, a year of decay removes only about 4e14 of them, leaving essentially all 6e23 parent nuclei; with the 10,000-year rate we are left with about 5.99996e23, a difference of roughly 4e19 nuclei. Certainly a distinguishable circumstance!
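Here is a minimal Python sketch of the arithmetic behind both paragraphs above; it assumes nothing beyond the standard exponential-decay formula N(t) = N0·e^(−λt), with λ = ln 2 / half-life:

import math

n0 = 6e23                              # starting number of parent nuclei
lam_slow = math.log(2) / 1e9           # decay constant for a 1e9-year half-life (~6.93e-10 per year)
lam_fast = math.log(2) / 1e4           # decay constant for a 1e4-year half-life (~6.93e-5 per year)

# A 1% faster rate barely moves a billion-year half-life.
print(math.log(2) / (lam_slow * 1.01))     # ~9.9e8 years

# Parent nuclei left after one year under each rate.
left_slow = n0 * math.exp(-lam_slow * 1)   # loses only ~4e14 atoms
left_fast = n0 * math.exp(-lam_fast * 1)   # loses ~4e19 atoms
print(left_slow - left_fast)               # ~4e19 nuclei: easily distinguishable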

 

Check this out: http://phys.org/news202456660.html — it reports a variation of about a tenth of a percent. Very interesting, but it isn't going to give the sort of effect you'd need. What sort of environment do you think there would have to be over the last 6,000 years to, even in principle, make up for a five-order-of-magnitude difference in rates, consistently across the board? I have to emphasize that last part as well, since different nuclei decay by different physical processes, so I would not expect the same influence to affect them all in precisely the same manner.
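As a rough, purely illustrative comparison of the reported ~0.1% variation with the factor of ~100,000 discussed above, assuming for the sake of argument that such boosts would compound multiplicatively:

import math

observed_boost = 1.001        # ~0.1% rate variation, as reported in the phys.org article
needed_factor = 100_000       # the five-orders-of-magnitude speed-up discussed above

# How many 0.1% boosts would have to stack to reach a 100,000-fold change?
print(math.log(needed_factor) / math.log(observed_boost))   # ~11,500

Even granting the observed effect, it would have to be compounded on itself more than ten thousand times over to reach the required factor.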

 

I'm confused as to *why* we should think the very recent past was radically different from this moment in time. The only reason I can think you'd propose that is because you are already convinced of some sort of young earth (or young biology here) case.


I'm confused as to *why* we should think the very recent past was radically different from this moment in time.

 

:thumbsup:

 

Beloved, I'm Confused

 

Oh that the salvation of Israel were come out of Zion! When God bringeth back the captivity of his people, Jacob shall rejoice, and Israel shall be glad. Psalm 53:6 (KJV)

 

As To Why You Don't

 

And Jesus Christ our Lord was shown to be the Son of God when God powerfully raised him from the dead by means of the Holy Spirit. Through Christ, God has given us the privilege and authority to tell Gentiles everywhere what God has done for them, so that they will believe and obey him, bringing glory to his name. Romans 1:4-5 (NLT)




That is an interesting challenge Joe.


Faithful

 

But unto the Son he saith, Thy throne, O God, is for ever and ever: a sceptre of righteousness is the sceptre of thy kingdom. Thou hast loved righteousness, and hated iniquity; therefore God, even thy God, hath anointed thee with the oil of gladness above thy fellows. And, Thou, Lord, in the beginning hast laid the foundation of the earth; and the heavens are the works of thine hands: Hebrews 1:8-10

 

And

 

Through faith we understand that the worlds were framed by the word of God, so that things which are seen were not made of things which do appear. Hebrews 11:3

 

True

 

But without faith it is impossible to please him: for he that cometh to God must believe that he is, and that he is a rewarder of them that diligently seek him. Hebrews 11:6

 

~

 


That is an interesting challenge Joe.

 

:thumbsup:

 

Radically Unchanged

 

And now, O Father, glorify thou me with thine own self with the glory which I had with thee before the world was. John 17:5

 

Jesus Christ The Same Yesterday

 

Unto me, who am less than the least of all saints, is this grace given, that I should preach among the Gentiles the unsearchable riches of Christ; And to make all men see what is the fellowship of the mystery, which from the beginning of the world hath been hid in God, who created all things by Jesus Christ: Ephesians 3:8-9

 

Today

 

Thou art worthy, O Lord, to receive glory and honour and power: for thou hast created all things, and for thy pleasure they are and were created. Revelation 4:11

 

And Tomorrow

 

And I beheld, and I heard the voice of many angels round about the throne and the beasts and the elders: and the number of them was ten thousand times ten thousand, and thousands of thousands; Saying with a loud voice, Worthy is the Lamb that was slain to receive power, and riches, and wisdom, and strength, and honour, and glory, and blessing. And every creature which is in heaven, and on the earth, and under the earth, and such as are in the sea, and all that are in them, heard I saying, Blessing, and honour, and glory, and power, be unto him that sitteth upon the throne, and unto the Lamb for ever and ever. Revelation 5:11-13

 

~

 

Be Blessed Beloved Of The KING

 

The LORD bless thee, and keep thee:
The LORD make his face shine upon thee, and be gracious unto thee:
The LORD lift up his countenance upon thee, and give thee peace.

 

And they shall put my name upon the children of Israel; and I will bless them. Numbers 6:24-27

 

Love, Your Brother Joe




You say that experimentalists have tested for this effect; please post a link. When the constancy of decay rates was tested, they did not account for solar/cosmic particles like muons, which penetrate their neutron shielding and create a neutron/radiation background from within the sample.

 

Fact: The earth's magnetic field over the last 6,000 years fluctuated up and down, but it was predominantly significantly stronger than today.

Fact: The earth's magnetic field has a strong shielding effect on solar/cosmic radiation.

Fact: Solar and cosmic radiation cause the earth's surface background radiation.

Fact: Even the slightest decreases in just solar radiation (33-day cycle, 26-day cycle, January) have a slight accelerating effect on decay.

 

So I will repeat my point: it is a fact that slight decreases in just solar radiation cause slight accelerations in decay. When the earth's magnetic field was a lot stronger during the last 6,000 years, there would have been a dramatic decrease in both solar and cosmic radiation, and we can therefore expect a dramatic acceleration of decay rates.

 

This is the reason I believe the recent past (~200 AD and earlier) was radically different from this moment in time.




Are you unaware that decay rates have been tested under a variety of extreme conditions? Pressures, temperatures, and yes, radiation...

 

Are you aware that radioactive decay involves different types of physical processes, some of which will be affected in different ways by the conditions you mention? That puts a serious wrench in your supposition that the decay rates of different isotopes would be affected in identical ways. If you look again at the numbers, you will see that the amount of change needed to account for your shortfall is *enormous* and would have shown up in the experimental work. On the other hand, the variations that have been found, as in the link I provided you, are much too small to help your case. Variations in the earth's magnetic field, and the accompanying fluctuations in cosmic-ray radiation, I submit, will not suffice.

 

I will say this much: what I am arguing now depends on the physical rules of the universe always having been precisely what they are now. Maybe that is what you should not give up at the get-go?

 

I will bow out now though, from this particular exchange with you. I will read whatever you have to say in response but leave it at that.




1) Yes, I am aware that decay rates have been tested in many ways and appeared constant until the recent solar-radiation oscillations were detected. Do you have any evidence of shielding parent isotopes specifically from muons, and of whether that had any effect? Muons cause most of the background radiation at the earth's surface and were relatively unknown during the period when the constancy of radioactive decay was continuously tested.

 

2) Yes, I am aware that parent isotopes decay in various ways, and I have already indicated in this thread that both alpha decay and beta decay have been affected by the "solar effect". You assume that the various forms of decay would be affected in differing proportions; I see no such hint in the studies done so far, so you need further evidence for your supposition.

 

3) You conclude that a dramatic reduction in radiation would have an insignificant effect on decay rates. You make that assertion without providing evidence, even though I have already offered the marble analogy to show how dramatic the effect can be on slowly decaying isotopes.

 

4) We are both basing our views on the physical laws of the universe being constant. Like I keep saying, the radiation/decay relationship is already observed fact. If small reductions in radiation cause small increases in decay, it's only logical that large reductions in radiation cause large increases in decay. I feel you would be more scientific in your approach if you would admit to this logic, since neither of us knows the full extent of the effect on decay when radiation at the earth's surface was greatly reduced in the past.




Well here you go:

 

Just a few minutes ago...this is what I find on YAHOO:  http://news.yahoo.com/blogs/sideshow/camels-in-the-bible-182042100.html

 

 

Headline:  BIG ERROR IN THE BIBLE

 

I'm OFF THE RESERVATION!!!!!!!!!!!!!!!!!!!!!!!!

 

Taking Camel Bones and RADIOMETRIC DATING and trying to cast doubt on GOD'S WORD!!!!!!

 

ARE YOU ABSOLUTELY !@%%^$@  KIDDING ME!!!!!!!!!
