Infinity Question

Is Raziaar stupid and hopeless?


  • Total voters
    48

Raziaar

Okay well... after being called idiot and other names, blah blah blah, for my inability to grasp the concept of infinity when it comes to the infinite monkey theorem, I'm trying to wrap my head around another infinity thing, though quite a bit different. This isn't anything I've read, just something I was wondering about.

I always hear how infinity is well... infinite, and that ANYTHING is possible.

But my question is this. If there was a random number generator that generated numbers of either 1 or 2 for all of eternity, is it POSSIBLE that the number 1 will be the only number generated, infinitely, and that the number 2 will infinitely never be generated?

I just wonder about it... seeing how supposedly ANYTHING is possible. There's an infinite number of chances that the two would come up and negate the check, leading it to be false.

I can't wrap my head around it... I can't just sit here and say that the answer is yes simply because anything is possible in such a scenario.


Oh and don't be afraid of answering the poll honestly and making me feel bad. lol. It's entirely related to my grasp of the math concept of infinity and not anything else.
 
Actually, you can't get a probability for something infinitely generated, because the generating doesn't end.


But in maths terms, I think you could calculate something like it by using an equation that I can't type on the keyboard.
 
To the best of my understanding I would say that both are possible - but very much unlikely.
 
Actually, you can't get a probability for something infinitely generated, because the generating doesn't end.

Oh really?

So my question is shot down by something as simple as that? :(
 
Oh really?

So my question is shot down by something as simple as that? :(

Well, I think you could calculate the concept, but it'd be meaningless because it'd probably turn out to be infinitely close to zero, and therefore zero. :rolling:
 
Joke option?

Like numbers said, I think if it's generating numbers for infinity then you can't say which it will or will not generate over eternity, since it's still going. Always.
 
Joke option?

Like numbers said, I think if it's generating numbers for infinity then you can't say which it will or will not generate over eternity, since it's still going. Always.

I thought about adding a joke option... but then I sat and thought and realized I already had a joke option in place, "No".

I wish there was some way to separate out "will it get there" from "could it happen".

Like if you had some omnipotent power to see the result of what infinity would be.
 
In that case pretty much 0 chance. Now if you had an infinite number of random number generators working for infinity and you were omnipotent (cheat!) then I think it would be a certainty?
 
In that case pretty much 0 chance. Now if you had an infinite number of random number generators working for infinity and you were omnipotent (cheat!) then I think it would be a certainty?

Ohh, that's sort of what I meant; I forgot to mention that there would be an infinite number of them attaining the same goal.
 
In probability, the numbers in a countable set n will appear the same number of times given enough time, provided that every number has an equal chance of appearing. If that time is infinite, then given a large enough result sample m you will find that each number appeared approximately the same number of times. If the numbers do not have an equal chance of appearing, then the numbers with higher probability will have a greater chance of appearing than the numbers with lower probability.

With an infinite set, on the other hand, the same holds true given that the infinite set is countable and all numbers have the same probability p of appearing. Conversely, if you only ever see 2's and never see a 1, you could conclude that 1 has a probability of 0 or very close to 0 (you can't say either with certainty, since the sample space is infinite) and 2 has a probability of 1 or very close to 1.

For any number k greater than 2 and any number l less than 2,
k>2>l
the probability of k is less than the probability of 2, and the probability of l is less than the probability of 2:
P(k)<P(2)>P(l)

The reason this is possible for an infinite set of numbers is that the decimal numbers between 0 and 1 (the range used to state probabilities) are themselves infinite. Therefore they can be mapped to an infinite set.
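To make the equal-frequency claim concrete, here's a rough Python sketch (illustrative only; the generator, seed, and trial count are made up for the example):

```python
import random

# Fair generator of 1s and 2s, sampled many times; with equal probabilities
# the observed frequencies should settle near 0.5 each.
random.seed(0)  # fixed seed so the run is repeatable
trials = 100_000
counts = {1: 0, 2: 0}
for _ in range(trials):
    counts[random.choice([1, 2])] += 1

freq1 = counts[1] / trials
freq2 = counts[2] / trials
print(freq1, freq2)  # both land close to 0.5
```

Over any finite sample the two frequencies only approximate each other; the point above is that the approximation tightens as the sample grows.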
 
Assuming a non-zero probability of generating 2, and a binary (only 2 possible) set of outcomes of 1 or 2, the probability of only ever generating 1 goes to 0 at infinity. The equation would be p^n, where p is the probability of generating 1 (i.e. of not generating a 2) and n is the number of trials (infinite in this case). The way you would write it is:

lim_{n→∞} p^n

For any p greater than or equal to 0 and less than 1, this limit evaluates to 0.

You can see the concept of this by entering any value greater than or equal to 0 and less than 1 in your calculator then multiplying it by itself and tapping the = key repeatedly. You will see that your number always gets smaller and closer to 0 the more you tap it. This means that if you tap the equals key forever, you get to 0. Of course in reality you can never press that key forever, just like you will never actually get to 0 (or at least not until your calculator rounds down once it reaches its maximum decimal size).
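The calculator experiment can also be sketched in a few lines of Python (the values of p here are just examples of heavily biased generators):

```python
# p^n for a fixed 0 <= p < 1 shrinks toward 0 as n grows -- the
# "tap the = key repeatedly" experiment, done in a loop.
p = 0.99
for n in (10, 100, 1_000, 10_000):
    print(n, p ** n)

# Even a p extremely close to 1 collapses eventually:
print(0.999999 ** 10_000_000)
```

The decay is slower the closer p is to 1, but the limit is 0 for every p strictly below 1.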

Sooo... long answer is no, it's not possible to only get 1 forever. This is actually the same question as the Shakespeare monkey theorem which also has the same answer (except that you asked this one in the negative case).

P.S. Kitfox is just vomiting up his mathematics at you.
 
You know what I always wondered? How does a computer decide on a random number? It's probably really simple, but I could never quite grasp how a machine could do something like that without some sort of tricky coding or something. For that matter, would using a computer be completely random, or is there inevitably going to be a point where it won't generate the same number again because of the way it's coded? This is probably stupid but whatever, I'm intrigued now.
 
You know what I always wondered? How does a computer decide on a random number? It's probably really simple, but I could never quite grasp how a machine could do something like that without some sort of tricky coding or something. For that matter, would using a computer be completely random, or is there inevitably going to be a point where it won't generate the same number again because of the way it's coded? This is probably stupid but whatever, I'm intrigued now.

No computer can generate a truly random number. For that matter, our knowledge of physics would argue that such a thing does not exist; even at the quantum level, the universe has always been deterministic. This has some scary repercussions for the concept of free will. But I say that having your actions fixed and in accordance with the logic of your brain signals, which is in accordance with the laws of physics, is better than having them governed by a kernel of true randomness.

That aside, weak programs simply use algorithms to pick a number using decimals of pi, or a set algorithm for each digit, or something based on the computer clock. The problem with this is that it isn't really random, and somebody can actually figure it out if they know the algorithm, or they guess at the clock time, so you wouldn't encrypt sensitive data using encryption keys generated that way. Stronger random generators like this one, http://www.random.org/, use unpredictable sources like atmospheric noise.
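A tiny Python sketch of that predictability problem (the seed value and key length here are invented for illustration): whoever can guess the seed can regenerate the entire "random" stream.

```python
import random

# Victim seeds a generator from a guessable source (say, the clock)
# and derives a "secret" key from it.
seed = 1_234_567          # pretend this came from the system clock
victim = random.Random(seed)
secret_key = [victim.randrange(256) for _ in range(8)]

# Attacker guesses the same seed and replays the stream.
attacker = random.Random(seed)
guess = [attacker.randrange(256) for _ in range(8)]

print(secret_key == guess)  # True: identical sequences
```

Same seed, same algorithm, same output, which is exactly why clock-seeded generators are unfit for encryption keys.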
 
Dan, KitFox... thanks for the responses :D

Dan... how do you contemplate these things? You're always riding your bike! There's no time for mathematics on your bike!
 
Responding to the topic: the chance of typing only 1s (or only 2s) forever is zero. As it types 1s and 2s continuously, there are still infinitely many digits to be typed in the future. So no matter how many consecutive 1s or 2s have been typed, there are certainly more numbers coming, and the other number ought to be typed eventually. After all, the machine keeps going and going.
 
You're basically talking about coin tosses so instead of 1's and 2's I'll say Heads or Tails.

P(H) = 1/2
P(HH) = 1/4
P(HHH) = 1/8
P(n number of H's) = 1/ (2^n)

So in general it becomes less and less likely that the coin is fair if you get a long sequence of either heads or tails in a row.

Now talking about infinity,

lim_{n→∞} 1/2^n = 0. In layman's terms, that quantity approaches 0 as n goes to infinity, which means that as you make more and more coin flips and they all come up heads, the probability that the coin is still fair approaches 0.

So if you could actually flip it an infinite number of times and they did all come up heads, then you would be forced to conclude that the coin was not fair.
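The halving pattern above, written out as a quick Python check (the helper name is mine):

```python
# P(n fair-coin heads in a row) = 1 / 2**n, matching the table above.
def p_all_heads(n: int) -> float:
    return 1 / 2 ** n

for n in (1, 2, 3, 10):
    print(n, p_all_heads(n))
# p_all_heads(1) -> 0.5, p_all_heads(2) -> 0.25, p_all_heads(3) -> 0.125
```

By n = 100 the probability is already below 10^-30, long before anything resembling infinity.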
 
You know what I always wondered? How does a computer decide on a random number? It's probably really simple, but I could never quite grasp how a machine could do something like that without some sort of tricky coding or something. For that matter, would using a computer be completely random, or is there inevitably going to be a point where it won't generate the same number again because of the way it's coded? This is probably stupid but whatever, I'm intrigued now.

What Dan said, but if you're interested in the source code of a RNG here's one in C# (not mine):

Code:
using System;
namespace SharpNeatLib.Maths
{
  /// <summary>
  /// A fast random number generator for .NET
  /// Colin Green, January 2005
  /// 
  /// September 4th 2005
  ///	 Added NextBytesUnsafe() - commented out by default.
  ///	 Fixed bug in Reinitialise() - y,z and w variables were not being reset.
  /// 
  /// Key points:
  ///  1) Based on a simple and fast xor-shift pseudo random number generator (RNG) specified in: 
  ///  Marsaglia, George. (2003). Xorshift RNGs.
  ///  http://www.jstatsoft.org/v08/i14/xorshift.pdf
  ///  
  ///  This particular implementation of xorshift has a period of 2^128-1. See the above paper to see
  ///  how this can be easily extended if you need a longer period. At the time of writing I could find no 
  ///  information on the period of System.Random for comparison.
  /// 
  ///  2) Faster than System.Random. Up to 8x faster, depending on which methods are called.
  /// 
  ///  3) Direct replacement for System.Random. This class implements all of the methods that System.Random 
  ///  does plus some additional methods. The like named methods are functionally equivalent.
  ///  
  ///  4) Allows fast re-initialisation with a seed, unlike System.Random which accepts a seed at construction
  ///  time which then executes a relatively expensive initialisation routine. This provides a vast speed improvement
  ///  if you need to reset the pseudo-random number sequence many times, e.g. if you want to re-generate the same
  ///  sequence many times. An alternative might be to cache random numbers in an array, but that approach is limited
  ///  by memory capacity and the fact that you may also want a large number of different sequences cached. Each sequence
  ///  can each be represented by a single seed value (int) when using FastRandom.
  ///  
  ///  Notes.
  ///  A further performance improvement can be obtained by declaring local variables as static, thus avoiding 
  ///  re-allocation of variables on each call. However care should be taken if multiple instances of
  ///  FastRandom are in use or if being used in a multi-threaded environment.
  /// 
  /// </summary>
  public class FastRandom
  {
    // The +1 ensures NextDouble doesn't generate 1.0
    const double REAL_UNIT_INT = 1.0 / ((double)int.MaxValue + 1.0);
    const double REAL_UNIT_UINT = 1.0 / ((double)uint.MaxValue + 1.0);
    const uint Y = 842502087, Z = 3579807591, W = 273326509;

    uint x, y, z, w;

    #region Constructors

    /// <summary>
    /// Initialises a new instance using time dependent seed.
    /// </summary>
    public FastRandom()
    {
      // Initialise using the system tick count.
      Reinitialise((int)Environment.TickCount);
    }

    /// <summary>
    /// Initialises a new instance using an int value as seed.
    /// This constructor signature is provided to maintain compatibility with
    /// System.Random
    /// </summary>
    public FastRandom(int seed)
    {
      Reinitialise(seed);
    }

    #endregion

    #region Public Methods [Reinitialisation]

    /// <summary>
    /// Reinitialises using an int value as a seed.
    /// </summary>
    /// <param name="seed"></param>
    public void Reinitialise(int seed)
    {
      // The only stipulation stated for the xorshift RNG is that at least one of
      // the seeds x,y,z,w is non-zero. We fulfill that requirement by only allowing
      // resetting of the x seed
      x = (uint)seed;
      y = Y;
      z = Z;
      w = W;
    }

    #endregion

    #region Public Methods [System.Random functionally equivalent methods]

    /// <summary>
    /// Generates a random int over the range 0 to int.MaxValue-1.
    /// MaxValue is not generated in order to remain functionally equivalent to System.Random.Next().
    /// This does slightly eat into some of the performance gain over System.Random, but not much.
    /// For better performance see:
    /// 
    /// Call NextInt() for an int over the range 0 to int.MaxValue.
    /// 
    /// Call NextUInt() and cast the result to an int to generate an int over the full Int32 value range
    /// including negative values. 
    /// </summary>
    /// <returns></returns>
    public int Next()
    {
      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;
      w = (w ^ (w >> 19)) ^ (t ^ (t >> 8));

      // Handle the special case where the value int.MaxValue is generated. This is outside of 
      // the range of permitted values, so we therefore call Next() to try again.
      uint rtn = w & 0x7FFFFFFF;
      if (rtn == 0x7FFFFFFF)
        return Next();
      return (int)rtn;
    }

    /// <summary>
    /// Generates a random int over the range 0 to upperBound-1, and not including upperBound.
    /// </summary>
    /// <param name="upperBound"></param>
    /// <returns></returns>
    public int Next(int upperBound)
    {
      if (upperBound < 0)
        throw new ArgumentOutOfRangeException("upperBound", upperBound, "upperBound must be >=0");

      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;

      // The explicit int cast before the first multiplication gives better performance.
      // See comments in NextDouble.
      return (int)((REAL_UNIT_INT * (int)(0x7FFFFFFF & (w = (w ^ (w >> 19)) ^ (t ^ (t >> 8))))) * upperBound);
    }

    /// <summary>
    /// Generates a random int over the range lowerBound to upperBound-1, and not including upperBound.
    /// upperBound must be >= lowerBound. lowerBound may be negative.
    /// </summary>
    /// <param name="lowerBound"></param>
    /// <param name="upperBound"></param>
    /// <returns></returns>
    public int Next(int lowerBound, int upperBound)
    {
      if (lowerBound > upperBound)
        throw new ArgumentOutOfRangeException("upperBound", upperBound, "upperBound must be >=lowerBound");

      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;

      // The explicit int cast before the first multiplication gives better performance.
      // See comments in NextDouble.
      int range = upperBound - lowerBound;
      if (range < 0)
      {	// If range is <0 then an overflow has occurred and must resort to using long integer arithmetic instead (slower).
        // We also must use all 32 bits of precision, instead of the normal 31, which again is slower.	
        return lowerBound + (int)((REAL_UNIT_UINT * (double)(w = (w ^ (w >> 19)) ^ (t ^ (t >> 8)))) * (double)((long)upperBound - (long)lowerBound));
      }

      // 31 bits of precision will suffice if range<=int.MaxValue. This allows us to cast to an int and gain
      // a little more performance.
      return lowerBound + (int)((REAL_UNIT_INT * (double)(int)(0x7FFFFFFF & (w = (w ^ (w >> 19)) ^ (t ^ (t >> 8))))) * (double)range);
    }

    /// <summary>
    /// Generates a random double. Values returned are from 0.0 up to but not including 1.0.
    /// </summary>
    /// <returns></returns>
    public double NextDouble()
    {
      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;

      // Here we can gain a 2x speed improvement by generating a value that can be cast to 
      // an int instead of the more easily available uint. If we then explicitly cast to an 
      // int the compiler will then cast the int to a double to perform the multiplication, 
      // this final cast is a lot faster than casting from a uint to a double. The extra cast
      // to an int is very fast (the allocated bits remain the same) and so the overall effect 
      // of the extra cast is a significant performance improvement.
      //
      // Also note that the loss of one bit of precision is equivalent to what occurs within 
      // System.Random.
      return (REAL_UNIT_INT * (int)(0x7FFFFFFF & (w = (w ^ (w >> 19)) ^ (t ^ (t >> 8)))));
    }


    /// <summary>
    /// Fills the provided byte array with random bytes.
    /// This method is functionally equivalent to System.Random.NextBytes(). 
    /// </summary>
    /// <param name="buffer"></param>
    public void NextBytes(byte[] buffer)
    {
      // Fill up the bulk of the buffer in chunks of 4 bytes at a time.
      uint x = this.x, y = this.y, z = this.z, w = this.w;
      int i = 0;
      uint t;
      for (int bound = buffer.Length - 3; i < bound; )
      {
        // Generate 4 bytes. 
        // Increased performance is achieved by generating 4 random bytes per loop.
        // Also note that no mask needs to be applied to zero out the higher order bytes before
        // casting because the cast ignores those bytes. Thanks to Stefan Troschütz for pointing this out.
        t = (x ^ (x << 11));
        x = y; y = z; z = w;
        w = (w ^ (w >> 19)) ^ (t ^ (t >> 8));

        buffer[i++] = (byte)w;
        buffer[i++] = (byte)(w >> 8);
        buffer[i++] = (byte)(w >> 16);
        buffer[i++] = (byte)(w >> 24);
      }

      // Fill up any remaining bytes in the buffer.
      if (i < buffer.Length)
      {
        // Generate 4 bytes.
        t = (x ^ (x << 11));
        x = y; y = z; z = w;
        w = (w ^ (w >> 19)) ^ (t ^ (t >> 8));

        buffer[i++] = (byte)w;
        if (i < buffer.Length)
        {
          buffer[i++] = (byte)(w >> 8);
          if (i < buffer.Length)
          {
            buffer[i++] = (byte)(w >> 16);
            if (i < buffer.Length)
            {
              buffer[i] = (byte)(w >> 24);
            }
          }
        }
      }
      this.x = x; this.y = y; this.z = z; this.w = w;
    }


    //		/// <summary>
    //		/// A version of NextBytes that uses a pointer to set 4 bytes of the byte buffer in one operation
    //		/// thus providing a nice speedup. The loop is also partially unrolled to allow out-of-order-execution,
    //		/// this results in about a x2 speedup on an AMD Athlon. Thus performance may vary wildly on different CPUs
    //		/// depending on the number of execution units available.
    //		/// 
    //		/// Another significant speedup is obtained by setting the 4 bytes by indexing pDWord (e.g. pDWord[i++]=w)
    //		/// instead of adjusting it dereferencing it (e.g. *pDWord++=w).
    //		/// 
    //		/// Note that this routine requires the unsafe compilation flag to be specified and so is commented out by default.
    //		/// </summary>
    //		/// <param name="buffer"></param>
    //		public unsafe void NextBytesUnsafe(byte[] buffer)
    //		{
    //			if(buffer.Length % 8 != 0)
    //				throw new ArgumentException("Buffer length must be divisible by 8", "buffer");
    //
    //			uint x=this.x, y=this.y, z=this.z, w=this.w;
    //			
    //			fixed(byte* pByte0 = buffer)
    //			{
    //				uint* pDWord = (uint*)pByte0;
    //				for(int i=0, len=buffer.Length>>2; i < len; i+=2) 
    //				{
    //					uint t=(x^(x<<11));
    //					x=y; y=z; z=w;
    //					pDWord[i] = w = (w^(w>>19))^(t^(t>>8));
    //
    //					t=(x^(x<<11));
    //					x=y; y=z; z=w;
    //					pDWord[i+1] = w = (w^(w>>19))^(t^(t>>8));
    //				}
    //			}
    //
    //			this.x=x; this.y=y; this.z=z; this.w=w;
    //		}

    #endregion

    #region Public Methods [Methods not present on System.Random]

    /// <summary>
    /// Generates a uint. Values returned are over the full range of a uint, 
    /// uint.MinValue to uint.MaxValue, inclusive.
    /// 
    /// This is the fastest method for generating a single random number because the underlying
    /// random number generator algorithm generates 32 random bits that can be cast directly to 
    /// a uint.
    /// </summary>
    /// <returns></returns>
    public uint NextUInt()
    {
      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;
      return (w = (w ^ (w >> 19)) ^ (t ^ (t >> 8)));
    }

    /// <summary>
    /// Generates a random int over the range 0 to int.MaxValue, inclusive. 
    /// This method differs from Next() only in that the range is 0 to int.MaxValue
    /// and not 0 to int.MaxValue-1.
    /// 
    /// The slight difference in range means this method is slightly faster than Next()
    /// but is not functionally equivalent to System.Random.Next().
    /// </summary>
    /// <returns></returns>
    public int NextInt()
    {
      uint t = (x ^ (x << 11));
      x = y; y = z; z = w;
      return (int)(0x7FFFFFFF & (w = (w ^ (w >> 19)) ^ (t ^ (t >> 8))));
    }


    // Buffer 32 bits in bitBuffer, return 1 at a time, keep track of how many have been returned
    // with bitBufferIdx.
    uint bitBuffer;
    uint bitMask = 1;

    /// <summary>
    /// Generates a single random bit.
    /// This method's performance is improved by generating 32 bits in one operation and storing them
    /// ready for future calls.
    /// </summary>
    /// <returns></returns>
    public bool NextBool()
    {
      if (bitMask == 1)
      {
        // Generate 32 more bits.
        uint t = (x ^ (x << 11));
        x = y; y = z; z = w;
        bitBuffer = w = (w ^ (w >> 19)) ^ (t ^ (t >> 8));

        // Reset the bitMask that tells us which bit to read next.
        bitMask = 0x80000000;
        return (bitBuffer & bitMask) == 0;
      }

      return (bitBuffer & (bitMask >>= 1)) == 0;
    }

    #endregion
  }
}
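For anyone who doesn't read C#, the core xorshift update in FastRandom boils down to a few lines. Here's a rough Python transcription (my own sketch; the masking stands in for C#'s 32-bit unsigned wraparound):

```python
# One step of the xorshift128 generator: same shift constants (11, 19, 8)
# and state rotation (x <- y, y <- z, z <- w) as the Next()/NextUInt()
# methods in the C# class above.
MASK = 0xFFFFFFFF  # emulate 32-bit uint arithmetic

def xorshift128_step(state):
    x, y, z, w = state
    t = (x ^ (x << 11)) & MASK
    new_w = ((w ^ (w >> 19)) ^ (t ^ (t >> 8))) & MASK
    return (y, z, w, new_w), new_w  # (new state, 32-bit output)

# Seed x with an arbitrary value; the other three match the class's
# Y, Z, W constants, as in Reinitialise().
state = (123, 842502087, 3579807591, 273326509)
for _ in range(5):
    state, out = xorshift128_step(state)
    print(out)
```

Notice there's nothing random inside: the same four starting words always produce the same stream, which is the whole point of the "no computer can generate a truly random number" posts above.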
 
I voted yes.

But I think I'm the hopeless one really
 
No computer can generate a truly random number. For that matter, our knowledge of physics would argue that such a thing does not exist; even at the quantum level, the universe has always been deterministic.

In a macroscopic sense, determinism is true. However, a lot of things in actual physics are pretty random, such as the tunneling effect or the decay of a single radioactive nucleus. Even if the experimental conditions are ideally the same, the results of certain quantum effects come out different every time. There is no way one can predict the result; it can only be approximated by probability. Intrinsic randomness dominates the world, even before accounting for the Heisenberg uncertainty principle.

One of these experiments is the generation of an interference pattern by electron scattering. There is no way one can know where an electron is going to land. Only after we have sent a sufficiently large number of electrons can the interference fringes be predicted.

If we scale that probability up from the microscopic level, much like the butterfly effect, free will can be explained; probably not exactly scientifically, but certainly philosophically.
 
Looking at microscopic states from a macroscopic point of view, it certainly seems random, and it is complex enough that we can only predict general trends on a scale several orders of magnitude larger than the events themselves. But cause and effect still apply to particles, and future states of particular particles depend on past states of the system. If one had all of the information about all of the particles, and all of the governing relationships, one could predict the future of every particle, and thus the future of any system derived from those particles and their states. Of course we don't have the capacity to learn, store, manage, or comprehend that amount of data, but the principle is there that everything is determined. Free will is just a macroscopic demonstration of a system that is too complex to predict. The mind is like a leverage mechanism for very small things to influence and exert actions on a very large scale in a coherent and meaningful way.
 
Really?

Show me, with all ideal means of measurement, how you would predict when a single plutonium-238 nucleus, freshly produced by the beta decay of neptunium-238, will decay, given its half-life of 88 years?
 
/ignoring responses to thread

You've misinterpreted the "anything is possible" argument I was making.

I meant that all possible results will occur.

The keyword that answers this question though is "will". Over infinity, the numbers 1 and 2 will each occur infinitely many times.
 
Really?

Show me, with all ideal means of measurement, how you would predict when a single plutonium-238 nucleus, freshly produced by the beta decay of neptunium-238, will decay, given its half-life of 88 years?

As I said, we don't have the means for that kind of measurement. But theoretically, if you had the data about the surrounding atoms, and the location and position of the electrons, you could use the strong and weak nuclear forces to model all of the particle interactions in that particle and in the surrounding particles, and figure out at what point, for a particular particle, the energy threshold is crossed where the isotope collapses.

Just because we can't determine it does not mean that it is random in absolute terms, just random relative to our available information.
It's the same way that a horse race is completely random to spectators just looking at the names, but to someone who knows the horses or may have fixed the race, the probabilities are different, or the result is certain. The actual result itself is perfectly determinate; probabilities only exist for people who don't have complete information or accurate models. For an idiot who has no memory and no concept of his surroundings, every moment would seem to be a random input of sensory data with no logic or relationships.
 
Correct me if I'm wrong, but...
...location and position of the electrons...
Isn't knowledge of that forbidden by the physical laws? There is no practical, but more importantly no theoretical way of knowing the information required to infer future states. Almost-truly random?
 
The concept of infinity is imaginary. Scientists now believe even the universe is not quite infinite; it only seems infinite to our puny eyes. An example of imaginary numbers would be the square root of any negative number, and as such, it can only be described symbolically. So, to answer the original topic, imagine the formula infinity to the infinite power: ∞^∞.

It is only theoretical because it cannot be proven.
 
That aside, weak programs simply use algorithms to pick a number using decimals of pi, or a set algorithm for each digit, or something based on the computer clock. The problem with this is that it isn't really random, and somebody can actually figure it out if they know the algorithm, or they guess at the clock time, so you wouldn't encrypt sensitive data using encryption keys generated that way.
So what you're saying is, I can win epics on WoW if I screw with my computer clock?

Seriously though, thanks for the answer. :D
 
/ignoring responses to thread

You've misinterpreted the "anything is possible" argument I was making.

I meant that all possible results will occur.

The keyword that answers this question though is "will". Over infinity, the numbers 1 and 2 will each occur infinitely many times.

Uhhh, I have no idea what you're talking about. I didn't make this thread because of you!


Oh and I'm finding the discussion in this thread very interesting.
 
As I said, we don't have the means for that kind of measurement. But theoretically if you had the data about the surrounding atoms, and the location and position of the electrons, you can use the strong and weak nuclear forces to model all of the particle interactions in that particle, and in the surrounding particles, and figure out at what point for a particular particle that the energy level threshold is crossed where the isotope collapses.

Spontaneous nuclear decay is not affected by external factors of any kind, including air pressure, temperature, and the surrounding species of subatomic particles. Nuclear decay is not affected by any interaction with the surroundings. The nucleus simply decays whenever it likes. At the half-life, there is a 50% chance the nucleus has decayed; but there is no predicting whether the nucleus will decay or not, no matter how ideal your measurement is.

From a macroscopic point of view, or in a statistical-physics sense, your second paragraph is very true. Yet, when we come down to the size of an atom, substances no longer appear as solid bodies or discrete particles. I.e., when a particle collides with a wall head-on, it is not necessary for the particle to bounce back as predicted by classical physics. Rather, quantum objects are a mist of uncertainty dominated by probability, mostly involving wave-particle duality.

Electron diffraction is another good example. If the path of an electron were so deterministic, electrons emitted in the same pattern would definitely land on the same spot. In actuality, a diffraction pattern of bright and dark fringes is generated. This means the path of an electron is governed by a series of probable and improbable paths. That is the expression of randomness.
 
You know what I always wondered? How does a computer decide on a random number?
http://en.wikipedia.org/wiki/Pseudorandom_number_generator

So what you're saying is, I can win epics on WoW if I screw with my computer clock?
http://www.cigital.com/papers/download/developer_gambling.php

That is a link to a bunch of guys who found a way to cheat in online poker by exploiting its use of a random number generator. Scroll down to "How Pseudo-Random Number Generators Work" for the most interesting bit.
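The core idea of that exploit can be sketched in Python (everything here, from the seed range to the card draws, is invented for illustration): when the seed space is small, a handful of observed outputs pins down the seed, and with it every future output.

```python
import random

# A "shuffle" driven by a seed drawn from a tiny space (like a
# millisecond clock value). An observer sees a few draws...
true_seed = 4721
rng = random.Random(true_seed)
observed = [rng.randrange(52) for _ in range(5)]

# ...then brute-forces every candidate seed until one reproduces them.
recovered = None
for seed in range(10_000):
    cand = random.Random(seed)
    if [cand.randrange(52) for _ in range(5)] == observed:
        recovered = seed
        break

print(recovered)  # almost certainly 4721 -- the attacker now owns the stream
```

The poker exploit in the link worked the same way, except the seed was the server's clock, so the search space was even smaller than this toy example.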

If one had all of the information about all of the particles, and all of the governing relationships, you could predict the future of every particle, and thus the future of any systems derived from those particles and their states.
There is no evidence, at the moment anyway, to suggest a deterministic universe "underneath" quantum mechanics.
 
I was under the impression that a quantum physics universe involved true randomness of electrons.
 
I don't believe that probability clouds of electron position are truly random, they simply come out random relative to our ability to measure and quantify the particular forces at work on that scale. Each electron in the universe is influencing every other electron in the universe to some degree as well as innumerable other factors. These background noise factors are what give electrons their precise location and momentum. On that note, the Heisenberg Uncertainty Principle doesn't state that it is impossible to know the position and momentum of a particular electron, only that it is impossible to measure both, because in taking a measurement, the probe will affect the value being measured. Now because of entanglement, you would essentially have to know the sum knowledge of the entire universe to be able to precisely predict the exact actions of a single electron. For our purposes we can say this is "impossible" to do, but the fact remains that it is still determined by the past.
 
Still doesn't quite explain how a particle can appear magically behind a barrier through quantum tunneling. I think.
 

So even the top physicists are debating the issue. It seems we have no way to convince each other here with our limited knowledge. Without ample new experimental results, the debate over certainty and uncertainty will remain a philosophical one, not a scientific one.

:p

Newscientist said:
The debate over whether the universe is random or deterministic is not likely to end before such experiments become possible. That won't stop physicists and philosophers from continuing to examine whether or not the logical structure of quantum theory demands randomness, or might instead rest on some deeper deterministic layer.

/thread
 
I think I was the one who brought up the infinite-time-and-monkeys argument in that thread, Raz. You got shot down pretty quick so I didn't reply, because everything had already been said.

Something else I wanted to say though: there are other similar interesting examples. Walking through a wall is possible, just extremely unlikely, as is a concrete statue waving at you. They're just different versions of the monkeys writing Shakespeare. All the atoms in a statue could move one way and then the other in an extremely unlikely scenario. It's not impossible whatsoever, but if you want a sense of how unlikely it is, Richard Dawkins says: "If you had started writing zeros since the start of the universe and were still going now, you still wouldn't have written near enough".
 
Assuming a non-zero probability of generating 2, and a binary (only 2 possible) set of outcomes of 1 or 2, the probability of generating nothing but 1s goes to 0 as the number of trials goes to infinity. The quantity is p^n, where p is the probability of generating a 1 (i.e., not generating a 2) and n is the number of trials. The way you would write it is:
lim (n → ∞) p^n

For any p greater than or equal to 0 and less than 1, this limit evaluates to 0.

You can see the concept of this by entering any value greater than or equal to 0 and less than 1 in your calculator then multiplying it by itself and tapping the = key repeatedly. You will see that your number always gets smaller and closer to 0 the more you tap it. This means that if you tap the equals key forever, you get to 0. Of course in reality you can never press that key forever, just like you will never actually get to 0 (or at least not until your calculator rounds down once it reaches its maximum decimal size).
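The calculator experiment above is easy to sketch in code. This just multiplies p by itself repeatedly (here p = 0.5, a fair 1-or-2 generator) to show how fast p^n collapses toward 0:

```python
# Probability of getting "1" on every one of n independent draws is p**n.
# Watch it shrink toward 0 as n grows.
p = 0.5
prob = 1.0
for n in range(60):  # 60 draws in a row
    prob *= p

print(prob)  # about 8.7e-19 after 60 draws; the limit as n -> infinity is 0
```

Sixty coin flips already put the "all 1s" outcome below one chance in a quintillion, and the trend never reverses.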

Sooo... long answer is no, it's not possible to only get 1 forever. This is actually the same question as the Shakespeare monkey theorem which also has the same answer (except that you asked this one in the negative case).

P.S. Kitfox is just vomiting up his mathematics at you.

Sir are you questioning my integrity?

(i have none)
 
This topic has gone way over my head.
 
An interesting problem involving infinity, probability and randomness:

Suppose a casino is holding a gambling game that goes as follows:

The house puts one dollar in the pot and flips a coin. If the coin comes up heads, the house doubles the money in the pot and flips again. They continue this process until they flip a tails... at which point they give you the money in the pot.

Question: How much money should you be willing to spend to play this game?

Most people wouldn't bet much more than a dollar or two, because the probability of getting a large sum of money is so small.

However, mathematically, you should be willing to bet quite a lot. In fact, the expected payoff of this game is infinite.

The expected value of the amount of money you win is:
E(X) = Sum(n=0 to infinity of 2^n * (1/2^(n+1))) = Sum(n=0 to infinity of 1/2)
Every term in the sum is 1/2, and there are infinitely many of them, so the sum diverges to infinity.

You see, the payoffs grow exactly as fast as their probabilities shrink, so even the vanishingly unlikely jackpots keep adding to the expected value.

Infinite values tend to royally screw with probability and expected value.
 