The all-time machine

Dingo

Temporal Novice
This is an idea for a machine that, given all the information about the world and a logical progression system, would be able to accurately predict every event that will ever happen. This is because as you move up in the level of information gathered, the window of error becomes smaller and smaller. Even quantum fluctuations could be mapped with enough information.

The major problem is that said machine can't account for two things:

1. The power it uses (computing the energy it has already used takes more energy, which then has to be accounted for as well, and so on).

2. The results it gives (every result it reports alters the present, thus nullifying all its readings).
 
There is another problem besides the ones you state, and it has to do with information...

This is because as you move up in the level of information gathered, the window of error becomes smaller and smaller. Even quantum fluctuations could be mapped with enough information.

This is not generally true. What makes it untrue is what is known as the Heisenberg Uncertainty Principle, which says you cannot have arbitrarily accurate information for all measurements at once. It specifically states that the more accurately you measure a particle's momentum, the less accurately you can know its position. From what we know of quantum theory at the current time, you would not even be able to measure enough information to map, with certainty, even a single atom... much less everything else in the universe.
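To put a rough (and purely illustrative) number on that, here is a quick sketch using the usual bound Δx·Δp ≥ ħ/2. The electron speed and the 1% momentum accuracy below are numbers I made up just to show the scale of the problem:

```python
# Rough sketch of the Heisenberg bound: delta_x * delta_p >= hbar / 2.
# The speed and the 1% accuracy are illustrative assumptions, nothing special.
HBAR = 1.054571817e-34       # J*s
M_ELECTRON = 9.1093837015e-31  # kg

v = 1.0e6           # assume an electron moving at ~10^6 m/s
p = M_ELECTRON * v  # its momentum
dp = 0.01 * p       # suppose we measure that momentum to within 1%

dx_min = HBAR / (2 * dp)  # best possible position uncertainty
print(f"momentum p       = {p:.3e} kg*m/s")
print(f"momentum spread  = {dp:.3e} kg*m/s")
print(f"minimum delta_x  = {dx_min:.3e} m  (~{dx_min / 1e-10:.1f} angstroms)")
```

Even with a fairly generous 1% momentum measurement, the best possible position uncertainty comes out to tens of angstroms, i.e. many atomic diameters, which is why mapping even one atom "with certainty" is off the table.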

RMT
 
The keyword there is "at this current time." As time progresses, so does our understanding of the world we live in. Also, considering that we have a mathematical measure to accurately find a particle's momentum (x'', or d²x/dt²) and its momentum (the double primitive of momentum, finding both C's), that aspect seems less likely to be a problem.
 
Interesting post, Dingo. You're talking about something that I hope will happen someday. I think it would usher in a new era of honesty and decency.


1. I'm not sure how you arrive at your power estimate. Power consumption per MIPS goes down over time, not up. So eventually we'll have the power.

2. Yep, that's a problem. The first thing I'd do is look at myself a minute from now, and then try to do something different from what I saw.

The HUP is very interesting; I find it very counterintuitive, and it bugs me. Maybe when we can find something that dark matter interacts with, we can use what we learn from that to measure normal matter to perfect accuracy? The HUP is supposed to be independent of the observer effect, but I haven't heard much about how they've shown this to be true.
 
The keyword there is "at this current time." As time progresses, so does our understanding of the world we live in.

Touché, but this assumption tends to ignore the Second Law of Thermodynamics. As time progresses, the nature of energy changes (entropy increases). It is my belief that we are fooling ourselves when we say our "understandings" increase. What I believe most people mean is that our LINEAR understandings increase (i.e. how one or more causes lead to one or more effects). But energy is a "less than zero sum game", so whatever increase in understanding we achieve through a linear-reductionist model, we are merely ignorant (either purposefully or not) of the greatly increased uncertainty with respect to higher order (higher derivative) effects. This is why I say I think we are fooling ourselves.

Also, considering that we have a mathematical measure to accurately find a particle's momentum (x'', or d²x/dt²) and its momentum (the double primitive of momentum, finding both C's), that aspect seems less likely to be a problem.

I don't understand. Was there a typo there? "momentum and its momentum"? As far as I am aware, no one has offered up a closed-form solution for the classic N-body problem. As I recall, it gets awfully ugly, awfully quickly once you get beyond 3 bodies. :)
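And even numerically, the ugliness shows up fast. Here is a little toy of my own (plain Newtonian gravity in arbitrary units, with masses, positions, and velocities I simply made up) showing how nudging one body's starting position by a billionth of a unit still produces a measurable drift in where the bodies end up:

```python
# Toy planar 3-body problem (velocity-Verlet / leapfrog integrator).
# Masses, positions, and velocities are made up purely to illustrate
# sensitivity to initial conditions; this is not a real system.
import numpy as np

G = 1.0  # arbitrary units

def accelerations(pos, mass):
    """Newtonian gravitational acceleration on each body from the others."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i == j:
                continue
            r = pos[j] - pos[i]
            d2 = r @ r + 1e-6  # tiny softening so this toy stays numerically tame
            acc[i] += G * mass[j] * r / d2 ** 1.5
    return acc

def run(pos, vel, mass, dt=1e-3, steps=10000):
    """Integrate forward and return the final positions."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos

mass = np.array([1.0, 1.0, 1.0])
pos0 = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
vel0 = np.array([[0.0, -0.3], [0.0, 0.3], [0.3, 0.0]])

nudge = 1e-9            # move one body by a billionth of a unit
pos1 = pos0.copy()
pos1[2, 0] += nudge

drift = np.linalg.norm(run(pos0, vel0, mass) - run(pos1, vel0, mass))
print(f"final separation between the two runs: {drift:.3e}")
print(f"amplification of the initial nudge:    {drift / nudge:.1f}x")
```

That sensitivity to initial conditions is exactly why "measure everything, then extrapolate forever" is such a tall order, even before the HUP gets involved.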

I believe once you study Taylor series "approximations" and how they relate to integral and differential math, you come to realize that the universe has infinite states. Any attempt to decompose, and therefore quantify any number of those states, by necessity, leaves out other higher order states. To me it becomes a bit like traveling at the speed of light and the mass-energy conundrum wrapped therein.
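For anyone who hasn't played with this, here is a small toy (sin(x) chosen purely for familiarity) showing what dropping the higher-order terms does: every truncation looks great near the expansion point and falls apart as you move away from it, which is the sort of thing I mean by leaving out higher order states:

```python
# Toy illustration: truncated Taylor series of sin(x) about 0.
# Each truncation is fine near x = 0 and falls apart farther out.
import math

def sin_taylor(x, terms):
    """Sum of the first `terms` nonzero Taylor terms of sin(x) about 0."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
    return total

for x in (0.5, 2.0, 6.0):
    exact = math.sin(x)
    for terms in (2, 4, 8):
        approx = sin_taylor(x, terms)
        print(f"x={x:4.1f} terms={terms}  approx={approx:+.6f}  "
              f"error={abs(approx - exact):.2e}")
```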

While I believe we are approaching a time (no pun intended) where we will achieve a fully-integrated knowledge of Massive SpaceTime (which may "solve" the HUP), I do not believe that you can decompose such an "integrated" knowledge into component parts and make it useful without incurring the "debt" of uncertainty. In this manner, I believe such integrated knowledge can and will only be useful in a trans-physical (or meta-physical) realm. It is the knowledge that elevates us to our level of spirit, as a fully-integrated energy entity. Undivided... but also unknowable in the classical, meat-puppet sense.


RMT
 
Any attempt to decompose, and therefore quantify any number of those states, by necessity, leaves out other higher order states.

So an "all-time machine" won't work but my question would then be, could people's quantification efforts get "close enough" to make something cool?

http://en.wikipedia.org/wiki/Minimum_description_length

"Given the probability P(S) of a string S occurring, the optimal codelength for S is -Log2(P(S)) bits."

We can't compress data as small as we'd like, but what limited compression we can do, does help in some cases.

But I can't ignore that data from /dev/random ends up compressing at about 1:1. That is the best definition of random data there is, I think. The closer you get to 1:1 compression, the more "random" the data is.
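A quick way to see both halves of that at once (my own toy, using zlib and os.urandom as a stand-in for /dev/random; exact ratios will vary): repetitive data shrinks enormously, random bytes stay at roughly 1:1, and the -log2(P) rule says that's as good as it can get for a uniformly random source:

```python
# Toy check: compressibility as a rough randomness gauge.
import math
import os
import zlib

def ratio(data: bytes) -> float:
    """Compressed size / original size (close to 1.0 means incompressible)."""
    return len(zlib.compress(data, 9)) / len(data)

structured = b"the all-time machine " * 5000   # highly repetitive text
random_ish = os.urandom(len(structured))       # stand-in for /dev/random

print(f"repetitive text : {ratio(structured):.3f}")
print(f"os.urandom bytes: {ratio(random_ish):.3f}")

# Shannon/MDL view: a symbol with probability p deserves -log2(p) bits,
# so a uniformly random byte (p = 1/256) can't be coded in fewer than 8 bits.
print(f"optimal code length for a uniform random byte: {-math.log2(1 / 256):.0f} bits")
```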
 
So an "all-time machine" won't work but my question would then be, could people's quantification efforts get "close enough" to make something cool?

Most certainly!!! But the caveat is that one must be willing to accept that there can (and likely will) occur some events in the "simulation" that do not occur in the "real thing". "Glitch in The Matrix" sort of thing. However, it may (and does) manifest as MUCH more than a "glitch"... so much so that it may cause an analyst to question the foundations of "reality".

This happens all the time in aerodynamic simulations, which are a good example. The usefulness of a linear aerodynamic model (in which only the first-order aero derivatives are used to model airplane behavior) is very good... as long as the airplane is flying under flight conditions where the linear assumption remains valid. HOWEVER, if you were to compare the REAL airplane's dynamics to the simulation's dynamics once the flight condition parameters depart from the range of validity of the linear aerodynamic model, the resulting airplane motions in the simulator and in the real airplane can be HUGELY different.
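A stripped-down cartoon of that idea (not a real aero model; the lift-curve slope, stall angle, and post-stall drop-off below are all numbers I made up): the linear model and the "real" curve agree nicely below stall and part ways badly above it:

```python
# Cartoon of a linear model vs. "reality" outside its range of validity.
# The "real" lift curve here is invented (it stalls past ~15 degrees);
# the linear model is its small-angle approximation.
import math

A = 0.11             # assumed lift-curve slope per degree (illustrative)
ALPHA_STALL = 15.0   # made-up stall angle, degrees

def cl_linear(alpha_deg):
    """Linear model: lift coefficient proportional to angle of attack."""
    return A * alpha_deg

def cl_real(alpha_deg):
    """Toy 'real' behavior: linear at small angles, lift collapses past stall."""
    if alpha_deg <= ALPHA_STALL:
        return A * alpha_deg
    # crude post-stall drop-off
    return A * ALPHA_STALL * math.exp(-(alpha_deg - ALPHA_STALL) / 5.0)

for alpha in (2, 8, 14, 18, 25):
    lin, real = cl_linear(alpha), cl_real(alpha)
    print(f"alpha={alpha:2d} deg  linear CL={lin:.2f}  'real' CL={real:.2f}  "
          f"difference={abs(lin - real):.2f}")
```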

So it is still a "cool model", but its representation of "reality" is highly questionable when you stray outside the realm of approximation. That's all I'm sayin! :D

RMT
 