Temporal Encoding/Decoding Mechanism

Sounds like it’d be easier to translate it from a binary data structure to a ternary data structure.


Interesting… will check later… :+1:


So I didn’t include the lock-key and prediction mechanisms in the video example… I think those can be discussed and incorporated later, once we’re sure whether it’s even possible… It didn’t turn out how I’d imagined, since I had very little time, but this should give you some idea of what I’m trying to do…

Note: The last structure (in this case the Rubik’s cube combination) does not store the last timestamp and the last bit of data entered… they’re stored separately, and without them the cube’s combination has no meaning…


Ohhhh, now I get it. You want a bijective function of type N × N ↔ N (Edit: or maybe between two-dimensional countable sets and one uncountable set). I might be able to dig up a solid proof of whether what you want is possible or impossible.

Edit: I just realized you’ve given me a real challenge here. This is not going to be easy.

stuff for me to look up (expanding list):

  • Hilbert curve
  • parametric curves
  • bijective functions between different dimensions
  • reversible logic
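On the “bijective functions between different dimensions” item: for the purely countable case a bijection N × N ↔ N definitely exists, and the Cantor pairing function is the textbook example. A quick sketch in Python:

```python
import math

def pair(x: int, y: int) -> int:
    """Cantor pairing: maps (x, y) in N x N to a unique n in N."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(n: int) -> tuple[int, int]:
    """Inverse of pair(): recovers (x, y) from n."""
    w = (math.isqrt(8 * n + 1) - 1) // 2   # largest w with w*(w+1)/2 <= n
    t = w * (w + 1) // 2
    y = n - t
    x = w - y
    return x, y

# Round trip: every (x, y) encodes to a distinct n and decodes back.
assert all(unpair(pair(x, y)) == (x, y) for x in range(50) for y in range(50))
```

So two countable dimensions collapse into one losslessly; the hard part of the thread’s question is whether anything similar survives once the state space is finite.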

Cool… I’m excited even if it’s not possible… Thanks… :+1:

Great… Looks like you’ve figured out what I mean here… If it has some substance, or is worthy of conjecture, then maybe we can ask more people here… I have a few more examples I’ll add later.

I think I have a proof that it is not possible with a finite (in the mathematical sense) amount of storage, which would rule out all known (discrete) storage mechanisms. That leaves open some quantum-physics options, but the no-cloning theorem would limit how useful those are (apart from all the other practical hurdles).

I’ll show the proof to someone who is better with maths than I am, to find possible flaws.
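For what it’s worth, the pigeonhole core of such a proof is easy to demonstrate in code: a deterministic system with N distinguishable states can separate at most N input histories, so feeding in more than log2(N) bits forces collisions. A toy sketch (the update rule is an arbitrary stand-in, not anyone’s actual proposal):

```python
from itertools import product

N_STATES = 8  # a toy system with 8 distinguishable states

def evolve(state: int, bit: int) -> int:
    """Some fixed deterministic update rule; its details don't matter."""
    return (state * 2 + bit + 1) % N_STATES

# Feed in every possible 4-bit history (16 of them) from state 0:
final = {}
for bits in product((0, 1), repeat=4):
    s = 0
    for b in bits:
        s = evolve(s, b)
    final.setdefault(s, []).append(bits)

# 16 histories land in at most 8 final states, so by pigeonhole at least
# two different histories must end in the same state, whatever the rule is.
collisions = {s: hs for s, hs in final.items() if len(hs) > 1}
assert collisions  # the final state alone cannot tell these histories apart
```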


I found this from your keyword reversible logic.

Somehow I didn’t find it myself before… I’m never sure what to look for… I’m interested in computer science but I never studied it. That’s probably why. :thinking:

I usually get way ahead of myself and think in all sorts of abstractions… Last year I even conceptualized a completely new kind of processor architecture around time encoding… I also conceptualized semantics and predictive algorithms to validate information by reverse computation. That’s how I came to storing in binary: it seemed easier to predict, and every few iterations we can verify the bias in the predicted information with the help of semantics…

Anyway, I have a suggestion… Don’t think of it as storing information; think of it as the smallest possible information needed to predict or decode, like a compression encoding… I’m trying to get around the limits by predicting and validating with semantics… Predictive algorithms are the subset of intelligence I was talking about in the previous thread… I think the key here is using time to encode, which gives an extra variable to help predict…

I’ll have a much better analogy in the next reply…


I am not really trying to think of it as storing information; I just named it that way because that is what it essentially does. I am trying to find the mathematical entity that fully describes your model (so your example should be a subset of it). I am using sets, tuples, fields, functions… This is the only way to construct a proof without building everything from the ground up.

The problem with the Rubik’s cube example is that it is essentially a surjective function: when encoding into a finite space, you either lose information or have to store more significant digits to compensate. You would need a bijection to be able to unambiguously reconstruct the former state of your set. Unfortunately, bijections are not possible between finite sets of different sizes.

I am sorry, it is really hard for me to translate into English and make the mathematical concepts understandable. It must be hard to parse what I mean. I’ll try to think of an example to illustrate.


No worries… Take your time, you’re already a great help… I’ll try to figure it out myself too… I’ll pick up on your keywords… You can give me directions on what to look for…

Also, decks of cards or chess games have very large numbers of combinations… Do they not work for our purpose?


In the meantime, maybe this can help to illustrate:

In the first image of that Wikipedia page you see the two sets X and Y. Think of the numbers in X as bits of information that get encoded into the letters in Y. Now you add more bits to store in X that get encoded into Y. This is possible; however, you now have more bits that “point” to the same finite set Y, like 3 and 4 both pointing to C. There is no unique way back to 3 or 4; the distinction is lost. Like the inverse of a function that can have multiple crossings of the x-axis.
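The X → Y picture is easy to play with in code. A minimal sketch of such a non-injective encoding (the particular numbers and letters are my own stand-ins), where building the “inverse” exposes the ambiguity directly:

```python
# A toy version of the X -> Y picture: two inputs collide on 'C',
# so the encoding is surjective onto {'A', 'B', 'C'} but not injective.
encode = {1: 'A', 2: 'B', 3: 'C', 4: 'C'}

# Invert the mapping; every ambiguity shows up as a multi-element list.
decode = {}
for x, y in encode.items():
    decode.setdefault(y, []).append(x)

print(decode['C'])  # prints [3, 4]: no unique way back, the distinction is lost
```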

Or to think of your universe example from further up the thread:

Let’s say you are at the train station at 12 o’clock and I meet you there and observe that you are there and who else is there. This is one state at one instant in time. From this information alone I cannot reconstruct how you got to the train station. Did you arrive by train, by car, on foot, or by bicycle? I do not know; the state itself does not contain that information any more. Let’s say I manipulate further events and thereby “encode” information by choosing the bicycle over going on foot for my way home. The fact that I am at home afterwards, again, does not contain the information of how I got there. I would have to remember that fact and therefore store the information elsewhere.

Unfortunately, any amount short of infinity is not enough. Even countably infinite is not enough.


Thoughts that came up while reading this:
Do cellular automata or Conway’s Game of Life help with this?
(Conway’s Game of Life, because it is an instructive set of rules for a 2D cellular automaton.)

Going back to the original question: you ask if you can store data in a limited physical object by observing/tracking its (iterative, because discrete, because computers) evolution over time.

You require limited space and ask for potentially unlimited data capacity.

Let’s make things simple: a 1D cellular automaton with 4 cells:
| 0 | 0 | 0 | 0 |

This is one possible initial state. It will always have 4 cells, so it does not grow physically.
Now how can we use it to represent information storage?
Let’s assume each cell contains one bit (because computers), although we could just as well use one byte, a float, or heck, even one hard disk full of digits per cell; it doesn’t matter.
The thing is: it can only store 4 × cell-content bits of data per time step.
Here is a possible evolution over time:
| 1 | 0 | 0 | 0 | t = 0
| 0 | 1 | 0 | 0 | t = 1
| 0 | 0 | 1 | 0 | t = 2
| 0 | 0 | 0 | 1 | t = 3
| 0 | 0 | 1 | 0 | t = 4
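Incidentally, this table already demonstrates the loss: the states at t = 2 and t = 4 are identical, yet their successors differ, so the next state cannot be a function of the 4 visible cells alone. A sketch that reproduces the table; the hidden direction flag is my addition, and it is exactly the extra information the cells don’t show:

```python
def step(pos: int, direction: int, n: int = 4) -> tuple[int, int]:
    """Advance the bouncing bit one cell; reverse direction at the walls."""
    if pos + direction < 0 or pos + direction >= n:
        direction = -direction
    return pos + direction, direction

pos, direction = 0, 1
for t in range(5):
    cells = ['1' if i == pos else '0' for i in range(4)]
    print(f"| {' | '.join(cells)} | t = {t}")
    pos, direction = step(pos, direction)
```

Run it and you get the table above; drop the `direction` variable and the update at t = 2 versus t = 4 becomes ambiguous, which is the “resolution” problem in miniature.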

So I think this can be a good model for your problem. Not sure about arguments for or against your hypothesis though.
My guess is: you can probably do it if you have infinite time to compute the desired state from the current state.
Possibly you would even need a too-large-to-be-realistic (infinite?) set of rules for the evolution of these cells to encode unlimited data in there.
Finally, you would probably run into a “resolution” problem (not sure what to call it), where one state could have two non-identical previous states that contradict each other.
I think this is where you lose information.
Edit: this resolution issue could be a good example of the surjectivity you people were talking about above.

You could try to manage this exploding-complexity problem by using a continuous model instead of a discrete one (you were talking about quantum physics and superposition in fiber optics, that kind of thing), but I guess there you’ll eventually be limited by either the Heisenberg uncertainty principle or some lower bound given by physics at the Planck scale.



You could bypass the discretization problem of the Planck limit by using the uncollapsed wavefunction. I am not sure whether the no-cloning theorem limits the use of this, but you could use statistics (quantum simulation).

The bottom line is, any discrete computing system will not work for this. I am not good enough with quantum physics to explore in that direction, but I’ll ask someone who works in that field.

Btw. the automata analogy is also a good one. It’s hard to come up with analogies that do not set too many constraints.


I think questions like this are mostly answered by statistics.
If you have an unstable data set, you use statistics to get a trend for the data set.
If a data set is able to morph, then unless you are able to predict the morphing, all you have to go by is the behavior of the data set.
Take your constant spin + fluctuation as an example.
The data would say it is a circle, but the “mean” circumference would differ from your constant spin, due to the bumps.

To make this statement come true you’d have to invent a time machine. Basically, all data takes up physical space in one way or another; there is a reason we don’t have 800 quadrillion quasibytes of hard drives for our games. It boils down to nature being a harsh mistress: every logical bit you write takes up physical space, unless of course, as you suggest, I can write that byte in the year 1685 on my current 2018 hard drive, which simply isn’t the way nature works under our current set of laws.
To do this we’d need some kind of molecule we can control which transcends time, as a non-linear function.


Great… I was aware of information entropy…
That’s why we need a second time to track the system. But I did not want the added complication of a second time…

T1 = accelerates the time on state change…
T2 = keeps track of actual timeline…

So at the last structure you will have 2 timestamps…
Think of T1 as relative time… It contracts like spacetime contracts under the influence of gravity, whereas T2 remains the same, as in the example…

I’ll draw some examples, but I need time… I’m more of a visual guy… I only know basic maths… I still think it mostly doesn’t work, but we can try disproving it…

Now somehow these 2 timelines get incorporated into the encoding of the information, though I’m not yet sure how… But I’m hoping this would eliminate the exploding problem… I’m really trying to get around information entropy… if that’s even possible… :wink:

That was my assumption based on P=NP being true… because that would mean that the whole probability space already exists and you are using some clever mechanism to access parts of that probability space by simulating… I hope that makes sense…

Ohh, sorry… It’s not a bump… the circumference is the same; only the pendulum angle changes slightly…

Again, hardware manufacturers have got 3D, i.e. XYZ, down to an art, but storing data in a 4th dimension, i.e. time, is not so simple.
All we’re doing right now is using helium instead of air etc. to store data, since helium has different physical properties than air.
The whole P=NP question is an attempt to work around the difficulty of computing algorithm X. Basically, unless we rewrite our whole way of handling software and/or change how we store data throughout time and space, something as simple as Big-O notation (the complexity of implementations) will negate the whole concept, since it simply dictates that a while, for, whatever loop takes up x compute time, and a CPU only has x CPU time.
With quantum computers things may change a bit, but don’t shoot for the stars just yet, since what you’re suggesting is either infinite compression or storage across time; either way infinite, which is a very large value and is not computable.



Ok, my apologies… I really didn’t mean infinite storage literally. I don’t expect we’ll have magical powers and a wonderland :joy:. We always need some storage; even in my example above you need to store the structure, the last timestamp, and the last bit of information… you could probably even extend it to store snapshots after every certain number of steps to avoid information entropy…

Actually, I just remembered that I was thinking about a time machine last year; it’s coming back to me now… But my example was in reverse…

So imagine on 1 August you wrote a number on a piece of paper… But you lost that piece of paper… If you had a time machine, or at least a time projector, you could go back to that time and place and see what was written on the paper… But we don’t have a time machine, so I’m trying to figure out whether, if I encoded that information in a time structure, I could just reverse it… Wishful thinking…

But there might be a tiny, tiny bit of possibility with a second time… We don’t have 2 times in reality, but who’s stopping us from using one in our algorithm? Potentially making a virtual molecule which we can control with 2 dimensions of time :thinking:

Just noticed… You are not linking time to the structure; instead you’re just iterating… The structure is dependent on the timestamp parameter… you can consider it a structure hash, and without it the next structure evolves differently. However, it would still probably reach information entropy. I’ll make a better example with 2 times next time.

Sorry, I keep editing and adding to this…

Let’s think of it this way… Let’s change the purpose of this discussion a bit… It’s not about storage but compression… Consider some edge cases like quantum computation… achieving an amazing level of compression by trading off computation… Let’s say this compression only works on a 10k-qubit QC…

If we are able to compute huge amounts of data using the time-encoding mechanism within the boundary of information entropy, then take a snapshot, create another structure, and keep taking snapshots, with some error-correction methods, semantics verification, and predictive and other clever algorithms, since we’re blessed with huge computational resources…

Maybe we’ll get closer to an amazing alternative storage technology which doesn’t rely on state-of-the-art helium-based hardware… It’s still in no way unlimited storage, but it is damn good compression in virtual structures…
That’s actually where the expectation is…

This is a good example…
The information on how I came to be at the station at 12 o’clock is lost to external variables and not visible when you observe me later. However, I don’t think information is ever lost; you could probably still track it if you had the end states of all the variables I interacted with that morning, and then there’s a chance you could reconstruct a simulation to predict it. But that’s just too many variables in this example. I don’t blame you, because you were coming off my example of the whole universe. Lol

But it also greatly depends on who is trying to figure out my past states from my present state. This is where your semantics come in, as an observer/investigator model… Let’s say it is Sherlock Holmes, who also happens to know my behaviours and routines; he could probably deduce some information in some sense. But Sherlock had prior knowledge and great intelligence… The prior knowledge here is the semantics/predictive models, and his intelligence is just the computation I was talking about…

To truly justify it operationally, we need to make it a closed system, so that all information is accounted for and nothing is lost to the outside… That’s why I chose the Rubik’s cube, though it didn’t really capture what I was trying to do… None of the examples actually do, because they’re all incomplete in a sense and just used as analogies… Like you said it’s difficult for you to explain mathematical concepts in words; similarly, I’m not yet sure how to translate a complex visual simulation into words… We’re getting lost in translation here… But the common ground is: I’m not trying to cheat physics, just trying to understand if there’s a way around it without violating the laws…

So let’s try to make a closed system:
it’s a school with 3 rooms, dedicated to physics, maths, and computer science, and I’ll throw in a canteen. The school has 12 students, named A to L… For simplicity, let’s say the only things that ever happen in this school are eating, studying, and sleeping; the students stay at school only, and spend multiple hours in each room each day.

Now let’s make it interesting… Each student wears two clocks, and each room also has its own clock… The laws of physics are weird around here, so each room’s time dilation is different…
A student’s clock1 dilation is static, but clock2 is influenced by the physics room’s clock while they are in the physics room, and so on.

At the end of 1 month, each student has the same time on clock1 and different times on clock2… Now at the end of the month there are two types of observer/investigator: parents and a teacher… The parents don’t know how the school functions or the time dilation in each room; however, they are quite aware of their kids’ behaviours… The second investigator is the teacher (who was on vacation during this month), who knows how the school works but has little to no knowledge of the students’ behaviours…

The question is: can you figure out how much time each student spent in each room by looking at their clocks? Independently, the investigators will have a hard time due to their limited semantics; together, however, they’ll fare better, because the parents know how much their kids sleep, how many times they eat, and so on… It’s not important to keep track of everything precisely, just enough to compensate with error-correction algorithms… I’m trying to encode information which is…
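For what it’s worth, the teacher’s half of the puzzle can be written as linear algebra: clock1 gives h1 + h2 + h3 = T, clock2 gives d1·h1 + d2·h2 + d3·h3 = c2 (with di the room dilation rates), which is underdetermined for three rooms; the parents’ knowledge (say, the sleep hours) supplies the missing equation. A sketch with made-up numbers; every dilation factor and clock reading here is an assumption for illustration only:

```python
# Hypothetical dilation rates: how fast clock2 runs per real hour in each room.
d = [1.0, 2.0, 0.5]          # physics room, maths room, sleeping area
T = 720.0                    # clock1 total: 30 days * 24 h, same for everyone
c2 = 900.0                   # this student's clock2 reading after the month

# Teacher alone: h1 + h2 + h3 = T and d[0]*h1 + d[1]*h2 + d[2]*h3 = c2.
# Two equations, three unknowns: underdetermined, he cannot solve it.

# Parents contribute h3 (they know how much their kid sleeps):
h3 = 240.0

# Substitute h3 and solve the remaining 2x2 system:
#   h1 + h2         = T - h3
#   d[0]*h1 + d[1]*h2 = c2 - d[2]*h3
rhs1, rhs2 = T - h3, c2 - d[2] * h3
h2 = (rhs2 - d[0] * rhs1) / (d[1] - d[0])
h1 = rhs1 - h2

print(h1, h2, h3)  # prints 180.0 300.0 240.0
```

Neither investigator can solve it alone, but together the numbers fall out uniquely, which is the point of the example.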

I want to keep it exciting without coming across as arrogant… This is how we learn things, right? Create scenarios and simulations to learn the concepts and limitations of reality, because that’s how you can think up solutions to get around the limitations…

And I think our brain might be utilizing something similar to encode information in structures… There are theories suggesting that our brain never loses information; we just lose the ability to recall it. Such a complication might be easier to explain this way than by thinking in a physical-storage sense, given the limited size of our brain and the tons of other things it has to house… Sorry, I’m jumping the gun again… Just a hypothesis.

Sorry for the long post… That’s why I make visual examples; my writing gets foresty…

Not sure on any technical level…
But how do DNA base pairs store so much information?


It is not lost to external variables; it is lost because the system cannot contain more information.

Now you are increasing the number of states (the size of the system), so naturally it can store more information.

Alright… where this analogy breaks down is that you are not overwriting variables or really trying to store more information than can be contained by the system (like you do with the Rubik’s cube example). For a good analogy you need a closed system where you know how much information can be stored, by counting the number of possible states, and then try to show that you can pack in more information. Everything else in your example is true, but this one premise is wrong, and therefore so is the conclusion.

The more complex you make an example system, the harder it is to find the flaws in it. (This is of course partly my fault, because of my train-station example.) From a maths standpoint there is no reason that a simple system cannot do what a complex system can, as long as you apply the same constraints. Change the constraints, not the complexity of the system.

DNA contains information in its base pair sequence:

There are two purine-pyrimidine base pairs in use in DNA as found in nature: adenine-thymine and guanine-cytosine (most RNA uses adenine-uracil instead). Each base along a single strand can more or less take any position. So in a sense DNA uses a base-4 number system; cool, right? In practice there is a lot of redundancy, and sequences of the same base-pair orientation make the DNA strands less stable, so there is an asterisk attached :wink:.

It is really amazingly dense information storage. Also, duplication is very easy, editing less so.
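The base-4 point maps directly onto ordinary bytes: each byte splits into four 2-bit symbols, one base each. A toy encoder in Python; the base-to-value assignment is my arbitrary choice, not how real DNA-storage schemes do it, and it ignores the stability and redundancy caveats above:

```python
BASES = "ACGT"  # arbitrary assignment of the four bases to the values 0..3

def to_dna(data: bytes) -> str:
    """Encode each byte as four bases, most significant 2-bit pair first."""
    return ''.join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def from_dna(seq: str) -> bytes:
    """Decode four bases back into one byte each."""
    vals = [BASES.index(c) for c in seq]
    return bytes(
        (vals[i] << 6) | (vals[i + 1] << 4) | (vals[i + 2] << 2) | vals[i + 3]
        for i in range(0, len(vals), 4)
    )

print(to_dna(b"hi"))                 # prints CGGACGGC (4 bases per byte)
assert from_dna(to_dna(b"hi")) == b"hi"
```

Two bits per base is the theoretical ceiling; real biological coding is far more redundant, as noted above.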


Sorry, the confusion is that my earlier words “unlimited storage” sounded like absolute terms, and “compression” points to exactly the things you’re rightfully explaining with maths. However, I think even that can be gotten around if memory is recreated rather than stored…

Also, I am using analogies which look like mathematical problems, which they are to some extent, but they’re more complex in their logic… As I told you, I am trying to get around it using semantics, predictions, and other clever techniques, which aren’t just mathematical problems…

In this case, to the investigator the information is lost… It doesn’t matter where the information is stored; you could use probabilistic reasoning and prior models to predict it. I guess fundamentally it’s more to do with reconstructing information with reasonable accuracy…

To some extent, movie formats such as H.265 do that, but they’re not intelligent algorithms…

Here’s what I’ll do… I’ll take some time and come up with an example which will probably put us all on reasonably similar turf in terms of problem solving… :grinning:

I’m thinking that storing in time is fundamentally different from storing in space, and everything we’re discussing here is somewhat limited by our spatial intuition… The easiest example would be to think of something in the future… It doesn’t exist in space and has never happened before; there’s no promise that what you’re thinking of is going to happen… However, it exists in probabilities… It’s about calculating the probabilities to access the pocket where your thought is true… You can think of something with high probability and make it true by working towards it. What you’re essentially doing is computing to make it real.
