Thursday, May 9, 2019

Using no entanglement to communicate faster than light




In some sense, this is a climb-down. It follows because entanglement is not necessary! Nothing fancy is required to do better than 1 bit at light speed: if you capture an image in a telescope, you get many bits at once. If you focus on a spot and take the image MANY TIMES, THE SPEED OF INFORMATION CAN EXCEED LIGHT SPEED. If the image is made of bits, rather than being just any picture, you are communicating faster than light in that sense. That is what is happening in the entanglement write-up - entanglement only makes this harder to see, and the method is indeed a different way to communicate. But it is not faster in any way.

Assuming entangled information takes zero time, imagine groups A and B separated by 4 light years. You can get the state of the sending particles at once, but not even a fraction of a bit of the message before 4 years have passed. The pattern only makes sense after the image arrives and is applied.

Once started, pipelining is possible.
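A back-of-the-envelope sketch of the latency/throughput distinction, using the 100,000-bit image size mentioned in the May 2 post; the one-image-per-second pipeline rate is my own illustrative assumption:

```python
# Latency vs. throughput for pipelined image sending between A and B.
# Assumption (illustrative, not from the post): one 100,000-bit image is
# dispatched every second once the pipeline is running.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
distance_ly = 4                      # A and B are 4 light years apart
bits_per_image = 100_000             # image size from the May 2 post
images_per_second = 1                # assumed pipeline rate

latency_s = distance_ly * SECONDS_PER_YEAR          # the first bit cannot arrive sooner
throughput_bps = bits_per_image * images_per_second

print(f"latency of first bit : {latency_s / SECONDS_PER_YEAR:.0f} years")
print(f"steady throughput    : {throughput_bps} bits/second")
# High bandwidth once the pipeline is running, but the latency floor set by
# light speed never moves.
```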

So are the earlier methods good for anything? Automatic delay-proof encryption!

Does this mean that faster-than-light communication is dead? That is my conclusion so far. I have no idea how to do it, if it can be done at all, without a sacrifice of entanglement proportional to the length of the message, and doing so attacks a theorem that says it cannot be done. That would certainly be Nobel Prize work, i.e. highly unlikely to succeed, except perhaps at high probability rather than with certainty, which is what would get around the no-go theorem.

Monday, May 6, 2019

Using entanglement to communicate faster than light - image and sending


latest link

What does "send" mean with entanglement? A proper question, once we agree that a single bit can be sent faster than light.

How does an image get sent so that all its bits are transmitted in parallel? Assume that many images are transported, one after another, so the two parties communicate fast by pipelining. At the sender, each list of particles is updated from a bit vector, with the lists prepared in parallel. The first bit is dedicated to signalling when the image vector is ready: it is set to 0, shortly afterwards changed to 1, and kept at 1 until that image slot is used again. The receiver reads the corresponding entangled bits, via local entanglement, in parallel. So 00011010 is parsed as (00)(01)(10)(10): if the second bit of a pair being 1 means "the previous value was right", then the first (signalling) pair is ignored and the rest reads 000. Local entanglement is used to read values synchronously without losing any global entanglement, assuming the entangled particles take random values at some maximum rate slower than the cycling rate of the lists. The bit vector read out at the receiver is then the same as the one sent.
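A small sketch of the pair decoding in the example above, under my assumption that each pair is (random particle value, right/wrong flag) and that the leading pair is the ready signal:

```python
def decode_pairs(bits: str) -> str:
    """Decode a pipelined frame like '00011010'.

    My reading of the post: the stream is split into pairs (value, flag).
    The first pair is the ready/signalling slot and is skipped. For the
    rest, flag == '1' means the random value was already right, so keep
    it; flag == '0' means it was wrong, so flip it.
    """
    pairs = [bits[i:i + 2] for i in range(0, len(bits), 2)]
    decoded = []
    for value, flag in pairs[1:]:          # skip the signalling pair
        decoded.append(value if flag == "1" else str(1 - int(value)))
    return "".join(decoded)

print(decode_pairs("00011010"))            # -> 000, matching the post's example
```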

The entangled bits are not required to be constant, only constant relative to a rate: they may flip randomly, as long as they do so more slowly than the read cycle. This is because we only report whether the particle is right or wrong, and its value does not change before it is read.

More complex protocols can implement various error-correcting codes.

Light is needed! Even though the constituents are entangled across space, IT IS THE IMAGE COMMUNICATED that determines the different orderings. The image can be assembled in parallel, but it is communicated at light speed at most! If there is only 1 bit per image, then light speed is the best possible!

Entanglement sudden death (ESD) also has a rebirth counterpart and allows some control. These complications are fertile ground for the entanglement engineers of the future and carry no fundamental impossibility implications.

Thursday, May 2, 2019

Using entanglement to communicate faster than light









latest version

Sent to many, including my great friend Lov Grover.

It is known that entanglement correlations act at least 10,000 times faster than light in vacuum! Bell violations have been sustained over 12 hours!

Everyone says my goal is not possible.



The intuition is as follows -


0. The problem is that once you modify a particle by a read or a write, entanglement is lost.

1. All changes due to relative velocities of source and destination, and the estimation steps, are straightforward. If used on on-Earth, Earth-Mars and Earth-Moon links, they are much simpler, being an astronomical problem. We assume single entanglement of, say, spin in what follows.

2. There are at least 3 entangled particles A, B and C per bit sent, with values known at sender and receiver. Unread particles may be duplicated while preserving entanglement in some cases. Somehow the initial entangled pairs of atoms/photons are transported (under-light-speed shipping). Then the entangled particles are locally duplicated so that handling losses are limited to a sacrifice of three copies per bit.

3. One bit can be obtained from the first particle A. The experiment to determine its value destroys the local entanglement partner, but the value is the same at sender and receiver. It is either right or wrong wrt (with respect to) the bit to be sent.

4. We never encode the value! Instead we encode whether the value of the first particle is right or not wrt the bit to be sent! If right, B is sent next. If not, C is sent instead. (A sketch of this relational encoding follows after this list.)

5. This process will require many light-speed steps. But we can send many bits simultaneously, making the average per-bit speed faster than light. In fact, an image of 100,000 bits or more is collected and sent at once.

6. The 0, 1 values are replaced by recognition of which particle was sent, determined by the value in the second bit, obtained by experiment; this destroys entanglement, but only of a copy.

7. The problem has been attempted many times by very smart people, so I could be wrong. If so, I need to know why!

8. Only experiments on local entangled copies need to be done, so the globally entangled bits can be reused. In fact, A is needed only to get its instantaneous entangled value, which determines the B and C values. That may be some random value. As many entangled bits as there are per image, times 3, is what this proposal needs.

9. What is different? Rather than a read or a write, we use only a relational expression!

10. Standard error-correction and boosting machinery can be used against entanglement death. A periodic under-light-speed update is necessary to refresh dead particles beyond those kept in storage.
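Here is a minimal classical sketch of the bookkeeping in steps 3, 4 and 6, under my reading of the post. It assumes A's random value is a shared bit already known at both ends, and that which particle (B or C) arrives is somehow recognizable at the receiver; that last step is exactly what the no-go theorem contests, so the code only checks that the relational encoding is self-consistent. All names are illustrative.

```python
import random

# Classical bookkeeping of the relational encoding (my reading of steps 3-6).
# A's value is a shared random bit known at sender and receiver; which of the
# two particles B or C is "sent" carries the right/wrong relation, never the
# value itself. How B-vs-C reaches the receiver is the disputed quantum part.

def sender_encode(bit: int, a_value: int) -> str:
    """Return which particle to send: B if A's random value already matches the bit."""
    return "B" if a_value == bit else "C"

def receiver_decode(particle: str, a_value: int) -> int:
    """Reconstruct the bit from the shared A value and the particle identity."""
    return a_value if particle == "B" else 1 - a_value

for _ in range(5):
    bit = random.randint(0, 1)           # bit the sender wants to communicate
    a_value = random.randint(0, 1)       # shared random value of particle A
    particle = sender_encode(bit, a_value)
    assert receiver_decode(particle, a_value) == bit
print("bookkeeping consistent")
```

Note that the bit value itself is never written into any particle; only the right-or-wrong relation selects B or C, matching step 9.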

Tuesday, April 30, 2019

Using entanglement to communicate faster than light

Intuition on why entanglement may permit communication faster than light

 Intuition on why entanglement may not permit communication faster than light.

THE FOLLOWING INTERPRETATION OF THE NO-GO THEOREM INDICATES THE ESSENTIAL REASON FOR THE NO-GO IMPOSSIBILITY -

A particle in entanglement shows correlated properties at the places where its entangled partners are, but any deliberate change in the state of the particle causes it to lose entanglement! Any experimental determination of the state of the particle destroys subsequent entanglement.

How else can one reuse entanglement?

To keep things simple:
1. "Same" is used to indicate removal of the perfect correlation between entangled pairs.
2. Setup is done by sending some number of all-1's, etc.
3. New bits can be dynamically entangled. Initially, some entangled bits are transported slower than light.

Let us assume that it is somehow possible to detect whether a particle is entangled or not. One can make many copies of an entangled particle, sacrificing one copy per bit whenever an experiment is done, to learn whether the photon had the right value. So a pair or more is sacrificed per bit - one to get the value, and one or more repeatedly killed to state whether the data bit was correct or not!

Detection of entangled-or-not is done by comparing a pair of remote entangled photons, which are either the same or not. This sacrifice can be done faster than light. The result from the killed pair may be wrong, in which case the correctness is sent by another pair, and so on. This sequence terminates because the bad cases are halved every time! In the rare case of no verification within some m steps, all attempts fail and the next bit of the bitstream is used, as if this bit were repeated. So the entanglement sacrifice returns yes or maybe (with chance <= 1/2^m).
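A tiny numerical illustration of the claimed halving: the residual "maybe" probability after m verification rounds, assuming, as the post does, that each round independently halves the bad cases:

```python
# Residual 'maybe' probability after m verification rounds, assuming each
# round halves the remaining bad cases (the post's assumption).
for m in (1, 4, 8, 16, 32):
    print(f"m = {m:2d}  ->  P(no verification) <= {0.5 ** m:.2e}")
```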

The above is not practical, as the many steps require very fast computation and are sequential.

How to make it practical

This algorithm is likely slow wrt light. But one can make a very large number of copies and send them in parallel!

Monday, April 15, 2019

Intuition on why wavelet quantum mechanics is better in many places



Usual QM is Fourier analysis everywhere. Signal processing indicates there will be subtle differences. QM is correct about the frequency, but not about the time at which that frequency occurs when the signal is changing! The following is instructive -




Both have essentially the same time-free spectrum, and are thus quantum-indistinguishable. This does not happen with the Morlet wavelet!
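Since the figure is missing, here is an illustrative numpy reconstruction of the point as I read it: two signals made of the same two tones in opposite time order have essentially the same Fourier magnitude spectrum, while a time-localized (Gabor/Morlet-style) probe tells them apart. The specific frequencies and the Hanning-windowed probe are my own choices, standing in for a full Morlet wavelet transform.

```python
import numpy as np

# Two signals built from the same two tones, in opposite time order.
fs = 1000
t = np.arange(0, 1.0, 1 / fs)
half = len(t) // 2

sig_a = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))
sig_b = np.where(t < 0.5, np.sin(2 * np.pi * 120 * t), np.sin(2 * np.pi * 50 * t))

# Plain Fourier magnitude spectra: nearly identical, the time order is lost.
spec_a = np.abs(np.fft.rfft(sig_a))
spec_b = np.abs(np.fft.rfft(sig_b))
print("peak magnitude:", spec_a.max(), " max |difference|:", np.max(np.abs(spec_a - spec_b)))

# A crude time-localized (Gabor/Morlet-style) look: energy of the 50 Hz tone
# in each half of the record. This does separate the two signals.
probe = np.exp(2j * np.pi * 50 * t[:half]) * np.hanning(half)
for name, sig in (("A", sig_a), ("B", sig_b)):
    first = abs(np.vdot(probe, sig[:half]))
    second = abs(np.vdot(probe, sig[half:]))
    print(f"signal {name}: 50 Hz energy first half={first:.1f}, second half={second:.1f}")
```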

Sunday, April 14, 2019

Universe evolution

Latest link

WHY DO I CONSIDER THE FOLLOWING MY GREATEST ACHIEVEMENT SINCE MY NEAR DEATH 30 YEARS AGO? It is the final nail in the coffin of God, even the God-of-the-gaps! It shuts up the skeptic who asks, "but who lit the match at the big bang?"

The Cyclic Model of Turok was developed based on three intuitive notions:
• the big bang is not a beginning of time, but rather a transition to an earlier phase of evolution;
• the evolution of the universe is cyclic;
• the key events that shaped the large scale structure of the universe occurred during a phase of slow contraction before the bang, rather than a period of rapid expansion (inflation) after the bang.

This post extends that model, with the belief that the current edition of the universe will live on forever.

Like the master Darwin arguing for life based on evolution by natural selection! Above all, this is universe-evolution (falsifiable like molecular evolution). My belief is that contraction/expansion at the big bang is the only way to evolve the universe - very slow, but time may be infinite! The genius of Darwin was in enunciating natural selection as fundamental to evolution! Contraction/expansion allows a natural selection of universes.

There are two sets of cosmogonies (explained below) that this article opposes, even while accepting scientific cosmology - any religion, and all extant cosmogonies (science narratives of how the universe came to be). Dismissal of all religion-based cosmogonies is easy, on the principle of ignorance - it is pointless to argue with them. Only those consistent with astronomy, extended to cosmogonies, are allowed.

Superiority to all extant cosmogonies follows from their very simple lack (which must be explained) of any physics principle free of infinite regress! Where do ANY of the rules of our cosmos come from? Is that process consistent with every rule of cosmology, even those not yet understood? Every religion falls flat in its description of the universe - they are simply analyses of limited observations by excellent brains of their period - with no useful path-finding for scientists of the future! I set out a possible disproof and am a path-finder!

Science is falsifiable, says the great philosopher Popper. Next I define truths provably inaccessible to science. If true, these MUST extend science! They are true but provably impossible to test! In my words, falsifiability is extended by sentences that are true in limits, even when those limits are not reachable.

EXISTING COSMOGONIES

uniting the very small and very large indirect proof of big bang

impossible physics 

NONE OF THE COSMOGONIES HAVE ANY IDEA OF the genesis and applicability of the science principles! Mine is different in that the rules are the starting point! It assumes an infinite number of universes in the multiverse, each with origins infinitely earlier. Each universe has an infinite number of expansions and contractions, each pair resulting in a subtly different universe, in the idea of universe evolution! Each universe may or may not have intelligent life in it. Our universe has a current epoch with a set of rules conducive to the production of intelligent life - us. It has all the science rules from that evolution - there must be a series of rules that shows the current rules as dependent on other, more basic rules.

There is a justification for adopting a history of rules: each derivation is falsifiable from the previous one. Not only does my thinking provide a basis for work, it provides a path to new theories. The evolution of life on Earth provides a wonderful collage of theories across multiple epochs - first Darwinian, with an in-progress extension to molecular evolution expressed as phylogenetic trees. This works because the DNA of any biology is a set of codewords over 4 letters! One can measure distances in many ways, especially by incorporating molecular substitution costs.
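As a toy illustration of a distance that incorporates substitution costs (the cost matrix below is my own illustrative choice, with transitions cheaper than transversions; it is not taken from any particular phylogenetics package):

```python
# Toy pairwise distance between aligned, equal-length DNA strings using a
# substitution cost matrix: transitions cost 1, transversions cost 2.
COST = {
    ("A", "G"): 1, ("G", "A"): 1, ("C", "T"): 1, ("T", "C"): 1,   # transitions
}

def sub_cost(x: str, y: str) -> int:
    if x == y:
        return 0
    return COST.get((x, y), 2)                                     # transversions

def distance(seq1: str, seq2: str) -> int:
    assert len(seq1) == len(seq2), "toy metric: aligned, equal-length sequences"
    return sum(sub_cost(a, b) for a, b in zip(seq1, seq2))

print(distance("ACGT", "ACAT"))   # 1: single transition (G -> A)
print(distance("ACGT", "AAGT"))   # 2: single transversion (C -> A)
```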

This is basically better than the Darwinian approach, because a missing link need not exist between nearby molecules: they can be shown to be near by creating intermediates in a lab, and one can then argue about the climates and the state of the Earth for any missing-link animals.

Universe evolution is caused not by any God, but by the amount of dark energy created between cycles. What happens at contraction? The information of the contracting part is lost through quantum particles, and the universe contracts to a minimal size of the maximally dense material possible in space! After that, the universe explodes again. The new edition has different dark energy and differs primarily in space-related constants and time constants. All others have a geometric relation to the free constants that are reset. This is a falsifiable claim! In particular, it appears that the Planck length, the speed of light, the Higgs constant and the vacuum (dark) energy are reset.

Critical falsifiability says that all but the space principles evolve, as do the Planck length and the velocity of light. Note that the smallest time is that taken by light in free space to travel one Planck length.

Even the Heisenberg uncertainty principle is simply the uncertainty present in all Fourier analysis, as a bound on the product of complementary variables! Complementary to position is momentum in position-and-time signal systems!
Trigonometric functions are BETTER handled as Morlet wavelets in the newer, better quantum theory being written! Its superiority over usual QM is that Gabor analysis works with both spatial and time domains, unlike the Schrodinger equation, which is only spatial (and not even relativistic).
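For reference, the textbook signal-analysis relations this paragraph leans on, with Heisenberg's relation as the special case obtained via de Broglie's p = hbar k (standard results, not derived in the post):

```latex
% Fourier/Gabor uncertainty: the product of spreads in complementary variables
% is bounded below; Heisenberg's relation is the same inequality with p = \hbar k.
\Delta t \,\Delta \omega \ \ge\ \tfrac{1}{2},
\qquad
\Delta x \,\Delta k \ \ge\ \tfrac{1}{2}
\;\;\Longrightarrow\;\;
\Delta x \,\Delta p \ \ge\ \tfrac{\hbar}{2}.
```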

Why dark energy?

It is the energy of space. It does not dilute with expanding space! It is simply set at the creation event of each cycle!

Why assume fluctuation?

Whether a universe expands forever or contracts depends on its dark energy! A universe lives on until it reaches a perpetually expanding edition.

Source of science rules?

Universe evolution. All that is needed for the current epoch of the universe is one series leading to us, or alternatively one limiting to no rules at all.

Each universe cycle must be very specific?

Not really! Whether a universe-evolution (UE) series makes sense or not depends on every term differing in dark energy, for all that happens in each cycle is a calibration of space.

What drives this thinking?

Error-correcting codes for quantum computers are already present in quantum space! Quantum space is just space with quantum rules. Matter is just Higgs-boson-trapped photons! The real vacuum is just space alive with quantum particles, with fleeting matter and antimatter. This is due to the creation of space at the big bang point in every cycle. It need not be the same in every universe iteration.

What is this cosmogony based on?


A discrete space-time leads to string theory or loop quantum gravity (LQG).

A big bounce scenario is natural in LQG.

Bounce is the only Universe evolution method for science rules! It is the

No graviton or basic dark-matter particle has been found. Verlinde imagines, correctly, that there is no such thing as dark matter. His string-theory thinking correctly modifies galactic rotation curves and the Einstein bending of light around galaxies! I (in the minority) think Verlinde is right.

Gravity is not a quantum force, hence there is no graviton (never found)! It is not a product of the big bang. Neither time nor space is fundamental (hence neither is space-time as per Einstein); they follow from entropic gravity.

There is no dark energy either; it is quantum vacuum energy. Galaxies far away accelerate away faster due to this quantum vacuum energy (QVE).

Why make the massive claim of a new, better cosmogony?

It is the only existing one that explains the science rules by universe evolution, eliminates dark energy and dark matter through Dr. Verlinde's work, and explains the quantum vacuum with natural quantum error-correction codes.

The last reference indicates why now is the earliest the above could have been written!

What is the falsifiability here?

Only universe evolutions that differ in space terms are correct. All other constants must be expressible in terms of geometry and the varying space constants. The Higgs value and the vacuum strength likely come from space evolution. Any amount of universal fine tuning is likely.

Where are they?

It is sheer vanity to consider humans the only intelligent species. It is also unbreakable physics that makes a no-go theorem of light speed. Wherever they are, they stay far away. NO travel faster than light is possible - anything with mass would require infinite energy to reach light speed. Any normal information transfer faster than light is impossible by a no-go theorem - the intuition is simple: even though entanglement acts faster than light, two entangled objects share a state, but ANY unnatural effort at changing the state of one kills the entanglement!

Particles

It is natural for us to consider everything as space-like, since space evolves. Thus the idea of particles as creation-time black holes is natural. This is studied in some depth here. Only some are stable; others are created in very energetic collisions and rapidly evaporate into photons. Black holes are distinguished by when they were created: stellar or creation time.

Using encryption to communicate faster than light

Impractical but doable.